Automatic Review Moderation with A.I.

Fera may use artificial intelligence to automatically approve or decline customer reviews for you.

Written by Jay El-Kaake
Updated over a week ago

Fera can automatically approve or decline reviews for you after a customizable time period has elapsed, using a patent-pending artificial intelligence approach.

What is A.I. Auto Moderation?

A.I. Auto-Moderation is a patent-pending feature in Fera that uses artificial intelligence to attempt to automatically approve or decline review submissions from customers.

Why should I use it?

  1. You save time: You don't have to spend time manually approving or declining each review submission as it comes in.

  2. It's extremely accurate: The A.I. is highly accurate at detecting the content categories listed below.

  3. It's more ethical: Applying the same criteria to every review automatically is more even-handed than manually deciding which reviews to approve or decline.

  4. It's free: The feature is available for free for all Fera users across all plans.

How do I enable it?

It will be enabled by default for new signups after August 15, 2023.

You can manage the setting from the Configuration > Submissions > Automation section.

How does it work?

Once enabled, Fera will use A.I. to parse each review and determine whether or not it should be approved.

If configured, Fera will wait the required number of days before trying to automatically moderate a review with artificial intelligence.

By default, positive reviews are moderated immediately, negative reviews are moderated after 14 days, and reviews with photos or videos are moderated after 1 day.

If the A.I. cannot determine whether a review should be approved, it will be left as pending and you will be able to approve or decline it yourself (see below).
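
To make the default timing rules concrete, here is a minimal sketch of how a "wait, then let the A.I. decide" flow could work. This is only an illustration of the behavior described above, not Fera's actual implementation, and the review kinds, delays, and function names are assumptions that mirror the defaults.

    from datetime import datetime, timedelta

    # Assumed waiting periods, mirroring the defaults described above.
    MODERATION_DELAYS = {
        "positive": timedelta(days=0),   # positive reviews: moderated immediately
        "negative": timedelta(days=14),  # negative reviews: moderated after 14 days
        "media": timedelta(days=1),      # reviews with photos/videos: after 1 day
    }

    def auto_moderate(review_kind, submitted_at, ai_verdict):
        """ai_verdict is 'approve', 'decline', or None when the A.I. is unsure."""
        delay = MODERATION_DELAYS.get(review_kind, timedelta(days=0))
        if datetime.utcnow() < submitted_at + delay:
            return "pending"  # still inside the waiting period
        if ai_verdict is None:
            return "pending"  # left for you to approve or decline manually
        return "approved" if ai_verdict == "approve" else "declined"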

What kinds of things does the A.I. look for?

  • Hate
    This pertains to content that fosters, stimulates, or endorses hate based on aspects like race, gender, ethnicity, faith, nationality, sexual orientation, disability, or caste. It also includes content that harasses non-protected groups, such as chess enthusiasts.

  • Hate/Threatening
    This involves content that is not only hateful, but also implies violence or serious harm towards the targeted group, again based on elements like race, gender, ethnicity, faith, nationality, sexual orientation, disability, or caste.

  • Harassment
    This captures content that advocates, instigates, or endorses harassing behavior towards any person or group.

  • Harassment/Threatening
    This refers to content that is not just harassing, but also suggests violence or severe harm to any person or group.

  • Self-Harm
    This relates to content that glorifies, encourages, or illustrates self-harming activities, including suicide, self-inflicted wounds, and eating disorders.

  • Self-Harm/Intent
    This encompasses content where the communicator expresses their participation or intention to participate in self-harming activities, such as suicide, self-cutting, and eating disorders.

  • Self-Harm/Instructions
    This category includes content that encourages the act of self-harm, such as suicide, self-inflicted injuries, and eating disorders, or provides guidance or tips on how to engage in such activities.

  • Sexual
    This consists of content designed to elicit sexual arousal, including descriptions of sexual acts or promoting sexual services, with the exclusion of sex education and wellness content.

  • Sexual/Minors
    This includes sexual content that features individuals under the age of 18.

  • Violence
    This includes content that portrays death, violence, or physical harm.

  • Violence/Graphic
    This refers to content that portrays death, violence, or physical injury in explicit and graphic detail.

For example, the following review would get declined for excessive violence:

Example of declined review.
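
To illustrate how flags in categories like these could turn into an approve/decline decision, here is a small hypothetical check. The category names mirror the list above, but the scores, threshold, and function are made-up examples rather than Fera's internal logic.

    # Hypothetical scores a content-moderation model might return for the review above.
    category_scores = {
        "violence": 0.97,
        "violence/graphic": 0.91,
        "harassment": 0.03,
    }

    DECLINE_THRESHOLD = 0.8  # assumed cutoff for declining a review

    def review_decision(scores):
        """Decline the review if any category is flagged strongly enough."""
        if max(scores.values(), default=0.0) >= DECLINE_THRESHOLD:
            return "declined"
        return "approved"

    print(review_decision(category_scores))  # -> "declined" (excessive violence)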

Can I change how long it waits before auto-moderating a review?

Yes, this is done in the Submission Settings under the Automation tab after you turn the feature on (or if it is already on).

You may need to click the "Show More Options" link to reveal the advanced customization options.

How can I see whether a review has been moderated by A.I.?

Reviews that are moderated by A.I. show an entry in the review's history, like this:

Review history showing moderation by A.I.

Find the review in your reviews management page, then go to the "History" tab to see the entry.

If you don't see an entry regarding A.I. moderation then Fera didn't auto-moderate the review (yet).

What happens if the A.I. is unsure whether to approve or decline a review?

If the A.I. is unable to determine whether a review should be approved or declined then it will leave the review as pending.

You'll receive a notification for pending reviews (assuming you haven't turned off the notifications) and have 90 days from the date of submission to approve or decline it.

Can I reverse the decision of the A.I.?

Yes! Simply find the review in Fera's management UI and change its state within 90 days of its submission, like this:

How to decline a review that was moderated by A.I. in Fera

What happens if I don't approve or decline a review that the A.I. wasn't able to moderate?

If you do not approve or decline a pending review within 90 days, Fera will automatically approve it if it is from a verified customer, so make sure you take action!
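
In other words, the fallback works roughly like this (an illustrative sketch only; the names and the behavior for unverified reviews are assumptions):

    from datetime import datetime, timedelta

    def resolve_stale_pending_review(submitted_at, verified_customer):
        """After 90 days, a pending review from a verified customer is auto-approved."""
        if datetime.utcnow() - submitted_at >= timedelta(days=90) and verified_customer:
            return "approved"
        return "pending"  # assumed: otherwise it stays pending until you act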
