GIPHY uses a combination of technology and human moderation to detect and review prohibited content. This approach allows us to enforce our policies accurately and at scale. The moderation team weighs several review factors when deciding on enforcement actions, always striving to act reasonably and consistently.
GIPHY’s content moderation team uses the following methods to identify and remove content that violates our policies:
When the technology services in our moderation pipeline identify prohibited content with high confidence, we may delete it automatically. Content identified with lower confidence is reviewed by human moderators before any action is taken. Some content types are moderated by humans alone to ensure accurate decisions. Human moderators also review every report submitted by users through GIPHY’s onsite reporting features.
For a list of the third-party organizations that GIPHY uses in our moderation process, please see this article.
Users can appeal moderation decisions if they believe an action was taken in error. To appeal an action, contact us here.