Moderation

Dust can automatically scan camps, art, and events for potentially problematic content whenever they are submitted or changed. The scanned content includes:

  • Camps - Names and descriptions
  • Art - Names and descriptions
  • Events - Titles and descriptions

Configuration

  1. Go to Settings in your burn
  2. In Options, choose the Moderation Style (a sketch of how each style behaves follows this list):
  • No moderation: Moderation is not performed; you will need to review content manually
  • Flag: Problematic content is flagged to the administrator but is still published in the Dust app
  • Flag and Hide: Problematic content results in the camp, art, or event being hidden in the Dust app
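
The styles only differ in what happens after a scan flags something. Below is a hypothetical sketch of that branching; ModerationStyle, ScanResult, and the return shape are illustrative names, not part of the Dust codebase.

```ts
// Hypothetical sketch of how the three moderation styles could be applied
// once a scan result is available. All names here are illustrative only.
type ModerationStyle = "none" | "flag" | "flag_and_hide";

interface ScanResult {
  flagged: boolean;      // did the moderation check flag this content?
  categories: string[];  // which categories were triggered
}

function applyModeration(style: ModerationStyle, scan: ScanResult) {
  if (style === "none" || !scan.flagged) {
    // No moderation, or nothing problematic: publish normally.
    return { visibleInApp: true, notifyAdmin: false };
  }
  if (style === "flag") {
    // Flag: tell the administrator, but keep the item visible in the app.
    return { visibleInApp: true, notifyAdmin: true };
  }
  // Flag and Hide: tell the administrator and hide the camp, art, or event.
  return { visibleInApp: false, notifyAdmin: true };
}
```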

What is problematic content?

Dust uses the OpenAI moderation API, which is designed to detect the following categories of content:

  • Sexual content - Explicit sexual material
  • Hate speech - Hate speech targeting individuals or groups
  • Harassment - Content intended to harass or bully
  • Self-harm - Content promoting self-harm
  • Sexual content involving minors - Explicit sexual material involving minors
  • Hate speech with threats - Hate speech combined with threatening language
  • Graphic violence - Extremely violent or gory content
  • Self-harm intent - Content expressing intent to self-harm
  • Self-harm instructions - Content providing instructions for self-harm
  • Harassment with threats - Harassment combined with threatening language
  • Violence - General violent content
  • Other - Other problematic content
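
For reference, below is a minimal sketch of calling the OpenAI moderation endpoint via the official Node SDK and collecting the category names that fired. It is only an illustration of the kind of check Dust performs, not its actual implementation; the function name and example usage are assumptions.

```ts
import OpenAI from "openai";

// Minimal sketch: send a piece of text (e.g. a camp description) to the
// OpenAI moderation endpoint and return which categories were triggered.
const openai = new OpenAI(); // reads OPENAI_API_KEY from the environment

async function moderate(text: string): Promise<{ flagged: boolean; reasons: string[] }> {
  const response = await openai.moderations.create({ input: text });
  const result = response.results[0];
  const reasons = Object.entries(result.categories)
    .filter(([, triggered]) => triggered)
    .map(([name]) => name); // e.g. "hate", "harassment/threatening"
  return { flagged: result.flagged, reasons };
}

// Example usage (hypothetical):
// const verdict = await moderate("Camp description text");
// if (verdict.flagged) { /* flag or hide, depending on the Moderation Style */ }
```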