The information on this page is adapted with permission from Prevention by Design by lead authors Lena Slachmuijlder and Sofia Bonilla.
Use quarantine systems to temporarily hold flagged content in a separate review area, allowing users and platform moderators from the Trust & Safety team to examine it without immediate public exposure. Quarantine systems balance the need for safety with respect for freedom of expression: by providing a structured way to handle ambiguous cases, platforms can address concerns about over-censorship while still managing content appropriately and preventing technology-facilitated gender-based violence (TFGBV).
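As an illustration, the minimal sketch below models a quarantine holding area: flagged content moves into a pending state, is hidden from public surfaces while under review, and is then either restored or removed. The names (QuarantineQueue, QuarantinedItem, flag_content) and the in-memory storage are hypothetical, shown only to make the workflow concrete rather than to describe any particular platform's implementation.

```python
# A minimal sketch of a quarantine workflow. The model and method names are
# illustrative assumptions, not a real platform API.
from dataclasses import dataclass, field
from datetime import datetime, timezone
from enum import Enum
from typing import Optional


class QuarantineState(Enum):
    PENDING_REVIEW = "pending_review"   # held in quarantine, hidden from public view
    RESTORED = "restored"               # reviewed and returned to public view
    REMOVED = "removed"                 # reviewed and taken down


@dataclass
class QuarantinedItem:
    content_id: str
    reported_by: str                    # user who flagged the content
    reason: str
    state: QuarantineState = QuarantineState.PENDING_REVIEW
    flagged_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
    reviewer_note: Optional[str] = None


class QuarantineQueue:
    """Holds flagged content out of public view until Trust & Safety reviews it."""

    def __init__(self) -> None:
        self._items: dict[str, QuarantinedItem] = {}

    def flag_content(self, content_id: str, reported_by: str, reason: str) -> QuarantinedItem:
        item = QuarantinedItem(content_id, reported_by, reason)
        self._items[content_id] = item
        return item

    def is_publicly_visible(self, content_id: str) -> bool:
        # Content under review (or removed) is withheld from public surfaces.
        item = self._items.get(content_id)
        return item is None or item.state == QuarantineState.RESTORED

    def review(self, content_id: str, restore: bool, note: str) -> QuarantinedItem:
        item = self._items[content_id]
        item.state = QuarantineState.RESTORED if restore else QuarantineState.REMOVED
        item.reviewer_note = note
        return item


if __name__ == "__main__":
    queue = QuarantineQueue()
    queue.flag_content("post-123", reported_by="user-9", reason="possible targeted harassment")
    print(queue.is_publicly_visible("post-123"))   # False: held for review, not exposed publicly
    queue.review("post-123", restore=False, note="confirmed TFGBV; removed")
    print(queue.is_publicly_visible("post-123"))   # False: removed after review
```

The key design choice is that quarantined content is neither deleted nor left visible: it sits in an intermediate state until a human decision resolves it.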
The quarantining approach promotes transparency: a dedicated "quarantine" section pulls instances of harmful behavior off the platform and invites users to collaborate with the Trust & Safety team in the flagging and triage process, giving them greater agency. A dedicated UI element such as this, rather than customer support email threads or unattended chatbots, signals that the platform takes these issues seriously while also empowering users. In terms of protection, it ensures that content which might otherwise be removed entirely (which could be seen as censorship) or ignored (which could be seen as negligence) receives appropriate attention. This is especially relevant in content moderation “gray areas,” where the Trust & Safety team may not be certain whether content is problematic; rather than simply removing swaths of content, the team can bring the user into the process and approach the matter thoughtfully.
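To make the gray-area routing concrete, the short sketch below shows one way a report might be triaged: clear violations are removed, clearly benign reports are dismissed, and uncertain cases are quarantined for collaborative review. The confidence score, thresholds, and TriageAction names are assumptions for illustration, not platform policy.

```python
# A self-contained sketch of gray-area triage, assuming a hypothetical
# confidence score from an abuse classifier or reviewer heuristic; the
# thresholds and action names are illustrative only.
from enum import Enum


class TriageAction(Enum):
    REMOVE = "remove"           # clear policy violation
    DISMISS = "dismiss"         # clearly benign report
    QUARANTINE = "quarantine"   # gray area: hold for collaborative review


def triage(violation_confidence: float) -> TriageAction:
    """Route a report by how certain the platform is that policy was violated."""
    if violation_confidence >= 0.9:
        return TriageAction.REMOVE
    if violation_confidence <= 0.1:
        return TriageAction.DISMISS
    # Uncertain cases go to quarantine, where the reporting user can be
    # invited to add context for the Trust & Safety review instead of the
    # content being silently removed or ignored.
    return TriageAction.QUARANTINE


if __name__ == "__main__":
    for score in (0.95, 0.05, 0.5):
        print(score, triage(score).value)
```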