TFGBV Taxonomy
Mitigation Strategy:

Transparent feedback and reporting

Last Updated 6/12/25
Definition: Enhanced Feedback Mechanisms for Reporting and Transparency.
Abuse Types:
Intimate image abuse (IIA); Doxxing; Cyberstalking; Online impersonation; Sexual extortion; Deceptive synthetic media; Internet of things (IoT) abuse; Online harassment; Voyeuristic recording
Impact Types:
Self-censorship; Psychological & emotional harm; Social & political harm; Economic harm; Infringement of rights & freedoms
Targets:
Private individual; Public figure; Organization, group, community
Responsible Organizations:
Digital platform; Internet of Things developer; Regulatory entity; Digital marketplace; Cloud hosting provider; Payment processor / financial service; AI generation organization

The information on this page is adapted with permission from Prevention by Design by lead authors Lena Slachmuijlder and Sofia Bonilla.

According to one study, over half of the victim-survivors who submitted reports to platforms did not receive any update, and 50% of those who did receive a response were told that their intimate images did not violate platform policies (Refuge, 2022).

Build user-friendly, transparent flagging systems that allow users to report abuse easily and receive feedback on the status and resolution of their reports. Transparent reporting builds trust, allowing users to see how platforms handle abuse reports and reinforcing platform commitment to addressing TFGBV. Clear and efficient reporting mechanisms empower users to participate actively in shaping a safe online environment.
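The status-tracking and feedback loop described above can be sketched in code. This is a minimal illustration, not any platform's actual implementation; the class, status names, and fields are all hypothetical, chosen only to show how every status transition can be logged so the reporter sees a full, auditable history of their report.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from enum import Enum


class ReportStatus(Enum):
    RECEIVED = "received"          # acknowledged immediately on submission
    UNDER_REVIEW = "under_review"  # a reviewer has been assigned
    ACTION_TAKEN = "action_taken"  # content removed or account sanctioned
    NO_VIOLATION = "no_violation"  # reviewed and found within policy


@dataclass
class AbuseReport:
    """A user-submitted abuse report with a visible status history."""
    report_id: str
    reporter_id: str
    content_url: str
    status: ReportStatus = ReportStatus.RECEIVED
    history: list = field(default_factory=list)

    def __post_init__(self):
        # Acknowledge receipt right away, so the reporter is never left guessing.
        self._log("Report received; review pending.")

    def update_status(self, new_status: ReportStatus, note: str) -> None:
        """Record every transition so the reporter can audit the full history."""
        self.status = new_status
        self._log(note)

    def _log(self, note: str) -> None:
        self.history.append(
            (datetime.now(timezone.utc).isoformat(), self.status.value, note)
        )


# Usage: the reporter can query status and history at any point.
report = AbuseReport("r-001", "user-42", "https://example.com/post/123")
report.update_status(ReportStatus.UNDER_REVIEW, "Assigned to a human reviewer.")
report.update_status(ReportStatus.ACTION_TAKEN, "Content removed under the IIA policy.")
```

The design point is that the history list is append-only and exposed to the reporter, which is what distinguishes a transparent mechanism from the silent-outcome pattern the Refuge (2022) study documents.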

Examples

  • Block Party: Allows bulk reporting and organizes abusive content for easier review.
  • Pirth.org: Allows users to report online threats across any social media platform. Upon submission, it instantly generates a personalized action and resource list, including digital safety tools, helplines, legal aid, mental health support, and more. With user consent, the Pirth.org team reviews reports and escalates threats to platforms, easing the burden on victims.
  • Instagram’s Reporting System: Offers contextual guidance and feedback for flagged content.
  • Instagram report status: Lets users see the status of reported content or accounts they have submitted, along with the full history of their past reports.
  • Instagram screenshot blocking: Disables screenshots and screen recording for ephemeral images and videos sent in private messages, to combat sextortion and intimate image abuse, which commonly occur when a user sends an intimate image and the recipient captures it to hold for ransom.

References

  • Refuge. (2022). Marked as unsafe: How online platforms are failing domestic abuse survivors. https://refuge.org.uk/wp-content/uploads/2022/11/Marked-as-Unsafe-report-FINAL.pdf
  • Slachmuijlder, L., & Bonilla, S. (2025). Prevention by design: A roadmap for tackling TFGBV at the source. https://techandsocialcohesion.org/wp-content/uploads/2025/03/Prevention-by-Design-A-Roadmap-for-Tackling-TFGBV-at-the-Source.pdf

Opportunities

  • Platforms should adopt interoperable documentation tools, enabling users to collect and share evidence of abuse—such as screenshots and metadata—across multiple platforms. These measures reduce retraumatization, empower victims to take action, and encourage platforms to improve cross-platform accountability for abuse.
  • Platforms can collaborate with or integrate existing reporting mechanisms developed by civil society to track instances of TFGBV across various platforms. This would lighten the burden on platform engineers and indicate a willingness to cooperate with civil society to tackle this important issue.
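The interoperable documentation tools proposed in the first opportunity above could center on a portable, platform-neutral evidence record. The sketch below is an assumption-laden illustration, not an existing standard: the schema name, field names, and function are all hypothetical. The idea is simply that bundling a screenshot's cryptographic hash with its capture metadata lets a victim share the same evidence record across platforms without re-collecting it each time.

```python
import hashlib
import json
from datetime import datetime, timezone


def build_evidence_record(platform: str, content_url: str,
                          screenshot_bytes: bytes, captured_by: str) -> dict:
    """Bundle a screenshot and its metadata into one portable record.

    All field names here are illustrative, not an established schema.
    """
    return {
        "schema": "tfgbv-evidence/0.1",  # hypothetical schema identifier
        "platform": platform,
        "content_url": content_url,
        "captured_at": datetime.now(timezone.utc).isoformat(),
        "captured_by": captured_by,
        # The hash lets any receiving platform verify the screenshot
        # has not been altered since capture.
        "screenshot_sha256": hashlib.sha256(screenshot_bytes).hexdigest(),
    }


# Usage: serialize the record as JSON for submission to any platform.
record = build_evidence_record(
    platform="example-social",
    content_url="https://example.com/post/123",
    screenshot_bytes=b"<png bytes>",
    captured_by="user-42",
)
exported = json.dumps(record, indent=2)
```

Because only a hash of the screenshot travels in the record, the sensitive image itself need not be re-uploaded to every platform, which is one way such a format could reduce retraumatization.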

Limitations

  • Reporting and review may take too long; by the time action is taken, the damage may already have been inflicted.