Mitigation strategies are ways to reduce the impact of various forms of abuse. A single mitigation strategy, properly implemented, can often help address multiple forms of abuse. Our aim is to support the responsible parties in this ecosystem in proactively addressing potential harms where possible, and to accelerate a team’s ability to respond to actively perpetrated harms when it is not.
We have worked with the authors of Prevention by Design and are using direct quotes from their white paper for most of the language in this Mitigation strategies section. To learn more, we encourage you to read their paper itself!
Many of the mitigations we list here are implemented by companies that build the technology that facilitates the kind of gender-based violence this taxonomy intends to address. We at Humane Intelligence are proponents of the proactive consideration of potential harms that might occur on the software you are developing. This proactive approach is often referred to as Safety by Design, and it can eliminate the potential for harm before it occurs — protecting a company’s user base, promoting trust, and reducing the likelihood of being taken to court for negligence or as an accessory to violence.
If you are curious to learn more about Safety by Design, it is explained in greater detail on the Australian eSafety Commissioner’s website.
The mitigation strategies listed here are ones that have been used successfully by different groups, but this is by no means a complete list. If you know of other effective mitigation strategies or develop your own, please let us know.
For the purpose of introducing readers to this space, the relationships displayed between objects in this taxonomy (abuse types, perpetrator types, mitigation strategies, etc.) represent the most common connections we've seen between them. In the real world, any object can relate to any other.
Is there anything you’d like to provide comment on in this taxonomy? We’d love to hear your feedback!