Women and girls face disproportionately high, and often uniquely gendered, risks to their safety in online spaces. These harms range from misogynistic abuse and targeted harassment to image-based sexual exploitation and technology-facilitated domestic abuse. For many, these risks are further intensified by intersecting factors such as race, disability, sexuality, and socio-economic status, creating a compounded pattern of vulnerability that current platform safeguards frequently fail to address.

In response to this growing epidemic, the UK communications regulator, Ofcom, has published new statutory guidance, ‘A Safer Life Online for Women and Girls’, as part of its implementation of the Online Safety Act (OSA). While the OSA already requires platforms to identify, assess, and mitigate risks of online harm, this guidance significantly raises expectations for tech companies. It outlines clear measures platforms should take to reduce gender-based abuse by improving user protections and addressing the systemic failures that enable persistent online harms.

By explicitly calling on tech firms to ‘up their game,’ Ofcom sets a more ambitious regulatory standard: one that shifts responsibility toward proactive safety design, better reporting mechanisms, stronger enforcement, and meaningful transparency. This guidance marks a critical step in aligning platform accountability with the lived realities of women and girls online, challenging the sector to deliver safer, more inclusive digital environments.

The Guidance Broken Down: Focusing on Four Critical Harms

Ofcom’s draft guidance zeroes in on four areas where women and girls face specific threats. The first is Online Misogyny: content that actively encourages or cements misogynistic ideas or behaviours, including sexually explicit content that encourages harmful sexual behaviour and content normalising gendered or sexual violence. The second is Pile-ons and Online Harassment, where coordinated perpetrators target a specific woman or group of women, often public figures, with abuse and threats of violence; these pile-ons frequently involve misogynistic content, threats, and image-based sexual abuse. The third is Online Domestic Abuse, defined as the use of technology for coercive and controlling behaviour in the context of an intimate relationship. Finally, the guidance addresses Image-based Sexual Abuse, which covers intimate image abuse (the non-consensual sharing of intimate images) and cyberflashing (sending explicit images to someone without their consent).

The Importance of This Guidance

The necessity of this guidance stems from the severe impact online abuse has on women’s lives, often extending harmful gender dynamics that exist in wider society. Research confirms that women are significantly more fearful than men about being targeted by all 15 forms of online harm surveyed. This fear often translates into measurable negative psychological consequences, with women reporting significantly greater instances of feeling sad, low, angry, frustrated, and experiencing physical symptoms like insomnia or headaches, compared to men.

Crucially, online harms are actively silencing women. Women report being less comfortable than men expressing political opinions online (only 23% of women compared to 40% of men), challenging content they disagree with, or sharing opinions more generally. For example, fear of being targeted by misogyny is associated with a reduction in comfort when expressing political opinions online. This restricted participation means that gender inequality in public discourse is likely to be perpetuated if women feel too fearful to engage. Therefore, an improved framework is essential to foster safer online spaces and ensure equitable public participation.

How Companies Can Be Proactive: A Safety-by-Design Approach

To truly move beyond mere compliance with the law, tech companies must embrace a safety-by-design ethos by adopting the advisory ‘good practice steps’. This approach covers nine actions across three categories: Taking Responsibility, Preventing Harm, and Supporting Women and Girls.

Under the ‘Preventing Harm’ pillar, providers should conduct abusability evaluations and product testing (Action 4) to identify how features could be exploited by malicious actors before they are rolled out. Furthermore, services should set safer default account settings (Action 5), such as requiring additional authentication steps to prevent perpetrators from monitoring accounts and switching off geolocation by default. To reduce the spread of abuse (Action 6), companies should implement measures like ‘rate limiting’ to cap the volume of responses during ‘pile-on’ attacks and deploy prompts asking users to reconsider before posting potentially abusive content. The use of hash-matching technology to detect and remove non-consensual intimate images is also urged, a measure Ofcom is consulting on making mandatory. This should be supported by Action 2 (conduct risk assessments that focus on harms to women and girls) and Action 3 (be transparent about women and girls’ online safety), which provide the corporate governance wrap-around.
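Ofcom does not prescribe how rate limiting should be built, but the ‘cap the volume of responses’ idea is commonly implemented as a sliding-window counter per sender-and-target pair. The sketch below is purely illustrative: the class name, parameters, and keying scheme are our own assumptions, not taken from the guidance.

```python
import time
from collections import deque


class SlidingWindowRateLimiter:
    """Illustrative sketch: cap how many replies one account may send
    to a given target within a rolling time window."""

    def __init__(self, max_events: int, window_seconds: float):
        self.max_events = max_events
        self.window = window_seconds
        # Maps a key (e.g. "sender->target") to timestamps of recent events.
        self.events = {}

    def allow(self, key: str, now=None) -> bool:
        """Return True if this event is under the cap, False to throttle."""
        now = time.monotonic() if now is None else now
        q = self.events.setdefault(key, deque())
        # Discard timestamps that have aged out of the window.
        while q and now - q[0] > self.window:
            q.popleft()
        if len(q) >= self.max_events:
            return False  # over the cap: hold back or queue the reply
        q.append(now)
        return True
```

In a real service the key might combine sender, target, and content signals, and throttled posts might be queued for review rather than dropped; this sketch only shows the windowing mechanics.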

Regarding ‘Taking Responsibility,’ companies should ensure governance addresses gender-based harms (Action 1) by consulting relevant experts and implementing specific policies. Under ‘Supporting Women and Girls,’ platforms must improve user control (Action 7) by providing tools like bulk-blocking or muting for users facing coordinated harassment. Reporting systems (Action 8) must be accessible and trauma-informed, and platforms should take appropriate action (Action 9) against offenders. This includes requiring measures to prevent serial perpetrators from re-registering or migrating across platforms to continue harmful behaviour. Ofcom also expects companies to continually consult with experts, including women’s safety NGOs and survivors, to inform their risk analysis and design effective policies.

Companies should be aware that while the foundational steps relate to existing legal duties concerning illegal content and the protection of children, the ‘good practice steps’ are purely advisory. However, Ofcom has set out a five-point plan and will publicly report on industry progress in summer 2027. If action falls short, the regulator has warned that it will consider making formal recommendations to the Government on strengthening the Online Safety Act.

If you are looking for guidance in implementing the OSA in your organisation, get in touch. 

Book a Call

We have experts here to help you