

The ASA’s Report on Harmful Depictions of Women in In-app Ads
TL;DR:
The ASA monitored in-game mobile ads and found 8 that harmfully objectified women. It examined how the ads ended up in games and used 4 case studies to gain insights into how the ad supply pathway could be improved to stop this content being displayed.
'In-app ads that harmfully objectify women: findings revealed'
Here's our overview of the report and how it affects the mobile games industry. If you find it useful and want deeper analysis, tailored advice and support, or help with strategy, then get in touch – we’d be happy to discuss how Flux Digital Policy’s expertise can meet your needs. Even if you’re just curious, we'd love to chat.
Why did the ASA act?
Since 2023, the ASA has investigated several in-app ads for mobile games that harmfully objectified women or otherwise risked condoning violence against them. The ads included imagery relating to sexual violence and assault, harmful stereotypes, and sexual objectification. Some of these appeared in apps likely to be used by children.
Understandably concerned by what they found, the ASA launched a project to uncover how these ads ended up in games. They have completed this work and today published their report, which aims to improve the mobile advertising sector's ability to prevent harmful content that contravenes the Advertising Codes from appearing in apps and games.
Why should the games industry take notice?
This kind of content has no place in ads, least of all those seen by children. Players do not have a choice over the in-game ads that they see, and the sudden, unrequested appearance of content that depicts and condones assault and sexual objectification is harmful and upsetting – that's why the ASA has taken action.
Playing games should be a safe and enjoyable experience, and while the ads in question are few in number, harmful depictions of women can pose a significant risk, particularly to children. The continued appearance of these ads is severely detrimental to the reputation of the mobile games sector as a safe place to play and contributes to wider concerns about women’s experiences in video games.
Although advertisers bear primary responsibility for the content and intended targeting and delivery of their ads, intermediaries and publishers (the games in which the ads appear) should be doing everything they can to prevent harmful advertising from being delivered to players. The report identifies ways in which this could be done and encourages the sector to engage with the regulator – you can be part of the solution.
What did the ASA do?
The ASA engaged cyber security consultants to create 2 adult and 2 child profiles and monitor ads delivered in 14 game apps over the span of 3 months. They collected 5,923 ad impressions – 8 of these contained content that the ASA judged to be harmful depictions of women, akin to those already investigated.
Concerningly, the ads included content that appeared to condone relationships between teachers and underage pupils, as well as non-consensual sexual activity; worse, 7 of the ads were delivered to the child profiles. While this is only 0.14% of all ad impressions collected during the project, harmful depictions of women pose a significant risk, particularly to children.
The ASA contacted the advertisers (the companies that made the ads), the intermediaries (the networks that place the ads in games), and the publishers (the companies within whose games the ads appeared). They asked for their comments on how the ads came to appear in the monitored games, then drew together insights as to how each party in the pipeline can help to prevent this type of harmful content being served in games.
What did the games companies say?
Advertisers
Of the four advertisers, two said that they had removed the ads from circulation and outlined concrete plans to improve processes like moderation and staff training to prevent similar content being used in future. One said that they hadn’t owned the game at the time and therefore weren’t responsible for the ads, although they committed to complying with advertising standards in their own activities. The fourth didn’t reply.
Intermediaries
The intermediary said that they had a content policy prohibiting this type of ad and that they operated a moderation process. They outlined concrete plans to review and improve these systems and their training materials.
Publishers
Most publishers acted to remove the ads from circulation and gave details of the restrictions they applied through the intermediary. These ranged from an age-gate based on reported ages, through to extensive use of several tools and blocking policies. In more than one case where these stricter measures had failed to block an ad, the likely cause was mis-categorisation of the ad. One publisher did not respond, and another did not give details about how they managed content.
App stores
Because the project focused on mobile apps and games, the ASA also contacted Apple and Google for comment; Google confirmed their policies in relation to their Google Ads service, and Apple confirmed theirs in relation to ads appearing in third-party apps available from their App Store. They stated that these policies would prohibit the type of content seen in the project.
What were the ASA’s findings?
The ASA did not uncover any systemic problems; all instances of the ads appearing seemed to be the result of individual ad hoc issues. As such, their insights focused on how these arose. Broadly summarised, they said:
Intermediaries and publishers were taking steps to prevent the ads appearing and there was no evidence that they placed the ads deliberately or through carelessness.
Not everyone in the pipeline was sufficiently knowledgeable about the UK Advertising Codes, leading to inadequate moderation or restriction processes. However, no respondents defended the ads under their own policies.
Certain games carry an inherently higher risk of generating harmful ads because they feature highly sexualised themes. Marketing content is subject to stricter controls than gameplay itself, so not all in-game content is suitable for use in marketing.
Misclassification of content was a frequent factor. Some of the ads had either been labelled as a less restricted category (such as ‘puzzle’), not flagged as containing sexual content, or categorised too broadly to be caught by intermediary or publisher restrictions.
What will the ASA do now?
The Committee of Advertising Practice (the ASA’s sister organisation) has produced specific guidance on the issue, building on the significant wider resource base already available.
The ASA will continue to engage with all parts of the in-game ad supply pipeline and is considering ways in which to further this work, as well as using the report as a foundation for discussions with other regulators, policymakers, and the wider industry.
Finally, they have made clear that, given how straightforward it was for them to find examples of harmful ads, intermediaries and publishers should consider what more they could do to detect and remove such content.
How can Flux help me?
Our team includes a policy expert who led CAP’s work on video and mobile games regulation for several years. Regardless of your size, reach or platform, Flux is perfectly placed to help games companies and ad intermediaries understand their obligations and responsibilities in this area, including identifying ways in which they can act on the insights from the report. We can also help you engage with the ASA and CAP as they continue to look at this issue.
If you want to get involved, let’s chat.
Author: Dr Celia Pontin, Director of Public Policy and Public Affairs