How in-game advertising is reacting to kid safety concerns in the digital realm
Kids' safety in the digital realm is an issue that tech companies and advertisers have grappled with for a long time. Recently, we have seen many stories in the press exposing malpractices at places like Google and TikTok, showing that we still have a long way to go.
In response to rising concerns, we see digital platforms like Roblox, which recently announced it would hide advertisements from users aged 13 and younger, and governments trying to implement policies to better protect kids. In the US, Congress is considering a controversial bill called the Kids Online Safety Act (KOSA), which aims to create restrictions for online platforms that will, in theory, protect kids from potentially harmful content. In the UK, the government in June amended the Online Safety Bill to bolster protections for children online.
The problem with implementing all these local, international, and platform-specific rules is that it puts many advertisers off trying to reach kids: they either don't understand how to comply or worry they could be penalized under one of the many different laws they need to navigate.
During a discussion Anzu held around kid safety, Denise Tayloe, co-founder and CEO of Privo, a children’s online privacy, age assurance, and consent management platform, said that simply ignoring this audience is not an option. “Kids are now engaging in digital experiences from as young as two. They grow up and become long-time customers, so how can brands leave them out and ignore them? Instead, they need to embrace them and open up their doors in a reasonable way”. Also, according to research by the NRF, 90% of parents say their children influence their purchase decisions.
One reason for the lack of privacy protocols protecting kids online is that these environments were not originally designed with kids in mind. As a result, solutions, laws, and regulations are being shoehorned into an existing ecosystem rather than built in from the ground up, which would be far easier.
Where does in-game advertising come in?
In-game advertising is a relatively new channel. This means we have had the opportunity to build in standards and regulations from the start to protect vulnerable audiences and to learn from the mistakes made across other platforms. The great thing about in-game advertising is that, due to how gaming works, several different layers help safeguard audiences, advertisers, and game developers.
You have protections at the local and international level through child protection laws like COPPA. You then have protections at the game level, including PEGI ratings and app store ratings like those found on Google Play, the iOS App Store, and Steam. You have protections at the adtech level, which include standards and guidelines set by the IAB and other industry bodies around the globe.
You then have protections on a solution level like those we put in place at Anzu around the games we work with, the audiences we can target, and the areas our ad placements are built into. On top of all this, we also work closely with a number of well-known third-party adtech vendors who verify and report on the performance of all our traffic.
We need to talk about targeting
An area where many advertisers have been called out on privacy is targeting and the use of personal data, especially around kids. Many are looking to in-game advertising as a future-proofed channel because, unlike most traditional advertising channels, it has never relied on cookies. Instead, it offers advertisers several solutions that provide precise audience targeting across platforms and devices without persistent user identifiers, so that users retain their privacy.
The first is statistical demographic targeting, which combines a user's estimated, non-precise location with statistical location-based demographic data. The result? Advertisers gain access to an array of audience targeting options spanning age groups, gender, family status, and more, all without the need to store specific user data. This allows advertisers to effectively target households with kids without relying on or collecting any personally identifiable information (PII).
Next, and currently one of the most widely discussed methods, is contextual targeting. It revolves around matching game titles to distinct audience types with well-defined demographic attributes and interests. This audience selection is then translated into targeting game bundle IDs based on contextual targeting product data. For instance, a Barbie-themed game might align with predominantly young female audience groups. When an advertiser selects these audience segments, the targeting encompasses not only the Barbie game bundle IDs but also bundle IDs of other games with a substantial audience of a similar profile.
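In simplified terms, this kind of contextual matching works like a lookup from an audience segment to the set of game bundle IDs whose players predominantly fit that profile. The following is a minimal sketch of that idea; all bundle IDs, attribute labels, and profile data are invented for illustration and do not reflect any vendor's actual product:

```python
# Hypothetical contextual-targeting sketch: each game bundle ID is
# mapped to the audience attributes its players predominantly share.
# No user-level data is involved; only game-level profiles.
BUNDLE_PROFILES = {
    "com.example.barbie_dreamhouse": {"young", "female"},
    "com.example.fashion_stylist": {"young", "female"},
    "com.example.tactical_shooter": {"adult", "male"},
}

def bundles_for_segment(segment: set[str]) -> list[str]:
    """Return bundle IDs whose audience profile contains every
    attribute in the advertiser's chosen segment."""
    return [
        bundle
        for bundle, profile in BUNDLE_PROFILES.items()
        if segment <= profile  # segment is a subset of the profile
    ]
```

Selecting a "young female" segment here would return both the Barbie-themed bundle and the similar-profile stylist game, mirroring how a real contextual product extends targeting beyond a single title to games with comparable audiences.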
Finally, it's common for games today to ask users to input information, such as birth date, age group, gender, and more, depending on the game developer's need to collect first-party data. When publishers choose to share this anonymized information with advertisers, it enables ad targeting based on age, location, language, and gender, all while preserving user privacy and not identifying specific individual users or their cross-device and cross-app behavior.
These are three of the most common ways to target using in-game advertising today. However, it’s important to note that there are many more. At Anzu, we continuously work to understand how to create meaningful connections between advertisers and players in a privacy-safe way, working with the broader industry to ensure gaming remains a safe space and everyone in the chain is protected and respected.
Originally published on The Drum.
Nick works as Anzu's Content Lead. As a gamer with a background working in AdTech, he has a unique perspective on the industry and the in-game advertising sector.