From the launch of the Magnavox Odyssey back in 1972 to the high-end graphics and immersive experiences offered by today's PS5 and Xbox Series X, console gaming has been a force to be reckoned with. While some things have remained the same, a lot has changed, especially when you look at how console game monetization has evolved.
Console gaming began as a simple, straightforward transaction: buy the console, buy the game, and play. There were no in-game purchases or downloadable content. Although these monetization methods began appearing in PC games as early as the 1990s, with The Bard's Tale Construction Set, released by Interplay Productions in 1991, noted as one of the first games to receive DLC, it took a long time for the console ecosystem to consider new forms of monetization beyond the pay-to-play model.
With blockbuster console titles from well-known franchises like Spider-Man, Mario, Call of Duty, and EA Sports FC (formerly FIFA) continuing to dominate the headlines, you could be forgiven for thinking pay-to-play is still the platform's only monetization method if you aren't a console player. However, this couldn't be further from the truth.
As technology and the gaming industry evolved, connection speeds improved, and most games went digital, new monetization methods such as microtransactions and downloadable content (DLC) were gradually introduced to the console gaming ecosystem. These allowed game publishers to generate continuous revenue from a single game.
A well-known example is FIFA's Ultimate Team mode, where players can pay real money to unlock new players. A study that surveyed over 1,500 FIFA players on their in-game spending habits found that 38% admitted to spending some form of real-world currency to attain success in the game, and while most of the spending was relatively modest, some worrying extremes were revealed. One player admitted to spending over $280,000 on FIFA points, and 17% of those surveyed reported spending between $100 and $500.
This fed into a wider conversation around microtransactions, and more specifically loot boxes, in gaming, which led many countries to begin banning or restricting their use. The issue continues to be debated globally and doesn't look like it will be resolved anytime soon.
Downloadable content provides an additional revenue stream because it lets developers release new content for an already-released game, keeping the game fresh and exciting while encouraging players to keep playing. DLC has been a familiar part of PC gaming for a long time; many of us remember juggling about ten different discs for The Sims and its expansion packs and having no idea which one we needed to play the actual game! However, DLC took a while to arrive on consoles for a number of reasons, including limited storage space and the fact that it took consoles a long time to adopt a digital-first mindset.
DLC covers a wide range of content, from in-game outfits and weapons to whole new levels, worlds, and activities. It has come a long way within the console ecosystem, to the point where it's now almost impossible to find a game that doesn't offer some kind of DLC add-on to elevate your playing experience. The main problem from a monetization standpoint is that developers have to rely on players wanting to purchase extra content.
As a developer, you are taking a chance that the extra content is worth paying more for, which means revenue generated through this type of monetization is often unreliable and inconsistent. Just as with the pay-to-play model, many developers see an uptick in revenue around launch and then a huge drop-off once the hype around the game dies down. There are a number of exceptions, however, and this is especially true when it comes to free-to-play titles.
Mobile gaming and the launch of the App Store were a huge turning point for free-to-play titles, and their popularity has helped a number of them thrive within the console ecosystem. Fortnite is often held up as the prime example of a free-to-play console game that continues to perform extremely well, and it helped kick off a stream of free-to-play console games, including Call of Duty: Warzone, Rocket League, and Fall Guys.
The rise of free-to-play titles has begun to disrupt the traditional pay-to-play model and challenged many developers to rethink how they position and monetize their console games. Unlike free-to-play mobile titles, where many early games were funded by disruptive advertising, console players are not, and never have been, used to seeing ads in their games, so developers have continued to lean on DLC and microtransactions to fund free-to-play titles. However, this is changing.
Intrinsic in-game advertising gives advertisers a way to reach a highly engaged audience in a unique and immersive environment, and it allows game developers to generate revenue without disrupting the gaming experience. This form of advertising is already extremely popular in many PC games, where developers place ads exactly where you would expect to see them in real life: around sports stadiums, alongside racetracks, and on the sides of buildings and bus stops. By replacing fake ads with real ones, developers can generate consistent, reliable revenue directly from advertisers, meaning they don't have to pass the costs along to players.
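To make that "replace fake ads with real ones" idea a little more concrete, here is a minimal, hypothetical sketch of how a game might resolve its ad surfaces at level load: each billboard or stadium hoarding asks an ad server for a real creative and quietly falls back to its placeholder artwork if nothing is available. Every name here (AdSurface, fetch_ad_creative, ads.example.com) is an illustrative assumption, not the API of any particular intrinsic-advertising SDK.

```python
import urllib.request
from dataclasses import dataclass

# Hypothetical in-game ad surface: a billboard, hoarding, or bus-stop poster
# that normally displays a placeholder ("fake") ad texture.
@dataclass
class AdSurface:
    surface_id: str
    placeholder_texture: str  # path to the default, non-sponsored artwork
    width_px: int
    height_px: int

def fetch_ad_creative(surface: AdSurface, ad_server_url: str) -> bytes | None:
    """Ask a (hypothetical) ad server for a creative sized for this surface.

    Returns the image bytes, or None if no campaign is available, so the
    game keeps its placeholder texture and the experience is unchanged.
    """
    url = (f"{ad_server_url}?surface={surface.surface_id}"
           f"&w={surface.width_px}&h={surface.height_px}")
    try:
        with urllib.request.urlopen(url, timeout=2) as resp:
            return resp.read()
    except OSError:
        # Network failure or no fill: fall back silently, never block gameplay.
        return None

def resolve_textures(surfaces: list[AdSurface], ad_server_url: str) -> dict[str, object]:
    """Decide, per surface, whether to show a real creative or the placeholder."""
    resolved: dict[str, object] = {}
    for surface in surfaces:
        creative = fetch_ad_creative(surface, ad_server_url)
        resolved[surface.surface_id] = creative or surface.placeholder_texture
    return resolved

if __name__ == "__main__":
    surfaces = [
        AdSurface("stadium_hoarding_01", "textures/default_hoarding.png", 1024, 256),
        AdSurface("bus_stop_poster_03", "textures/default_poster.png", 512, 768),
    ]
    # "ads.example.com" is a stand-in; a real SDK would handle this exchange for you.
    textures = resolve_textures(surfaces, "https://ads.example.com/creative")
    for sid, tex in textures.items():
        kind = "real creative" if isinstance(tex, bytes) else "placeholder"
        print(f"{sid}: {kind}")
```

The key design point the sketch illustrates is that the ad request is optional and fails soft: when no campaign fills the slot, players simply see the same in-world artwork they would have seen anyway.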
With Xbox and PlayStation both having announced their intention to explore this form of advertising, it won't be long before more console developers use this monetization method to keep offering players the best possible experience while covering the high costs and resources required to build and launch console titles.
To find out whether monetizing with in-game ads could be right for your games, get in touch by leaving your details below.