For the vast majority of people, gaming is a hobby. It’s a way to blow off steam, have fun, and spend time with friends. But for some, gaming is a business or a career. Of course, there are some professional players who compete in esports or who earn their crust by streaming their gaming sessions on sites like Twitch, but most people who work in the industry have jobs that involve the creation and distribution of titles.
Video game development is big business. There are around 300,000 people employed in the industry in the US alone, with a further 30,000 in the UK and similar numbers in other European countries.
While it is pretty likely that many of these people got into the industry because of their passion for the medium and the desire to create great content, the fact remains that they need to be paid.
To cover their salaries and the other costs that go into the development and distribution of new games, publishers have to find ways to charge players who enjoy them. Over the decades, their approach has changed several times and has almost come full circle.
Pay Per Play
The pay-per-play model was the most obvious choice for video game publishers in the early days of the medium. They placed large machines in public spaces like pubs and arcades and charged players a fee for a set number of minutes and/or lives. Once those ran out, players had to insert more coins if they wanted to keep enjoying the game.
The model replicates the one used by jukeboxes, pool tables, and vending machines, all of which preceded these arcades.
The pay-per-play model remains in use today, both in arcades and among iGaming companies. For example, sites that offer poker for real money require players to make a deposit before they can join a table. Then, just like in any other setting, players place wagers in rounds as the cards are dealt.
Pay Up Front
As technology improved and players could get their own computer or console for their homes, the gaming industry shifted the way it monetized its content. Instead of charging people per play, publishers would sell them the entire game in one go, allowing them to install it and run it at home as often as they wanted.
Even as the distribution methods and mediums have changed, the monetization model has remained the same. That’s why you can now buy a game from a platform like Steam and have it downloaded to your computer in a matter of minutes. This improves convenience for the player and reduces costs for the distributor.
This model has remained the most common option from the 1980s through to the present day, but it is slowly being replaced as publishers look to cover the ever-increasing costs of creating modern titles.
Microtransactions
Microtransactions first began appearing in the 2000s as a way to sell players additional in-game content after they’d already bought the game. Early examples include map packs for Call of Duty and new cars for Forza.
This was the second attempt by publishers to create additional income from their games.
The first came in the form of an online pass that players would receive for free if they bought the title new, but would have to buy separately if they bought the game second-hand. This was a way for publishers to cash in on the resale market, which they had previously been excluded from.
Gamers hated the idea of being charged to access something that already came with the game, but they were more than happy to pay extra for additional content that was created after it was shipped.
Over time, microtransactions have become so lucrative for developers and publishers that they’ve begun giving their games away for free and designing the entire title around in-game items that players can buy.
Leading brands like Take-Two Interactive now make around half of their revenue from these small but regular payments from players. They are so successful that we are likely to see even more of them in the future.
This does, in a way, bring us back full circle, with players regularly making payments to enjoy their favorite games, just like they did with arcade games in decades gone by.