Video games have seen tremendous technological evolution over the past four decades, from the days of Pong and Asteroids to virtual reality and the ever-more-lifelike characters, settings, and mechanics found in games such as Spider-Man, God of War, and numerous other Triple-A titles. The industry has come a long way since 8-bit graphics were the cutting edge of visuals. Yet alongside the industry's ever-evolving technology, the retail market (the price of games and the methods of selling them), as well as the principles that shape what is being sold, have changed too, bringing with them the introduction of microtransactions.
In the past, when we purchased a video game for $50 or more, we expected, as consumers, that we were buying the full product, the complete package. Yet in today's generation of updates and new releases, with microtransactions such as DLC, season passes, loot crates, and more across online and console gaming, the product we purchased, and even our standing in the game's rankings, may look completely different at the end of the year than it did at the beginning. In this article, we will cover a brief history of microtransactions. Before we continue, we should define exactly what microtransactions are.
Microtransactions are most simply understood as (usually small) online monetary transactions for in-game items or content: you pay real-world money in exchange for something inside the game. They can encompass anything from unlocking a new character to buying in-game currency to paying for additional missions or quests. More broadly, they may also include subscription fees to access online multiplayer modes or games.
In the late 1970s and the ’80s, before gamers migrated to their homes with the NES (Nintendo Entertainment System), they found a community at the local arcade, where, for a handful of quarters, you could play games like Asteroids, Donkey Kong, Pac-Man, Galaga, and more. Some consider these arcade machines the earliest precursor to microtransactions in gaming.
1999 - 2009
Amidst the impending doom of Y2K came the birth of the MMORPG, with EverQuest releasing in 1999. It introduced a revolutionary approach to massively multiplayer online gaming that would inspire countless MMORPGs over the next decade, and it opened the floodgates for new revenue models, specifically subscription-based play, where you pay a monthly fee in order to keep playing the game. The model proved incredibly successful for later games, most famously World of Warcraft (2004). Subscription-based gaming emerged as a long-term replacement for the quarters in the arcade machine.
With the release of new consoles such as the Xbox, and later the Xbox 360 and PS3, came subscription services for online multiplayer, offered through Xbox Live, where players could pay to play games such as Halo 3 against other players on consoles around the world. New consoles also brought online marketplaces or stores that sold in-game content, as with Bethesda's Elder Scrolls IV: Oblivion, where you could purchase spells, horse armor, or other inconsequential items for a couple of dollars.
Around 2006, the loot box or "loot drop," the mechanic the general public most closely associates with microtransactions, got its start with ZT Online, an incredibly popular Chinese MMORPG in which players paid real-world money for in-game treasure chests offering a chance at incredibly rare items. It is often seen as a gamble, as there is no guarantee you will get what you want.
Mobile gaming went through a drastic change during this decade. Where it once consisted of paying your provider a couple of dollars for a small pixelated game like Tetris on your flip phone, the emergence of smartphones like the iPhone created an entirely new mobile "app" marketplace for gaming. This, in turn, raised the quality of available mobile games and introduced gamers to "free-to-play" titles, with Angry Birds (2009) becoming one of the most popular apps on the market. Yet with all this content that appeared to be free, a new revenue model emerged: "freemium" gaming. "Freemium" refers to free-to-play games that encourage the purchase of in-game items or currencies in order to progress faster than you would without them.
2010 - Today
One of the most famous examples of freemium gaming is Clash of Clans, released in 2012 by Supercell. In 2016 alone, Supercell's Clash of Clans and Clash Royale earned $1 billion from this freemium business model. Given such success, it is no wonder that free-to-play gaming has become such a massive segment of the mobile market, with Candy Crush Saga, FarmVille, and countless others getting a piece of the action.
Mobile gaming wasn’t the only market to evolve this decade. After 13 years, EverQuest, the MMORPG that started it all, switched from a subscription to a free-to-play model, favoring microtransactions as its source of revenue. In addition to MMORPGs, MOBAs (Multiplayer Online Battle Arenas) became increasingly popular with the creation of League of Legends (2009), DOTA 2 (2013), SMITE (2014), and Heroes of the Storm (2015). Paying for in-game currency to purchase new skins or unlock new characters has become a common occurrence in such games.
The new generation of console gaming and Triple-A titles has more or less embraced the various revenue-earning methods of microtransactions. It is not at all uncommon to find season passes that pre-sell DLC or other additional content before release, as well as loot crates, skins, and emotes in popular games such as Overwatch, or unlockable characters in games like Star Wars: Battlefront II or Middle-earth: Shadow of War. Yet despite the growing presence of microtransactions in gaming, there has been no shortage of displeasure and anger within the gaming community.
While inconsequential items such as character skins, or other items that don't affect players' ability to excel in-game, are not quick to anger players, one of the most common criticisms of microtransactions is the paywall they put up between players and actual progress: key content of a game players have already paid for is locked behind loot boxes or further in-game purchases.
EA was at the center of a massive microtransaction pushback with the 2017 release of Star Wars: Battlefront II and its loot crate system. Some players estimated they would need to spend around $2,000 of actual money to unlock all of the game's content in a timely manner, including characters like Luke Skywalker and Darth Vader, who couldn't be purchased outright. According to a report by CNBC, the controversy contributed to an 8.5 percent plummet in EA's stock, wiping out $3.1 billion in value.
EA eventually removed loot boxes from the game as of March 2018, but the mounting controversy around microtransactions such as loot boxes has prompted legislative work in the USA and China, both to set age limits on who can purchase loot boxes and to decide whether they constitute a form of gambling. In-game purchases have become a serious problem in China, and as a result the government passed legislation in 2016 requiring game producers to be accountable and transparent about the actual odds of earning specific items from loot boxes.
What Does the Future Hold?
Despite the increasing controversy, microtransactions continue to generate huge amounts of revenue for the gaming industry every year. To keep up with future trends, we must all keep an ear to the ground for new releases from the major gaming studios, to see what lessons, if any, the industry has learned from the pushback against EA and other Triple-A titles.