Blame Activision for Ruining Gaming
Published on Feb 2, 2024
Background
Before I was born, my mother was an avid gamer in the late 90s, always hunting for game CDs or bootleg Atari cartridges sold in Egyptian book stores to bring back home and play on her home computer. She watched the tech around games advance around her, and she even convinced my dad to upgrade our home PC to a Pentium 4 when it was released a few months before I was born, just so she could continue playing the latest games. Even after I was born, she kept sharing her love for gaming by teaching me and my sister how to use the computer and play games on it, and by letting us play on her Atari.
But as time went by, my mother couldn’t keep up with all the hot new advances in gaming tech, her life over-encumbered with work, household chores and taking care of me and my siblings, so she passed her gamer’s crown down to me. Ever since, I’ve been in love with the world of gaming, and have continued watching it advance and grow in her place from a niche hobby into a medium much bigger than TV and movies.
I’ve been in so many gaming communities online and offline, played in LAN parties, taught myself how to make games, studied how most games work behind the scenes from a technical and creative perspective, and was even part of the golden Xbox Live era, when multiplayer games were at their peak.
But as I exited my edgy teenager phase, I noticed that gaming stopped evolving. Things stopped being exciting, and there were very few things I could enthusiastically share with my mother about gaming over breakfast. At some point, she noticed my enthusiasm for gaming had disappeared and assumed a new hobby or hard studying was the cause; in reality, I was just waiting for something new to be excited about.
If you’ve been on the gaming side of YouTube for the past 5 years, you might have noticed a trend of people complaining about gaming not being fun anymore. That gaming has become worse. That newer games are just not as good as the old ones. For a short while, I seemed to agree with those videos, until I started thinking about why things have gotten worse.
It is not because of nostalgia: games nowadays are rotten at their core.
Rotten Cores - Software
Some History
Video games are complex pieces of software that require several moving parts to function and different creative minds and talents to create. A game developer is not just a programmer who knows how to make computers go beepboop; game developers come in different types and specialties, and they all have to work together and combine their expertise to create the tools and systems that form the foundation of every single thing you will see, interact with and hear in the game you’ll be playing.
But not every game developer knows how to do art or music, and not every developer knows how to code a hidden-surface determination algorithm or touch low-level rendering code. So in the 1980s, video game studios started developing their own sets of standard tools to accelerate their game development process. This allowed them to shift their focus from writing low-level code that communicated with the hardware to designing the best games they could think of, games that would stand out on store shelves.
Today, these tools are all grouped up in a single piece of software, and we call it a “game engine”.
If you want a game to play a sound nowadays, you don’t need a technical sound engineer to write a system that reads data from a sound/MIDI file and streams it to your Sound Blaster Pro. Today, you just tell the engine to play the sound and call it a day; you get 3D stereo sound by default and can set up effects within a few clicks.
Game engines are an abstraction that lets non-technical minds spend their time creating what really matters: the video game.
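To make the contrast concrete, here’s a minimal sketch of what “just tell the engine to play the sound” looks like today. I’m using Python and the pygame library as a stand-in for an engine’s audio layer, and the file name is made up; the point is only that decoding, mixing and talking to the sound card are no longer your problem.

```python
# Minimal sketch: playing a sound through a high-level framework (pygame here,
# standing in for any modern engine's audio API). The file name is hypothetical.
import pygame

pygame.init()                                 # the framework sets up the audio device for you
shot = pygame.mixer.Sound("gunshot.wav")      # decoding the file is handled behind the scenes
shot.play()                                   # one call; no buffers, no DMA, no Sound Blaster registers

pygame.time.wait(1000)                        # keep the script alive long enough to hear it
```

Compare that to the days when someone on the team had to write the streaming and mixing code themselves for every sound card they wanted to support.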
Game developers have no control or say in most engines they use nowadays. They willingly accepted the convenience of bloated engines like Unreal Engine, which has led a large portion of the industry to adopt the workflows of these engines, right down to how their teams are managed.
This left many studios with in-house engines facing difficulties, as investors and upper management who only see numbers continuously pushed developers into releasing half-finished products to meet quotas and deadlines, deadlines designed around the timeframes of studios that use fully packaged game engines and have no concept of tooling or engine maintenance.
Higher-ups in game development spaces are not game developers, so they don’t understand, or even think about, why a team that has to maintain an in-house engine that only receives new features when needed takes longer to develop a product than another team that uses a bunch of ready-made tools and a library of assets bought from the engine’s store. Their understanding of game development time is not based on experience; it originates from the commercial engine makers themselves. Engines like Unreal market themselves as the professional engine that turns a week-long task into a one-click solution for game developers, which makes the higher-ups drool, because all they can hear in Unreal’s marketing is “less time, more money”. As a result, these higher-ups or the studio’s investors demand that the new feature be used in the next product, without understanding any of its implications or designing a “fun” game around it.
And as the higher-ups kept squeezing developers’ time limits, believing that “new engine/tech means less time to make money”, game developers were forced to take shortcuts to meet deadlines, which led to many games releasing in an unfinished state and running like complete dogshit on day one, full of bugs and glitches.
And to make it all worse, new game developers don’t even understand the concept of hardware limitations, because engine requirements keep rising. Why bother teaching new developers how to optimize a game if your own engine is constantly becoming harder to run and you never have time to optimize anyway? As long as it runs at 30 FPS on consoles, who cares? Gamers on PC will gobble up Nvidia’s Shrek cock and buy the latest GPUs to run your game.
So Why? Why are we in this mess? How did we even get here?
Rotten Cores - Consumers
Back in the late 2000s, gamers were united on one simple phrase: “Mobile games are not real games.”
Gamers were heavily against the exploitative nature of mobile games when they took the world by storm as Androids and iPhones became common in the hands of the average Joe. If you liked a mobile game, you’d receive a 20-page essay on why you were being exploited by greedy mobile developers, and how your data would be sold to some Chinese hitmen in some Balkan country because you watched an ad to get an extra life in some match-3 game. This probably also had some roots in xenophobia and racism, as the more successful mobile games at the time were gacha games that originated from Asia.
As a result, most higher-ups at game studios stayed away from the idea of aggressively monetizing PC and console games, especially after the release of Oblivion, which led the gaming community to constantly mock any form of exploitative monetization as “horse armor DLC”. However, one company was not shy about experimenting on a crowd of gamers that behaved much like mobile gamers, and that company is Activision.
Activision is the publisher and the owner of the game development studios behind the Call of Duty franchise. For those living under a rock, Call of Duty is one of the best-selling game franchises of all time, and it is mostly played by casual players who suffer from what I call “FIFA syndrome”.
FIFA syndrome is when someone gets into gaming but decides to play only one game franchise for the rest of their life instead of exploring other gaming experiences. It is named after how football (soccer) gamers tend to buy the same FIFA game over and over again every year, and usually end up incredibly unhappy about how bored they are of playing the same game.
Due to the disconnect between the average gamer and gamers with FIFA syndrome, Activision took the opportunity to test out an exploitative monetization scheme in the form of loot boxes in 2014.
Call of Duty: Advanced Warfare (AW) forced players to grind multiplayer for hours to earn Supply Drops, each with a very slim random chance of containing the specific loadout item they wanted, items that affected gameplay. This gave players an edge if they rolled a very specific weapon variant, and it introduced the concept of limited-time items via the “Retired” rarity, which you can see in popular games like Fortnite today. All of this could be bypassed if you paid $1.99 USD for a single Supply Drop, or $39.99 USD for 28 Advanced Supply Drops.
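To show why that pricing is so insidious, here’s some back-of-the-envelope math using the numbers quoted above. The 5% drop rate for a specific weapon variant is purely an assumption for illustration; Activision never published the real odds, and the actual rates for rare variants were widely believed to be far lower.

```python
# Rough sketch of the supply-drop economics quoted above.
# The drop rate is a made-up assumption; only the prices come from the article.
price_bundle = 39.99          # USD for 28 Advanced Supply Drops
assumed_drop_rate = 0.05      # hypothetical chance a drop contains the variant you want

per_drop = price_bundle / 28                  # ~$1.43 per drop when bought in bulk
expected_drops = 1 / assumed_drop_rate        # geometric distribution: ~20 drops on average
expected_cost = expected_drops * per_drop     # ~$28.56 expected spend for ONE specific item

print(f"bundle price per drop:            ${per_drop:.2f}")
print(f"expected drops to hit the item:   {expected_drops:.0f}")
print(f"expected cost for that one item:  ${expected_cost:.2f}")
```

Even under that generous assumption, chasing a single gameplay-affecting item costs more than many full games, and the randomness means plenty of players pay far more than the average.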
Loot boxes were not a new concept at the time. Valve had added crates to Team Fortress 2 in 2010 and made the game free to play the following year, monetizing in-game cosmetics locked inside crates that required purchased keys to open. Despite that, TF2 loot boxes were considered “fair” because they were purely cosmetic and didn’t give any player a competitive edge the way most mobile games do.
With no backlash from the casual gamer with FIFA syndrome, and with Valve not visibly exploiting gamers in an unethical way, Activision and the rest of the gaming industry took this as the “OK” sign to start milking their consumers more aggressively. Activision saw record profits from their experiment, even though most hardcore Call of Duty fans criticized the game’s sci-fi themes and gameplay, which felt out of place in the franchise.
The mobile-fication of video game monetization had begun, as higher-ups everywhere signaled their studios to start working on live service titles to sell microtransactions. The lucky games that avoided that curse received horse-armor-grade DLC instead. By the time gamers noticed what was going on, the damage was already done.
This was just the start of Activision’s damaging contributions to the gaming industry; sometime after 2014, they began preparing for yet another experiment.
In 2014, a group of gamers tried to get Activision’s attention and push for dedicated servers for their titles, as many Call of Duty games ran on a peer-to-peer (P2P) architecture. Before 2014, this was seen as an effective way to save money on multiplayer games, since publishers did not have to pay for game servers or maintain them after the game was released. However, as games started moving towards a live-service model, many publishers (including Activision) started seeing P2P as a way to lose money, because the publisher has no control over game lobbies and what happens in them. If you’re running a P2P game, the host can easily mod the game to give players items locked behind loot boxes and item shops, or outright host modded lobbies.
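To illustrate why publishers saw P2P hosting as a threat to their storefronts, here is a rough, hypothetical sketch of the trust problem. The class names, items and logic are invented for illustration and are not how any Call of Duty title actually works; the point is simply that in a P2P lobby the host’s machine is the authority, so a modded host can hand out whatever the item shop was supposed to sell, while a publisher-run dedicated server keeps that authority for itself.

```python
# Hypothetical sketch of the authority difference between P2P and client-server
# lobbies. Names, items and rules are invented purely for illustration.

PAID_ITEMS = {"gold_camo", "retired_weapon_variant"}

class PeerToPeerLobby:
    """One player's machine is the server; whoever hosts decides the rules."""
    def __init__(self, host_is_modded: bool):
        self.host_is_modded = host_is_modded

    def grant_item(self, player: str, item: str) -> bool:
        # This check runs on the host's own machine, so a modded host can skip it.
        if item in PAID_ITEMS and not self.host_is_modded:
            return False
        print(f"[P2P] {player} receives {item}")
        return True

class DedicatedServerLobby:
    """The publisher runs the server, so only purchases it recognizes count."""
    def __init__(self, purchases: dict):
        self.purchases = purchases  # player -> items bought through the official shop

    def grant_item(self, player: str, item: str) -> bool:
        if item in PAID_ITEMS and item not in self.purchases.get(player, set()):
            return False            # no receipt, no item; players can't override this
        print(f"[server] {player} receives {item}")
        return True

# A modded P2P host hands out paid content freely; the dedicated server refuses.
PeerToPeerLobby(host_is_modded=True).grant_item("player1", "gold_camo")    # granted
DedicatedServerLobby(purchases={}).grant_item("player1", "gold_camo")      # refused
```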
Activision did not want gamers to have the ability to host their own servers for the games they purchased anymore, as they saw it as a potential way to lose money. As a result, with the release of Modern Warfare Remastered (2016) and Modern Warfare (2019), we lost the ability to host our own servers; Call of Duty had moved to a client-server architecture.
Call of Duty wasn’t the only Activision Blizzard title to receive this treatment in 2016. Overwatch 1 also released with a client-server architecture that prevented players from hosting their own community servers or P2P lobbies, because the dedicated server executables were never made available anywhere.
Both the consumer and the publisher viewed dedicated servers as a convenience. The consumer gave away their right to own the connection with their fellow gamers in exchange for better ping, but as a consequence:
- The consumer never objected to the publisher controlling who they get matched against.
- The consumer never objected to the publisher taking away the games they’ve purchased.
- The consumer never objected to the publisher making single-player games require an internet connection.
- The consumer never objected to the publisher forcing them to install a literal fucking rootkit on their system to play a video game.
It is too late to protest the terrible standards game developers are now forcing onto gamers. The damage is done.
Activision’s success in milking its consumers sent a clear message to all investors and higher-ups in the gaming industry: gamers are okay with not owning their games as long as there is some convenience exchanged in return.
Rotten Cores - Suits and Ties
Despite the negative backlash against Infinite Warfare, and despite it selling worse than Black Ops 3, Activision’s Q3 2016 revenue report still hinted that they hadn’t taken any financial losses. Investors and upper management saw this as an opportunity to take bolder risks in the transition to the live service model, and if they felt at all hesitant because most of Activision’s consumers have FIFA syndrome, No Man’s Sky reassured them: it released to mixed reception for missing an entire major component (multiplayer), only for that component to be added two years later to plenty of approval from gamers.
The gaming industry watched and studied every decision Activision made, so they could learn from the best money makers.
- If Activision forced their employees to crunch, then everyone else would overwork their employees too.
- If Activision hoarded all the space on your hard drive to stop you from playing other games, better reinvent compression, because the entirety of the gaming industry forgot about it.
- If Activision researched and patented a way to continue milking the consumer, you better believe there will be a patent arms race.
- If Activision’s CEO is looked upon as the most hated person in the entire world, start calling yourself John Riccitiello.
The core of what makes a game is the game engine and the game developer. If you start stripping all the humanity out of them to meet the demands of investors and shareholders, you’ll have a truly rotten core that is unsalvageable.
Activision is not secretive about the fact that their Call of Duty development cycle is only 3 years, while the average development time for triple-A games is 4 years. Every year they release a new Call of Duty game created by overworked employees across various studios; and with Call of Duty: Modern Warfare 3 (2023), that time was halved. Activision’s crunch culture has made many game studios suffer under unrealistic expectations from shareholders and upper management. If Activision can do it, then why can’t we?
This is why games like Cyberpunk 2077 launched in an unfinished, broken state. The corpos saw the game taking longer than what Activision and its copycats manage, and they demanded that the game be released that instant; maybe with a small apology and a day one patch to fix the issues later. Just like No Man’s Sky.
The people leading the gaming industry do not care about player satisfaction; as long as gamers keep opening their wallets for half-assed products, they will not change their attitudes. While I do believe Microsoft’s acquisition of Activision will most likely break up the mindless copying of everything Activision does, Activision has already handed the secret formula to the rest of the industry, and they will all keep abusing it until gamers start voting with their (closed) wallets.
As long as gamers continue giving their hard earned cash to companies that provide live service games, nothing will improve.
Convenience Exchanged
It is convenient to say that gaming is no longer fun, or that it isn’t what it used to be. But in reality, the only games affected by this are triple-A games that follow a live service model, and the feeling most of us have can be easily mitigated by simply looking beyond triple-A gaming.
Most gamers with FIFA syndrome do not know what an Undertale is; if you mentioned that game to them, they’d probably assume you’re using alternative slang for a booty bump. It is hard to convince others to let go of their convenient microtransactions and their established relationship with a game franchise. But remember, we got into this mess in the first place because we wanted convenience. So the best way to help others explore the realm of gaming beyond triple-A titles is to look into their needs and show them alternatives.
Alternatively, we can go the loud punk route. To make change against the system, we must exchange our own convenience to send a clear message: that we are against the industry’s standards.
Support indie games and create content around them. Learn Godot instead of Unreal/Unity. Become a loud and proud Linux gamer. Invite others to follow suit. Keep your wallet shut unless it’s for the small guy. Marry a nerdy feminine boy with a CS degree, have/adopt 6 children, and teach them all to run games inside a virtualized container.
If you’re cool about how you rebel against the standards of society, people will get interested in your cause. Some people might call you “annoying”, but if you think about it, you’re no more annoying than they are when they keep going on about their “convenient operating systems”. Bleh.