Serotonin: Failure to launch and its consequences

December 5, 2014


The cynical part of me rears its annoying head once in a while, reminding me video games are a business first and always. The industry has expanded since the NES launched in 1985, bringing us unmatched content for nearly every digital desire we could possibly imagine (within reason, holodecks are yet to come). People have become deservedly rich from creating, distributing and selling games and consoles.

Most of us are willing participants because we feel games are, mostly, a strong product. They can be played as many times as you wish, and many have multiple reasons to return. They can be discussed in great detail, played with others in the same room or even around the world. They are constantly evolving, bringing new ideas, challenges and artistic interpretations straight to our living rooms. They are truly a marvel, and we are lucky to be in a position to play games so often when others are much less fortunate.

When reading about recent big-budget game launches, I haven’t felt fortunate. I haven’t felt optimistic about how amazing games are these days. I’ve felt angry, irritated and frustrated, and rolled my eyes every time I read about how games with insane budgets are broken upon release. It violates our trust, pours a ton of criticism on the development team and sets a toxic precedent in the gaming community that has no place in 2014. Recent examples like Assassin’s Creed: Unity, Halo: The Master Chief Collection and Battlefield 4 are precisely the type of launches a developer needs to avoid, and that we need to stop tolerating.


As consumers of anything, not just games, we expect a certain level of quality. Brands and prices will affect how a product performs; we understand that. But the core level of functionality should be there, even when purchasing something new. A $5 watch should tell time, just as a $500 or $5,000 watch should. The more expensive the product, the higher our expectations. A $5,000 watch better do more than just tell time; it should have more functions, feel better on my wrist and look absolutely impressive. Otherwise, I could have just bought a $5 watch.

Big-budget franchises have a few advantages over new intellectual properties. There’s a certain level of quality expected – the previous Assassin’s Creed games, for example, looked amazing, so the next iteration should build upon that. We know what the gameplay is going to be, and Ubisoft has a massive marketing budget, so we’ll see a lot of previews and commercials, giving us a good idea of what to expect from the setting. Our expectations grow higher and we trust Ubisoft to deliver a certain level of quality and entertainment, particularly if we’re going to pay full price and purchase the game at launch, usually around $60.

It’s not a small amount of money – that’s about six movies, or three books, or a night out at a nice restaurant. It’s a conscious choice and sacrifice many gamers make. Buying Assassin’s Creed: Unity means I won’t be able to buy other things. The expectation is it will be worth it and the enjoyment received will far exceed $60. Many of the best games of our generation never even let us think about the money spent, as we are far too immersed in the world to think about such trivial things.


I shake my head as I type this, but far too many incomplete games have launched lately. I don’t mean the games are unplayable, but it’s rather obvious they aren’t ready. They needed more time. Why did my character just walk right through another character? This isn’t the first generation of 3D games; this is 2014. We’ve seen decades of visuals, gameplay, audio and presentation yield marvelous results, yet I see cutscenes in which a character’s entire face is missing. This isn’t an obscure bug, it’s in plain sight.

Why does that lead to anger? I couldn’t program my way out of a wet paper bag. I know how complex it must be to debug a modern game. There are so many variables that it makes my head spin. But it’s not my job, as a consumer, to empathize this way. I also know a $5,000 watch must require insane intricacy and precision to make. Or how much planning it must take an engineer to design an airplane safe for commercial use. Or how hard farmers must work to put fresh vegetables, grains, meat and fruit into the grocery store for me to eat each day.

But that’s not my job. It’s not my problem. That’s why I pay my hard-earned money. It’s a fair trade. Nobody is forcing me to buy that $5,000 watch, shop at that particular grocery store or buy Assassin’s Creed: Unity. The situation greatly dissatisfies me. Why should I trust Ubisoft again? It clearly released a game in need of more testing, more fine-tuning. But Ubisoft didn’t care. The powers that be decided the holiday season is far too lucrative a date to miss and the game must be out before then. It’s nowhere near as bad as E.T. for the Atari 2600, but that too was a rushed game, and it helped cripple the entire industry.


Many publishers feel they can get away with it, and that’s why they do it. It’s all right if Halo: The Master Chief Collection’s multiplayer is completely unplayable at launch, because it can be patched later. Consumers have already purchased the title, and very few will return it. They’re so eager to replay classic Halo multiplayer, they’ll begrudgingly forgive any shortcomings. Sales will continue to pour in. If there’s no incentive to ship a bug-free game, we’ll continue to see bug-filled games.

This isn’t a rare case. That’s the problem. It’s almost expected now that a game with any form of multiplayer, and even one without, will have problems preventing me from playing my product. The fundamental functionality isn’t there, and it’s unfair for consumers to foot the bill and be patient. If your game isn’t ready, wait. There must be enormous pressure to get games out for the holiday rush, but it’s not my job to empathize or rationalize.

Recently, EA’s Andrew Wilson had this to say about Battlefield 4’s unacceptable launch: “You could go down the really conservative path, which some people did in the industry, and your game didn’t have any of those problems, but you also got the feedback of, it just feels the same as it used to.” I feel like he’s totally missing the point. Games, movies, plays and other artistic mediums should push boundaries. They should grab us, rattle us around and attempt to change our perspectives of each other and the world we live in. They should take risks, leading to potentially exciting developments. If they fail, they’ll fail gloriously, and others will build upon what they’ve created.


But it’s not avoiding a conservative path if your game doesn’t function as it should. You want to create a new graphics engine? Go ahead. Try a new type of protagonist? Sounds great. Alter how each gun feels? How teams score points? The levels? The soundtrack? All good, keep going. But if I fall through your “new” level’s ground, or clip through a door, or can’t load the game properly, that’s not reaching for new heights. That’s building an eight-story building with only four sets of stairs.

It’s really all about immersion. Games are fun, and I want to keep having fun. Being reminded that they’re just products, and you’re just a consumer, isn’t fun. It damages our faith in our favorite hobby. Take all the risks you want; we’re willing to pay. But the developer has to know when it’s not ready. They all know it. They just don’t care, because they keep getting away with it. I will do my small part to ensure they don’t, and keep focusing my support on the games made by people who care, and care deeply.