The ability to download new content and patches for games has fundamentally changed the console gaming market over the last few years. New multiplayer maps, game modes, expansion packs, and even Oblivion’s horse armor have managed to burn a new hole in gamers’ pockets. For the most part, expanding games through downloads seems to be a very positive thing; if you don’t want something, you don’t have to buy it. Enthusiastic gamers get what they want, and developers still get compensated for their additional hard work.
However, as great as DLC may seem, it has a dark side.
In certain instances, it seems developers don’t put in the level of hard work necessary to produce a great game before its scheduled release. Next thing you know, you’re playing Alone in the Dark, all the while trying to resist the urge to yell at your Xbox for not pausing the game while you equip new items from your trench coat. Sound familiar? How about not being able to unlock certain armor in Halo 3 for a month? Or having Ninja Gaiden 2 freeze on you during cutscenes for no apparent reason? And let’s not forget about Bully: Scholarship Edition, which was dogged with technical issues for months.
And the issue doesn’t stop at consoles. Recent iPhone games have been particularly notorious for being broken. If you’ve managed to get every banana in Monkey Ball, then congratulations; those seven years of medical training to become a surgeon have paid off. How many of you have managed to play through Aurora Feint without a single graphical bug or low memory warning? None? Didn’t think so.
This brings us to our big question: why do developers put buggy, unpolished games out into the market? Developers like Bethesda, Ubisoft, and BioWare are known for taking their time and publishing games that are complete. There’s a reason you haven’t heard about Splinter Cell: Conviction in two years; it’s being taken back to square one. Rather than rush a broken game out the door and reap the potential profits from suckers like you and me, they’re taking their time and doing it right.
But it seems that far too often these days, developers are taking the “sell it now, patch it later” approach to making games. Alone in the Dark, for example, is planning to patch in new camera controls, a better inventory system, and new driving mechanics. This is great news for everyone who bought the game, but why on Earth would Atari wait until AFTER the game has gained such a poor reputation to fix it?
Just today, news came out that the Xbox 360 version of The Incredible Hulk would be getting a two-player multiplayer mode. Why now? Why not develop a game that’s feature-complete upon release? Surely developers realize the influence that enthusiast press reviews and Metacritic ratings have on gamers. Why not take advantage of that?
Last week, both Aurora Feint and Monkey Ball were updated; Monkey Ball now has features that should have been included at launch. Aurora Feint, meanwhile, seems to finally be stable, but blocks still freeze and many players have reported losing game saves from previous versions. Again, I have to ask: why now? Granted, Aurora Feint is free, and Monkey Ball is the best-selling iPhone game; but how many gamers decided to forgo the $10 download after reading about its difficulty?
Some may argue that DLC is great because it allows developers to fix problems; others, like me, see it as a curse for exactly the same reason. Developers should get things right on the first try, for our sake and theirs.
So readers, how much do we REALLY love the effects of DLC?