
The Game Plan

News & Features | December 9, 2013

According to a 2012 study by Magid Associates, 64% of Americans play video games. Let me put that into perspective: Wikipedia says that’s the same proportion of us who are overweight. It’s not everyone, but that’s what makes this next statistic even more significant: in 2012, video games ranked second in entertainment expenditure, behind only Internet and cable, which are counted together (no fair). So, even though we don’t all play video games, as a country we’re willing to spend money on them…a lot of money. $14.8 billion, in fact, was spent on game content in 2012.

Yet just five years ago, we spent $22 billion, before even adjusting for inflation. And the percentage of Americans who play games is inexplicably dropping by the year. This is despite a video game market made steadily more accessible by smartphones and social media applications; if you even sporadically play Candy Crush, Words With Friends, or Farmville, you count as a gamer by this definition. Still, marketing research firm NPD Group found that between 2011 and 2012, the number of people who reported at least occasionally playing video games dropped 5 percent, to an estimated 211 million.

In a feature article for Imagine Games Network, journalist Colin Campbell investigated why the missing 12 million gamers quit cold turkey. He entertained the possibility that the drop was entirely due to the 2011 decline of either Zynga’s Farmville Facebook application or Nintendo’s Wii console, but a single event is unlikely to have had such a great impact. While it’s impossible to know for sure, he eventually concluded that roughly 1 in 20 gamers were enamored with one game in particular, such as Farmville or Wii Fit, and instead of moving on to a different game when they tired of it, they abandoned gaming completely.
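(A quick back-of-envelope check, using only the figures above and assuming the 5 percent is measured against the 2011 total, shows how the “missing 12 million” follows from NPD’s numbers; the arithmetic here is mine, not Campbell’s or NPD’s.)

```latex
% Back-of-envelope: if 211 million gamers remain after a 5 percent decline,
% the prior-year base and the size of the drop follow directly.
\[
  \text{2011 base} \approx \frac{211 \text{ million}}{1 - 0.05} \approx 222 \text{ million},
  \qquad
  \text{drop} \approx 222 - 211 \approx 11\text{--}12 \text{ million}.
\]
```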

As Campbell explained, it’s also very difficult to collect accurate data about video games: although NPD Group surveyed 8,000 Americans, they did so online, and a recent Pew Research study found that 15 percent of Americans don’t use the Internet at all. Wouldn’t it make sense that these less technologically connected citizens also don’t play video games? But this is all speculation.

Maybe we can’t trust the numbers exactly, but the trends they show us are clear: the video game industry is facing an impending crisis. Even those gamers who do remain are gravitating away from the $500 Xbox One and $60 games from Best Buy, opting instead for free Internet games and mobile applications. It seems logical. We’ve entered an age in which, for video games, anything more than free seems expensive.

The issue is that we, as consumers, continually want newer and better games to play. For example, when Infinity Ward recently released Call of Duty: Ghosts, the tenth installment in the highly successful Call of Duty series, buyers expected something new, if only because the company had started a new story arc separate from Modern Warfare and Black Ops. When not much changed, people were extremely disappointed. Games in Asia reviewer C. Custer concluded, “Call of Duty: Ghosts is not a revolution, or even an evolution, for the Call of Duty series…. It’s hard to know what caused this—was it a genuine lack of inspiration? A cynical cash grab?—but for consumers, it doesn’t matter. Here’s the bottom line: …don’t buy Ghosts.”

In even more recent news, this holiday season the gaming community expects a showdown between the Xbox One and the PlayStation 4. Troy Wolverton, a game reviewer for the science website Phys.org, however, argued that any debate between the two is irrelevant, because neither shows real innovation on the level of the Wii or the Kinect, which pioneered motion-controlled and controller-free gaming. Furthermore, he said, “The new consoles, at least to my eye, just don’t seem as compelling as their predecessors. The PlayStation 3 and the Xbox 360 were the first game machines that could display high-definition games, and the PlayStation 3 was the first console with a built-in Blu-ray player.”

We feel entitled to constant improvements, but these cost money. How can companies keep updating games and consoles if no one is paying for them? The industry as a whole has already hit on one strategy: just as with Hollywood blockbusters, making a sequel to an already successful game guarantees money. In fact, all 20 of 2012’s top-selling console games belonged to existing franchises. There’s only one Angry Birds, but there are seven special editions and one spin-off, called Bad Piggies. Yet, as Call of Duty: Ghosts shows, simply releasing a new installment is not necessarily creative enough to please consumers.

This is not to say, however, that all of these follow-ups are unoriginal. The fifth edition in the Grand Theft Auto series, released in September, has been lauded for its innovations; GameSpot reviewer Carolyn Petit called it an “outrageous, exhilarating, sometimes troubling crime epic that pushes open-world game design forward in amazing ways.”

But as The Atlantic tech journalist Taylor Clark wrote, “It needs to be said: video games, with very few exceptions, are dumb. And they’re not just dumb in the gleeful, winking way that a big Hollywood movie is dumb; they’re dumb in the puerile, excruciatingly serious way that a grown man in latex elf ears reciting an epic poem about Gandalf is dumb. Aside from a handful of truly smart games, tentpole titles like The Elder Scrolls V: Skyrim and Call of Duty: Black Ops tend to be so silly and so poorly written that they make Michael Bay movies look like the Godfather series.” And, as in Hollywood, this dominance of sequels and spin-offs leaves little room for indie games to succeed in the mainstream.

People are looking to change that, though. Game designer Jonathan Blow struck it rich with his time-warp platform game Braid, despite having independently financed its $200,000 development. In 2008, its first year, Braid sold several hundred thousand copies through Microsoft’s Xbox Live Arcade service: a far cry from Wii Play’s industry-high 5.28 million that year, but still a coup for an indie game. More importantly, Braid, with its stunning graphics, ingeniously complex concept, and rich story complete with plot twist, proved to the gaming community (and the world) that video games could be more than dumb.

“I think the mainstream game industry is a fucked-up den of mediocrity,” Blow told The Atlantic. “There are some smart people wallowing in there, but the environment discourages creativity and strength and rigor, so what you get is mostly atrophy.”

Now, Blow is turning his significant fortune towards financing The Witness, a game that he hopes will spur the video game industry to start making games that can be taken seriously. A wealthy developer bankrolling the nontraditional works for one game at a time, but it’s not a realistic path to industry-wide change. The bald-crowned Blow may appear to be a new-age Daddy Warbucks, but we can’t expect him to swoop by the orphanage in his Tesla and singlehandedly save us from mediocrity. If we want to get the most out of video gaming, mainstream players’ preferences and engagement need to expand to match the direction the industry seems to be heading.

I’m not worried about the falling percentage of participation; gamers are still more than 200 million strong, and that provides a lot of space for different people with their own distinct preferences. Put away your preconception that the average video game player is a lazy college kid with a box of pizza at his side: according to the ESA, the average gamer is actually 30 years old, just five years younger than the national average. NPD adds that almost half of all Americans over 50 also play. And with 91 percent of children ages 2 to 17 playing games, the downward trend mentioned earlier is unlikely to continue.

And why should it? While video games may get a bad rap for their productivity-killing power, studies have found that they actually provide many learning, health, and social benefits. And, just as in the games, the control(ler) is in our hands. That power gives us a choice: either we play “dumb” games, or we call for something new. A change in the industry needs to happen, because otherwise we’re at a stalemate: without money there are no new games, and without new games, no one will spend money. I’m not saying we should all go out and pay $60 for Call of Duty, but I do know that driving this revolution requires us to come to terms with spending more than nothing on a game.

And the companies are making this transition easier for us; a new alternative to the traditional upfront flat fee is the micro-transaction, an optional small charge in an otherwise free game. Think of in-app purchases on your iPhone or gift cards for Farmville. But micro-transactions extend beyond indie games and mobile or social applications. While the multiplayer online battle arena League of Legends is free to join, its 32 million assassins, mages, and marksmen have the option to spend real-world money on in-game advantages.

If that change happens, falling video game spending won’t be a problem; it will simply be a fact of life. Indie games are, after all, less expensive than your average blockbuster: GTA V also runs $60, and you need the right console to play it. In contrast, a PC version of Braid can be purchased on eBay for $2.88. And if the transformation of the industry has already started, which seems likely, the seemingly inexplicable statistics make sense. As Eric Savitz wrote for Forbes, “The loss of some players at the margin suggests [sic] a maturing industry in transition.”

Indie video games like Braid and The Witness are different from what’s out there in the mass market, and these are just examples—the options for gamers outside the mainstream are virtually endless. Engaging the average gamer in a more diverse game pool could go even further, pulling in new players who previously thought video games weren’t fun or who abandoned gaming with Farmville. This would be beneficial both for the industry and for video game players themselves.

One particular company goes further, allowing customers to actively do good in the world. Humble Bundle sells PC game packages each week through a unique transaction process: the customer pays what they want, and that money is split between the developers and two previously selected charities. Pay less than $6 and you get access to just four of the six games; above that cut-off, the price is completely up to you. While the games are worth far more than a couple of lattes (this week, over $70), the average Humble Bundle customer paid just $3.83. Then again, some donated up to $300. Overall, as of December 1, 2013, the company has given more than $29 million to various charities, from the Red Cross to GamesAid. It’s not $14.8 billion, but it is definitely better for the world than repeatedly running over the same virtual pedestrians, and better for the market, because independent game designers get their products onto consumers’ laptops.
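To make that transaction process concrete, here is a minimal sketch, in Python, of a pay-what-you-want checkout with a bonus-game threshold and a buyer-chosen split. The game names, the default 50/50 split, and treating $6 as a hard unlock are illustrative assumptions on my part, not Humble Bundle’s actual implementation.

```python
# A toy model of a pay-what-you-want bundle purchase.
# Names, the $6 bonus threshold, and the default 50/50 split are
# illustrative assumptions, not Humble Bundle's real rules.

BONUS_THRESHOLD = 6.00          # paying at least this unlocks the extra games
BASE_GAMES = ["Game A", "Game B", "Game C", "Game D"]
BONUS_GAMES = ["Game E", "Game F"]

def checkout(amount_paid, charity_share=0.5):
    """Return the games unlocked and how the payment is divided.

    charity_share is the fraction the buyer directs to the charities;
    the remainder goes to the developers.
    """
    if amount_paid <= 0:
        raise ValueError("Pay-what-you-want still means paying more than nothing.")
    if not 0.0 <= charity_share <= 1.0:
        raise ValueError("charity_share must be between 0 and 1.")

    games = list(BASE_GAMES)
    if amount_paid >= BONUS_THRESHOLD:
        games += BONUS_GAMES

    to_charity = round(amount_paid * charity_share, 2)
    to_developers = round(amount_paid - to_charity, 2)
    return {"games": games, "to_charity": to_charity, "to_developers": to_developers}

# The article's average buyer: $3.83 unlocks four games; $6 or more unlocks all six.
print(checkout(3.83))
print(checkout(10.00, charity_share=0.3))
```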

Film critic Roger Ebert famously argued in 2008 that “video games can never be art” by the traditional definition used for music, paintings, books, film, and more, because you can win a game. He believed (along with Plato and Aristotle) that art is an interpretation of life with no objective or levels to achieve along the way. The video game industry is changing, though, and we’re beginning to recognize that, regardless of fine-art status, these digital media have become an integral part of the modern world and have the power to enact positive change.

And who knows? Someday, they may be considered art. But I’m not sure that fitting the traditional definition of art is the goal, or that arguing they do is a constructive use of our time. Games are different for a reason. Xbox World’s Michael Gapper wrote in an opinion piece for Computer and Video Games, “Games move too fast and any comprehensive definition of what games are is too vast for your argument to have meaning so long as you’re playing by the rules established by books, paintings, and movies. Ebert was right—by his terms games aren’t art, so let’s define our own.” For now, there’s no reason for video games not to be well-made, productive, and satisfying for the 64% of us, and probably more in the future, who are game to game.