*But not in the way you think.*
A lot of virtual and real ink has been spilled regarding the issue of violence in games somehow causing the players to become violent, or perhaps to become desensitized to violence. Most of the research has suggested the causal link between violence in games and violence in the real world is tenuous at best.
But this is not about that.
Jamie Madigan, whom I know only from my interactions on Broken Forum, is a Ph.D. psychologist specializing in the psychology of video games. He has a blog, and he has a podcast focused on the psychology of gaming. It's his very first podcast episode that I feel is well worth a listen: an interview with Dr. Andrew Przybylski of Oxford University. Dr. Przybylski specializes in violence and games, and the episode discusses his recent research.
One of the more fascinating bits was how games can cause people to act with greater hostility toward others. Curiously, the violent content of a game didn't seem to be the causal factor. Instead, people became angry and frustrated when they lacked competence in a game, particularly if the game actively inhibited the player's ability to become competent. The players then turned their pent-up frustration on other people.
We've all been through this. When you first fire up a new game, there's always a learning curve. Some games have simple learning curves; others are more complex. Some of the learning curves involve navigating the game itself, including the user interface and controls. In one of Przybylski's studies, the researchers modded Tetris so that the controls were unpredictable and counterintuitive. People playing this version became quite angry and, in subsequent tests, took that anger out on other people in various minor ways.
I've been hosting LAN parties in my basement lab for twenty years now. I've seen first-hand the kind of anger that terrible user interfaces and poor designs can create. We recently fired up Payday 2. Somewhere in Payday 2 is the core of a good game, but we'll never find out. After trying to succeed for over an hour, we collectively gave up. The lack of useful feedback, the somewhat obtuse controls (outside of simple navigation and shooting, which work like any other FPS), and the lack of partial progress (you either failed or succeeded at a mission) proved too much. And no small amount of anger and frustration was vented while we unsuccessfully tried to learn the game.
And that's just one game. My group, which has fluctuated in membership over the years, has dropped many games because of these issues. I've seen players get very heated when a game behaved unpredictably, whether through its controls, poor feedback, or incorrect information. I've certainly pounded my keyboard a few times myself.
It's also interesting when a game clicks, even though you'd think the same issues would show up. Let's take Ubisoft's The Division as an example. All of us are playing it, to the extent that one of us even bought a second copy of the game so he could have enough different characters to play regularly with more than four people. Did he express anger at Ubisoft's four-character limit? Nope, he ponied up another $60.
What's apparent is that it's not just an issue of competence, but of the competence curve. If a game's design allows the player to progress in bite-size chunks, then the player feels somewhat in control. Even if some aspects of gameplay seem mysterious, if the moment-to-moment interactions with the game allow the player to feel somewhat in control, the anger they might feel from helplessness doesn't bubble up.
Clearly Przybylski is onto something here. Game designers should take a closer look at issues of UI, player feedback, and predictable systems. Giving the player a sense of accomplishment and competence can soothe the savage beast that may lurk inside game players. It's not just the UI or feedback, though: designing the game's progression curve also affects attitudes toward the game.