I am fascinated by economic theory. Time is the ultimate resource, and what are we, then, if not economists dealing in the exchange of fun for time and money? I find there are many parallels to game design, and the tools and theories it provides are a great comfort to me. So when my good friend told me to read Predictably Irrational, I pounced on that book like a hungry hungry hippo.
The book is about behavioral economics, a relatively new field in the study of economics. It starts with the assumption that human beings are not as rational as traditional economics likes to think we are, and that in fact we are irrational to the point of being—wait for it—predictably irrational (herp). It goes on to detail several major points of irrationality, and then backs them up with field experiments that are almost as much fun to read about as they are edifying.
Of the myriad topics, the ones that stuck with me the most were, no surprise, the ones that provided clear insight into common game design problems. The topics of relativity, the power of zero, social versus market exchange, and the power of ethics and cheating all showed me new ways to think about common problems, and I’m sure there is even more insight just out of my mind’s eye, waiting to be tapped. But insight for insight’s sake is pointless, so it’s time to share my thoughts with you. Let’s start with how the value of things is not as simple as it appears…
It’s All Relative
The ultimate goal, when deciding between two options, is to pick the choice that brings the greatest utility, which means, in short, the choice that brings the most perceived happiness. According to economic theory, a rational human looks at their options, judges each on its merits, and then calculates the expected level of gained utility. Makes sense. But behavioral economics says “whoa there buddy, it ain’t so simple.” Human beings do not compare things in absolute terms, but in relative terms.
“…humans rarely choose things in absolute terms. We don’t have an internal value meter that tells us how much things are worth. Rather, we focus on the relative advantage of one thing over another, and estimate value accordingly.” – Dan Ariely
What does this mean? Well, if I want to sell you a [Hoojaz] for five dollars, chances are you have no idea if that is a good deal. Is a [Hoojaz] really worth five dollars? I don’t know. Now, what if I am offering a [Hoojaz + Whatsit] also for five dollars? You might think me a horrible businessman, but in fact I am a genius, as I have just given you a relative comparison. Before, you had no idea if a [Hoojaz] was worth it, but now you can’t stop thinking about what a steal it would be to get a [Hoojaz] AND a [Whatsit] for the same price.
This is all very interesting, though I admit it might be more interesting to someone running a business. There are, however, even more implications to relativity. Humans focus on comparing things in a relative way, but we also tend to fixate on comparing things in the easiest way possible, while doing our damnedest to avoid difficult comparisons. Let’s say I have given you the choice between these two pieces of gear:
Option [A], a sword that does an average of 7.5 damage per second, and Option [B], a piece of armor with a defense of 10. Which is better? Which do you choose? What you have here is a choice between two things that excel in entirely different categories, and human beings, unfortunately, are rubbish at comparing things like this. What we need is a way to give your brain a rest; to give it something to compare “easily”. Look what happens when I introduce a third option.
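If it helps to see the trick spelled out, here is a minimal sketch in Python; the stats, the decoy's numbers, and the "pick whichever comparison is easiest" shortcut are all invented for illustration, not a real decision model.

```python
# Toy sketch of the decoy effect: people latch onto options that share a
# dimension, and avoid the hard cross-category comparison entirely.

options = {
    "A":  {"type": "sword", "dps": 7.5},     # the option we want chosen
    "B":  {"type": "armor", "defense": 10},  # hard to compare against A
    "A-": {"type": "sword", "dps": 6.0},     # the decoy: clearly worse than A
}

def easiest_comparison(options):
    """Pick the winner of the easiest available comparison.

    This is the irrational shortcut: if two options share a category,
    compare those and ignore everything else. With no decoy, there is
    no easy comparison and the player is left deliberating.
    """
    by_type = {}
    for name, stats in options.items():
        by_type.setdefault(stats["type"], []).append(name)

    for names in by_type.values():
        if len(names) > 1:  # an easy, same-category comparison exists
            stat = next(k for k in options[names[0]] if k != "type")
            return max(names, key=lambda n: options[n][stat])
    return None  # no easy comparison: no nudge

print(easiest_comparison(options))                                           # -> "A"
print(easiest_comparison({k: v for k, v in options.items() if k != "A-"}))   # -> None
```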
I have now introduced a third option, [A-]. Its name, obviously, gives you a bias, but you can hopefully feel the effect this has had on you. (Mike, you sneaky shit.) Your brain now has something to easily compare. Look: even though it is totally irrational, put most people in this situation and they will now choose option A. You can do this with almost anything, and it’s called “using a decoy”. By introducing a “shitty” option that is easily comparable, you can nudge people toward the choice you want them to make, while still providing an option B for the min-maxers out there. This has great benefits for game design, so let’s look at some real world applications.
One of the great tricks in competitive multiplayer games, especially FPS ones, is to make the base options actually pretty good (comparatively). The base gun, let’s call it an AR15, might not be the BEST gun, but it should be better than most. This keeps the mastery curve from being too steep and ensures that new players are not at too much of a disadvantage. So it would be in our best interest to nudge new players into choosing the AR15 as their starting weapon. Enter behavioral economics! If you introduce a new weapon, let’s call it an AR14, that people can easily see is “like an AR15 but worse”, you will naturally nudge people into picking the AR15. Neat, huh? But maybe FPSs aren’t your thing, so let’s look at an RPG.
Have you ever played an RPG where you got to a boss that spams some kind of bullshit status effect, say poison, and all you can think to yourself is, “God damnit! There was some item back in town that makes me immune to this.” Dude… been there, done that, but let’s rewind. Back in town you enter the item store, and you are presented with three items: one makes you immune to poison, one makes you immune to petrification, and one makes you immune to poison and also increases your health. What’s better, immunity to poison or immunity to petrification? Debatable, and if the player already has an item that makes them immune to poison, that’s fine, but we (the designers) want to make sure the player has something for this boss fight. And, this is the important bit, we want to do it without forcing the option down their throat. By introducing this third item, and making sure we price it comparatively low, we can nudge people into purchasing poison immunity without them even knowing it. Like a great magician, the system designer has used misdirection to make the game better, and the player doesn’t even notice. Like a boss.
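To make that concrete, here is a rough sketch of what such a shop list might look like in Python; the item names, effects, and prices are entirely made up, and the nudge lives purely in how the third item is priced relative to the first.

```python
# Hypothetical shop inventory for the town before the poison-spamming boss.
# Goal: the player walks out with poison immunity without ever being told to buy it.

shop = [
    {"name": "Antitoxin Charm",  "effect": "immune to poison",        "price": 500},
    {"name": "Gorgon Ward",      "effect": "immune to petrification", "price": 500},
    # The nudge: strictly better than the Antitoxin Charm, and priced barely above it.
    # Next to this, the plain charm looks like a bad deal and the combo looks like a
    # steal. Either way, the player leaves town poison-proof.
    {"name": "Verdant Talisman", "effect": "immune to poison, +50 max HP", "price": 550},
]

for item in shop:
    print(f'{item["name"]:<16} {item["price"]:>4}g  ({item["effect"]})')
```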
Market Norms Versus Social Norms
We live in two worlds. In one world the social norms prevail. I do things for you, you do things for me, and neither of us really expects anything in return. The world of social norms is not governed by reciprocity, which means that even though I help you to move a couch I don’t expect you to immediately come over and help me move my couch. The world of market norms, however, is different. It is cold, impersonal, and everything has a price. Neither is better. They are just different, and they are governed by different rules. How does this apply to game design? Well, before we delve into that, let’s take a deeper look at how these two worlds function by looking at two experiments.
Experiment 1:
Participants were sat in front of a computer for five minutes and asked to drag a circle from the left side of the screen onto a square on the right side of the screen. When they did, the circle disappeared and was replaced by a new circle. Sound boring? That’s sorta the point. You see, the participants were broken up into three groups: group 1 was paid 50 cents, group 2 was paid 5 dollars, and group 3 was simply asked to participate (the social group). The basic assumption was that groups 1 and 2 would apply market norms to the situation, applying their labor in direct proportion to how much they were paid. This held true. Those who received 5 dollars dragged, on average, 159 circles, while the group that was paid 50 cents dragged 101 circles. But how do you think the last group, the social group, fared? They dragged 168 circles. What?!
Interesting, but here is where it gets more interesting. They tried the experiment again, only this time instead of giving people money, they gave them candy. Group 1 got a small Snickers bar; group 2, a box of chocolates; and group 3, nothing, like before. The important part here is that the two gifts (the Snickers bar and the chocolates) were worth the same as the earlier payments (50 cents and 5 dollars, respectively). So what do you think happened? The Snickers group dragged 162 circles, the chocolate group dragged 169 circles, and the social group dragged 168 circles. Crazy!
But it doesn’t stop there. They tried one more variation, and this time they wanted to see what would happen if they just mentioned money. Like the previous experiment, they used gifts. But this time, instead of calling them a Snickers bar and a box of chocolates, they called them a “50 cent Snickers bar” and a “5 dollar box of chocolates”. Sure enough, the mere mention of money was enough to push people out of social norms, and they acted as if they had simply been paid. Fascinating! Where am I going with this? Just wait, let’s look at experiment number 2.
Experiment 2:
The next experiment wanted to see if merely getting people to think about money would be enough to change how they behave. Participants were asked to do what is called the “scrambled-sentence task”, where they rearrange sets of words to form sentences. For one group, the task was based on neutral sentences like “It’s cold outside”. For the second group, it was based on sentences related to money, like “High paying salary.”
Following this test, the participants were given a second task, where they had to arrange 12 disks into a square. Before leaving, the experimenter told them to ask for help if they needed it. The true test was to see how long it would take people to ask for help. The group that had worked on money-related sentences waited about 5 minutes before asking for help, while the other group waited only 3 minutes. The mere implantation of the concept of money made one group more self-reliant than the other. Even more interesting, though, was that this “market norm” group was not only less willing to ask for help, but also less willing to help others:
“Overall, the participants in the “salary” group showed many of the characteristics of the market: they were more selfish and self-reliant; they wanted to spend more time alone; they were more likely to select tasks that required individual input rather than teamwork; and when they were deciding where they wanted to sit, they chose seats further away from whomever they were told to work with.” – Dan Ariely
Ok, that’s interesting, but how does it apply to game design? Jeez, stop pestering me, I’m getting to it!
What I am trying to show here is the connection between teamwork and thinking in terms of the market. If you make people think about money, they will stop behaving in a social manner and start behaving in a selfish one. Additionally, it is my belief that concepts like experience points (EXP) are, to players, no different from money. If you tell a player that a task grants her 2000 experience, she will immediately (even if she doesn’t realize it) go into market mode. This has very important implications for competitive multiplayer games today.
Game designers keep putting more teamwork-oriented competitive modes into our games, and the average person keeps running around like a self-reliant jackass. Is it any wonder? Every time he gets a headshot he sees a nice little +5. Every time he captures a flag he sees +100. Every time the round is over he sees numbers all over the damn place. He might even get pissed if he didn’t earn enough! Who cares if he won; he wants his MONEY. All he can think about is what benefits him, and how much money he is earning for his time! Do you see what this means?
I realize I am being heretical here, but if you want people to work together, then you cannot make them think in market terms. You must keep them thinking in social terms. First, stop constantly spamming the screen with how much experience I am making. Not only is it muddying up my screen with useless drivel, it is also forcing me to think about everything in terms of money. Second, stop expressing everything in terms of money (read: experience), and try expressing things in terms of gifts. Remember, to the participants in the experiment, a Snickers bar and a box of chocolates were just gifts, and even though a rational person knows they are worth the same as money, to the irrational human they registered as gifts, not payment.
Consider a game where your actions earn badges: things that convey absolutely no observable market value. Behind the scenes these badges have value, but to the player they are just gifts. At the end of a round of play you could see what badges you earned, and again, though values are being tallied behind the scenes, as far as the player is concerned, he has received gifts. The point is to hide their monetary value! As SOON as you say how much experience they are worth, like when the researchers referred to the candy as a “50 cent Snickers bar”, people will go into market mode.
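As a rough sketch of the idea (the badge names, values, and functions here are placeholders, not a real reward API), the split might look like this:

```python
# Sketch of a reward layer that keeps players in "social" mode: the player only
# ever sees badges; the XP ledger stays behind the scenes and is never shown.
# Badge names and values are placeholders.

BADGE_VALUES = {  # hidden from the player
    "Sharpshooter": 500,
    "Flag Runner": 750,
    "Team Medic": 900,
}

def end_of_round_screen(earned_badges):
    """What the player sees: gifts, never numbers."""
    return "You earned: " + ", ".join(earned_badges)

def tally_progression(earned_badges):
    """What the backend sees: the real XP total, never surfaced in the UI."""
    return sum(BADGE_VALUES[b] for b in earned_badges)

badges = ["Sharpshooter", "Team Medic"]
print(end_of_round_screen(badges))   # You earned: Sharpshooter, Team Medic
xp = tally_progression(badges)       # 1400, logged server-side, never shown
```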
I posit that if you strove toward this ideal of keeping players in a social context rather than a market exchange, you would see a marked improvement in overall teamwork between random individuals. But what about people that just like to cheat and be dicks? Hmm, let’s see what behavioral economics has to say.
Ethically Irrational
One of the more interesting topics in Predictably Irrational was ethics and cheating. Through a series of experiments, they showed that when given the freedom and opportunity to cheat, people will take it. (Duh.) But where things got interesting was when they attempted to prevent it. They took their original cheating experiment and ran it exactly as before, but this time they split people into two groups: group 1 was asked to write down 10 books they had read recently, while group 2 was asked to write down as many of the 10 Commandments as they could remember.
Shockingly, even when given the opportunity to cheat, group 2 cheated far less than group 1. So they ran the test again, and this time they asked one of the groups to sign a document stating that they would follow the “MIT student honor system.” Not nearly as intimidating as the 10 Commandments, but amazingly it had the same effect. The simple act of stimulating the moral centers of our brain, as long as it was within close proximity to the opportunity to cheat, helped to curb (and in some cases eliminate) the cheating.
This latter case is interesting to me, because it implies that it is not necessarily the specifics of the source that matter, merely that it activates the “honor” section of our brains. Does that mean, then, that you could create the same kind of effect in a game? Let’s say you are playing a competitive multiplayer game set in a sci-fi future. What if, before each round, while the game is loading, the commander in chief read aloud the “Star Rangers Oath”, which champions honesty and integrity? Would this be enough? Maybe, maybe not. But what if, in order to launch your fighter, you had to “say” the oath yourself? Now we are getting somewhere, and I think, at least for some people, this might make a difference. This understanding of ethics is important, because the section in Predictably Irrational on cheating delves deeper, and the implications get scarier.
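Purely as a thought experiment, a sketch of that oath gate might look something like this; the oath text, the function name, and the launch flow are all invented:

```python
# Sketch of an "oath gate": the moral prime has to sit right next to the
# opportunity to cheat, so it runs immediately before the player spawns.
# The oath text and the launch flow are invented for illustration.

STAR_RANGERS_OATH = "I will fly with honesty and integrity."

def request_launch(typed_oath: str) -> bool:
    """Clear the fighter for launch only after the player restates the oath.

    Typing it out (rather than clicking OK) is the point: the experiments
    suggest that actively recalling a moral code is what curbs cheating.
    """
    return typed_oath.strip().lower() == STAR_RANGERS_OATH.lower()

print(request_launch("I will fly with honesty and integrity."))  # True: cleared to launch
print(request_launch("whatever, just spawn me"))                 # False: recite it, Ranger
```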
People cheat, but only so much. In all of the experiments, though they would make it gradually easier and easier for people to cheat, and the danger of getting caught would gradually diminish, the amount of cheating stayed consistent. People would only cheat so much. It seemed there was some limit before their moral centers would automatically kick in, and this was shown to be true no matter what they did. Everything, that is, until they began to separate people from the concept of money.
In all of the experiments, people were being paid money based on how many questions they said they got right on a test: in this case, 50 cents per correct answer. In a new experiment, they ran the test as before, but this time they added a new group. Instead of getting 50 cents for every correct answer, this group received “tokens”; they would then take those tokens, walk 12 feet across the room, and exchange them for 50 cents apiece. Should this make a difference? If we were rational beings, of course not. Nothing has changed. But would you believe that adding this one step between the cheating and the concept of money more than doubled the amount of cheating? We are crazy!
The further we get from the concept of money, and the more impersonal things get, the more we act like complete dbags. Shocking? Maybe not; as anyone who has spent more than 5 minutes playing shooters online will tell you, cheaters gunna cheat. And there is almost nothing more impersonal, or more removed from the concept of money, than games over the internet. Our understanding of human nature is truly infantile, and behavioral economics is only taking baby steps, but anything that helps make play experiences better is surely worth investigating.
The Power Of Zero
Last, I’d like to take a moment to talk to the indies out there. Hey, what’s up? Man, it is pretty rough out there, right? Listen: as you are probably aware, it is not just about understanding game design; you are also responsible for understanding the business side of things, and when it comes to selling your product, be aware that zero is not just another price. It is an emotionally charged heartstring, and you should definitely be pulling it. People will, quite irrationally, totally ignore their own notions of utility as soon as you introduce the option of free. We’ve all been there. Who hasn’t come home from E3 with a giant bag of CRAP, and then immediately thought: “why do I have all this junk?”
“Zero is not just another discount. Zero is a different place. The difference between two cents and one cent is small. But the difference between one cent and free is huge! If you are in business, and understand that, you can do some marvelous things. Want to draw a crowd? Make something FREE! Want to sell more products? Make part of the purchase FREE!” – Dan Ariely
Snap, there it is right there. This doesn’t mean, by the way, that you have to give away your product for free. For example, let’s take the concepts we’ve discussed, and apply them to selling your game. Consider what would happen if you ran a deal on your website like this:
- Your Game: $10
- Your Game + Another Game: $30
- Your Game + Another Game + Hand Drawn Map: $30
Do you see what I’ve done here? For a second, remove the third option. 10 dollars for one game, or 30 dollars for two games. Given a choice like that, most people would choose to spend 10 dollars for one game. They will weigh the potential for loss and judge that 30 dollars is too great a risk. Now, what happens when you reintroduce the third option? Holy shit, free map?! Not only have I made use of the power of zero, but I have also made use of the power of relativity. A rational person would see what you are doing here, but people are not rational. We are suckers for zero. It is an effective means of making you a good bit of dough, and not nearly enough people understand this. Plus, people get a free map! Who the hell doesn’t like maps?
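If you want to see the trick as data, here is a minimal sketch of those tiers in Python; the labels and prices come from the example above, and the point is simply that the decoy tier and the "free map" tier share a price:

```python
# The storefront tiers from above, as data. The third tier costs the same as the
# second; the map's price of zero is doing all of the persuasive work.

tiers = [
    {"label": "Your Game", "price": 10},
    {"label": "Your Game + Another Game", "price": 30},                   # the decoy
    {"label": "Your Game + Another Game + Hand Drawn Map", "price": 30},  # "free" map
]

for tier in tiers:
    print(f'{tier["label"]:<45} ${tier["price"]}')

# Anyone comparing the last two tiers gets an easy, lopsided comparison: same
# price, plus a FREE map. That is the zero effect and the decoy effect at once.
```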
Conclusion
Well, this post ended up being a lot longer than I intended, but even with all of these words, there is still so much to write about. I wish I could include more practical applications of the stuff in this book, but I’m already pushing that TLDR boundary (if I haven’t already crossed it!).
Point being, I love this kind of stuff. DON’T YOU! (Stares intently at you).
This book is wonderful (thanks Mike D), and I am sorry it took me so long to read it – but read it I did! By the second chapter I knew it had earned its place on my “Books Every Designer Must Read” list, and truly, every designer must read it. Why the hell are you doing this job if you aren’t reading books like this? Like seriously. Get off your fluffy duff and read the shit out of this book.
Do it.