Imagine I’m opening a restaurant.

I lease a prime location, buy high quality equipment, and bring aboard some experienced (and expensive) chefs. But instead of letting my chefs build out the menu, I give that responsibility to a small team of Meal Designers.

These people are passionate about food; they love to eat it, and to talk about it. They are familiar with current food trends, and regularly try new dishes from nice restaurants around the city. In fact, they’re a lot like the chefs. The difference is that they don’t actually make food, and plenty of them don’t think it’s important to know how.

This causes a number of problems. For a start, they have a hard time communicating to the chefs how the elements of their meals should taste, smell, and feel. They also have a hard time articulating what they like, or what needs to change. Dishes are worked out by a long process of iteration, which is made more difficult because cooking involves making tradeoffs, and changing the salt level (for example) might affect the texture.

They also don’t have a complete understanding of what sorts of things are hard to make, and which are easy. Or which things can be cranked out in big batches by a line cook, and which must be done to order by a more skilled worker. Or which techniques are “forgiving” in a hectic kitchen, and which require strict attention to come out properly (reducing overall kitchen bandwidth). Or what can be done readily with existing kitchen equipment, and what would require costly upgrades to do efficiently. Or the alternatives that might be available for a dish, to get a similar result with lower food cost, less labor, fewer expensive specialists, shorter cooking time, or less risk. You get the idea.

By the time opening day rolls around, the meal designers will generally be pretty satisfied with what they’ve got, though the menu will be shorter than they had hoped. The kitchen will have been trashed a bit by the elaborate cooking procedures that evolved in response to design changes as the dishes were tuned. Everyone will be a bit burned out. And most people will wish they could go back and start over, given all that they had learned, and do things a bit differently.

That’s crazy, right? Well, that’s how a lot of big budget games are made. In fact, as projects get larger, the hierarchy that emerges makes it more likely to be true. And in some places, the line between designer and manager is a blurry one, which puts the designers in charge of running the kitchen too.

Now, before my game designer friends get all huffed up, I’m not trying to pick on you, and I have to point out that this goes both ways. A chef can learn every technique under the sun, but still not have the taste or imagination to create new food. Plenty of developers spend all their time repairing small pieces of a big machine, but have no idea how to make “fun” happen. How empty would it be to spend your career building guitars, but never learn to play music? Code doesn’t make video games, any more than wood can make songs. Those things are just tools… the good stuff only happens when they move.

However, I do think that if you want to design games, you have a responsibility to understand how they work, and be at least able to cobble together prototypes independently. This is your medium, after all. You don’t have to be an elite hacker. The important thing is to have some clay under your nails. Designing systems is hard, and getting good at it requires a lot of practice and failure. You need to be able to do that for yourself, or you’re going to waste a lot of time. Trust me, you can do it.

Good games are art machines. They are sensitive living systems. They are huge, twisted robots with shiny faces, made out of clicks and taps and a little bit of luck. If we want to make great ones, we all need to embrace both the craft and the art in what we do, and whether you’re a developer or designer, there is plenty of both required.

Anyway, that’s what I think.
