The single most dangerous type of programmer you can possibly have on a project is one who has a knack for Getting Things Done.
We all know the type of person I’m talking about – they’re the ones who can be pointed at any random bit of the project, given a description of the problem, and a day later it’ll be fixed. Oh, there might be the odd knock-on bug, sure, but that’s to be expected when someone is working in unknown territory, right? The important thing is that the bug is gone or the feature is in and we can move on.
The bug is gone. The feature is in. And the code… is a little, tiny, but very significant bit more crap than it used to be.
It probably won’t be something huge. It’ll be a little change somewhere where a function call that used to have no side effects now has some. Or a global variable which stores a bit of state outside the system that it’s part of. Or an assumption that no-one will ever try and spawn more than <n> of a certain type of enemy.
Nothing to worry about. If it becomes a problem we can fix it then, right?
But these things are like rust. They are corrosive, and once a few small flaws have crept in, they spread until you’re wondering why half of the data is in globals called things like g_temp_hack_dont_blame_me or your getButtonPressed() function is playing sound effects (yes, these are real examples).
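To make the rust concrete, here’s a minimal sketch of what that real-life `getButtonPressed()` might have looked like. Everything in it is invented for illustration (the sound "system" is just a log of filenames), but it shows the shape of the corrosion: a function that reads like a harmless query now mutates state outside itself.

```cpp
#include <cassert>
#include <string>
#include <vector>

// Hypothetical sketch -- names and behaviour invented for illustration.
// A global smuggled in "just for now": state living outside the system
// that is supposed to own it.
static bool g_temp_hack_dont_blame_me = false;

// Stand-in for the audio system, so the side effect is observable.
static std::vector<std::string> g_sound_log;

// Looks like a pure query about input state...
bool getButtonPressed(int button) {
    bool pressed = (button == 0);  // pretend button 0 is held down
    if (pressed) {
        g_sound_log.push_back("click.wav");  // ...but it plays a sound effect
        g_temp_hack_dont_blame_me = true;    // ...and pokes a global, too
    }
    return pressed;
}
```

The trap is that every caller now has to know the secret: poll the same button in two places per frame and the click plays twice. The original author’s assumption – that querying input is free and repeatable – is silently invalid, and nothing in the function’s name warns you.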
The person who wrote the system (hopefully) had a clear mental picture, or even better, documentation, of what they were trying to achieve, how the bits fitted together, how as-yet-unbuilt future expansion would be done.
The person who added the new feature did not.
And that’s where it gets scary. Suddenly all the assumptions about that system become invalid, and everyone, even the original programmer, has to tread with great caution to avoid the land mines littering the code. No-one can tell if a given design pattern was part of the original plan, or a sort of software version of those dodgy patio/conservatory extensions. And soon, the whole area becomes a disaster zone that no-one wants to touch. Nuking the site from orbit becomes not only desirable, but The Only Way To Be Sure.
So, the problem here is clearly that the person making the changes didn’t take the time to understand the system, or read the documentation, or ask the person who wrote it. We all do it sometimes, occasionally for good reasons but usually for bad ones. A tendency to do that a lot makes a person dangerous… but they still don’t inspire “run, don’t walk, to the emergency exits” levels of terror. They can still be reformed, brought back into peaceful society.
What makes for a real Bond Villain level of dangerous, though, is when that person is also good at it. They don’t just go in and flail around wildly in the wilderness of foreign code until someone comes and fishes them out again. They chop and hack and hack and then when they’re done it works. Mostly. For now.
And we all know that “working” is another word for “done”, don’t we? Managers, and in particular non-technical ones, have long since grasped this elementary principle. When something is working, that means you can tick it off and move on. So someone who is good at ticking things off on a list, especially things that everyone else said had to wait until the person who wrote the code got back from holiday, is management’s New Best Friend.
And so in the interests of ticking things off, New Best Friend will be deployed more and more frequently on such missions, bursting into subsystems with the ruthless efficiency of a Navy SEAL, and a similar amount of discretion and tact. If this happens four days before gold master, this might well save the project. If it happens four months before, it might well kill it.
And so, I postulate that the most important thing in engineering a game well is, to paraphrase Lord Blackadder:
If you want something done properly, kill anyone who might Get Things Done before you start.
(on an unrelated note, this horse I’m sitting on seems really quite unnecessarily high and I’m starting to get vertigo. Can I get down now, please?)