More than 20 years ago, Tim Berners-Lee had a vision. Today, terms like HTML, HTTP and URL could be used by my grandmother to buy groceries and get them delivered to her house. We have certainly come a long way, and to say that web technologies are being used everywhere is an understatement.
On a completely unrelated note, in software development we have this thing called a pipeline. A magical (sorry, sufficiently advanced) set of pipes where you throw a design document in one side and, after a while, a completed game drops out the other. If only it were that easy, right?
Okay then, let’s take those two unrelated concepts and put them in a blender. Will it blend? Read on to find out what happens next…
A paradigm shift
Traditional software development models have been around for ages. And when I say ages, I mean a few decades. At the beginning there was no need for a set of rules or components to build a game; teams were usually small and we were learning as we went. Experimentation and trial and error were the only norm. Once the need arose, it became obvious that a backbone was required to keep the machinery well oiled and to guarantee that a lot of people from dissimilar disciplines were getting along, both technically and productivity-wise.
Don’t get me wrong, pipelines are as old as software development, but as time went by they started to gain more importance. In gaming, we have inherited a whole set of procedures and methodologies that were never really tailored to our industry.
We keep building client-server software and all kinds of tools, set them all up on every machine, make sure everyone has the latest version, have the leads meet up and debate whether this tool works along with that one over there, decide which programmer to hire to handle all the integration hell, and deal with all kinds of problems we didn’t create but follow out of tradition.
How come the panorama is changing rapidly at every step of the way, from design and development to testing and community management, but not on the pipeline side?
Old methodologies are a dying breed. We need to adapt.
But, why should we use web technologies?
Because there are real and practical benefits. There is no more sound argument than that.
“And what are these benefits you speak of, stranger?”
The client is everywhere: Let’s start with the most obvious one. Nowadays, everyone has a web browser installed on their PCs, Macs, netbooks, mobile phones and even consoles. It doesn’t matter where you are or where you come from, you already have it. “So what?”, somebody says at the back. Well, just compare that to the fact that you otherwise need to keep several types of clients up to date, on different computers, in different buildings and sometimes in different parts of the world, where people may or may not listen to you. And even if you don’t have a browser installed, it’s just a few clicks and a few megabytes of download away.
Standardization: Of course, using web technologies doesn’t necessarily mean everyone is using the same protocols and tools. But fortunately there is a solid foundation and lots of proven software out there, with open standards that make it easy for any team to implement them, give or receive support from other people who may be using the same frameworks and even contribute back.
Maturity: Client- and server-side web technologies have reached a very respectable level of maturity. Take, for instance, everything we use every day: Facebook, Twitter, Amazon, Google, Reddit, whatever app you use, etc. At the core, big players with thousands of servers all around the world are trusting these technologies, and for good reason.
Client-server architecture out of the box: There has been plenty of talk about real-time editing software, collaborative environments, automated pipelines and up-to-date processes. All of that needs a client-server architecture, a multi-tiered model that has its challenges to build and maintain. Well, there you go: the web, and more specifically dynamic pages, is by definition multi-tiered. The client is in your browser; the storage and logic layers live on the web server. It has redundancy built in, it’s inherently more secure (arguable, of course!) and it’s built from the same software that powers the huge companies out there.
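To make that tier separation concrete, here is a minimal sketch in Python (all names and data are hypothetical, and a real tool would sit behind an actual web server): the logic layer is just code that maps a URL path to data pulled from the storage layer and returns it to the browser client.

```python
import json

# Hypothetical in-memory storage layer: a couple of game parameters.
ASSET_STORE = {
    "player_speed": 4.5,
    "level_count": 12,
}

def handle_request(path):
    """Logic layer: map a URL path such as '/assets/player_speed'
    to a JSON response, the way a dynamic page would."""
    parts = path.strip("/").split("/")
    if len(parts) == 2 and parts[0] == "assets" and parts[1] in ASSET_STORE:
        return 200, json.dumps({parts[1]: ASSET_STORE[parts[1]]})
    return 404, json.dumps({"error": "not found"})

status, body = handle_request("/assets/player_speed")
```

The point is that the client needs nothing but a browser; everything else, storage and logic alike, stays in one place on the server.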
Ease of deployment: The automatic client-server model comes with a great bonus. Right now, there are plenty of games that use a patching mechanism of some sort, on Xbox, PlayStation, Wii, PC, Steam, the App Store, the Marketplace, you name it. Deployment of changes has never been easy, but think about the next time you log into that content editing tool: you have the latest version, and the next time, and the next. Provided, naturally, that somebody on the other side actually updated the system.
Okay, then what for?
We have all these technologies at our disposal. So specifically, why would we benefit from integrating them into our pipelines?
Content editing: Not many projects require a dynamic content editor, but the ones that do would benefit from having a centralized tool, where people can see in real time (with some Ajax and polling mechanisms) what everyone else is doing, get the latest updates and easily modify content in just one place. Access control is straightforward, and the perks that come with centralization are clear: logging, easier support and backups, and the ability to expand easily to multiple departments.
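The server side of such a polling mechanism can be sketched in a few lines of Python. This is only an illustration with hypothetical level data, not a prescribed design: each poll, the endpoint compares the client’s last-seen snapshot against the current content and sends back only what changed.

```python
def poll_changes(last_seen, current):
    """Return the entries added or modified since the client's last
    poll; this is the payload a polling endpoint would send back."""
    return {
        key: value
        for key, value in current.items()
        if key not in last_seen or last_seen[key] != value
    }

# Hypothetical snapshots of shared level data between two polls.
before = {"spawn_point": (0, 0), "boss_hp": 500}
after = {"spawn_point": (3, 1), "boss_hp": 500, "music": "theme.ogg"}

changes = poll_changes(before, after)
```

The browser-side Ajax code would simply request this diff every few seconds and merge it into the editor’s view, which is how several people can watch each other’s edits appear in near real time.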
Builds: Programmers have been using version control for a while now, but they have also been using websites to see the latest version log, check the automatically generated documentation or grab the latest build. Some version control systems already use HTTP as part of their communication protocols. Tools can be easily written in PHP or Python to make tedious tasks less cumbersome: nightly build scripts that put the latest version on a website, along with the changelog and all those little details that usually end up locked away on one programmer’s machine over there.
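As a taste of what such a nightly script might look like (the build id and commit messages here are made up), a few lines of Python are enough to turn a changelog into an HTML fragment for the team’s internal site:

```python
import html
from datetime import date

def changelog_page(build_id, commits):
    """Render a minimal HTML fragment listing a nightly build and
    its commit messages, ready for the team's internal site."""
    items = "\n".join(f"  <li>{html.escape(msg)}</li>" for msg in commits)
    return (
        f"<h1>Nightly build {html.escape(build_id)}</h1>\n"
        f"<p>Generated {date.today().isoformat()}</p>\n"
        f"<ul>\n{items}\n</ul>"
    )

page = changelog_page("nightly-r3", ["Fix <AI> pathfinding", "Tune jump physics"])
```

Drop the output into a directory the web server already serves and everyone on the team can see last night’s build and what went into it, no emails and no shoulder-tapping required.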
Project Management: We now have Agile and as many methodologies as you can think of. They all benefit from being on the web, be that a private local network or the Internet. Producers and leads need to be on top of things faster than ever, and they don’t need to worry about calling the IT department to check on why the hell this software doesn’t work after another program replaced a bunch of DLL files.
QA & Testing: Mantis, Bugzilla, many proprietary solutions come to mind, and many more open source ones. QA needs its bugs reported fast, and the programmers, designers, artists and leads need to know quickly what they have to fix to reach that elusive deadline. With the renaissance of virtual servers alongside cloud computing, it’s now easier than ever to spin up testing environments in a jiffy, then shut them down to start others and do stress testing. Big games and development teams benefit greatly from this.
Marketing and social media: This one hit us so hard and fast that we never saw it coming. Everyone knows what Twitter, Facebook, Reddit and Google+ (among others) are, right? And in the words of Gary Oldman, I mean, “EVERYONEEE!”. This gives us the opportunity to reach our audiences faster than ever and, just as important, vice versa. With web technologies it’s now easier to connect internal development tools to the outside world, through tools and APIs or via automated scripts that disclose information about the game and even the development team.
IT Administration: IT administration has always been an important but underappreciated part; not only do you have to deal with out-of-date and buggy software but, to top it all, with actual people! With the automated dispatching nature of the web, those days are over, and IT admins can concentrate on improving the system. Admittedly, this sounds like wishful thinking, but at least it seems like a step in the right direction.
Everything but the kitchen sink: I think it’s safe to say that many other components of the pipeline can be improved by these technologies. You can have better localization tools, better front-ends for most of the tasks involved in the process, and immediate reach to developers you previously didn’t have access to. Even a small thing like an internal blog could be just what the team needed to strengthen communication.
You always gotta have cons: There are still some areas that are complicated to translate to the web, mostly those that involve high-performance real-time 3D graphics (or even 2D) and compute-intensive tasks such as physics, encoding and rendering. Flash is widespread but not a performance champion in 2D; there is hope with HTML5 and WebGL, but we are not there yet. Of course, you can build your own browser plugin and put your renderer in there, but then again, that is not without its pitfalls.
And oh, there is this little detail: unless you are building your own web software platform, you depend on other people to create a good browser, a good web server or any other part involved. And, it goes without saying, that is absolutely out of your control.
So, what now?
Oh look, we are already there!
The best thing about all of this is that we are already there! Yes sir, indeed. I would be surprised if I threw a rock into the air and didn’t hit a company that’s using some technology mentioned above. But not everyone is using it at the core, maybe because they are paying for a license, or because there isn’t enough time or resources to improve this non-critical part. But is it non-critical?
The faster we keep moving toward this model, the more flexibility there will be, and the easier it will become to learn from other people’s experiences while they learn from ours. Last but not least, it’s worth mentioning that there are serious productivity benefits, giving developers more precious time to do what they do best. Which is what pipelines are all about.