This is hopefully going to be a two part post because I didn’t get to fully explore it as much as I wanted to, as I’ve been losing my mind during another eventful week of final exams in college. I considered postponing this and pulling something out of my ass for today… but then I saw fellow contributor Wheezie had already stolen that spot with his post “What Did I Do Today?” If you haven’t already, you should check it out to at least read the amazing comic that he put at the end of it. Especially because half of the time I don’t think people who aren’t designers understand what designers actually do. I mean, it’s pretty clear that graphics programmers spend all their time inventing new acronyms for anti-aliasing techniques except for when they’re tightening up the graphics on level 3, but designers? They’re a whole different mystery. Wait, what was I actually going to be talking about in this post? Particles, that’s right…

Particles

Particle systems are cheap, flexible, and easy to set up, especially if you’re using Unity3D, as is often the case with the projects happening at my University. This is why, on a lot of the projects I’ve been on (small student projects with tight deadlines), a designer often picks up the particle work to reduce the workload of other team members. This was the case with one such project that I was working on in the fall. That project was Dust.

As you might guess from the name, there’s a lot of dust in Dust. So much so that it takes place in a desert. Some of the designers on the project were tasked with helping to build the ambience for the game with some particle systems. An innocent enough task, but when it came time to integrate the work into our initial prototype, it was clear something was awry.

The problem with video game content creation is that people often have to learn the hard way how their work can impact performance. In this case, we wanted substantially more particles than anything other than a top-of-the-line computer could accommodate (like the one we had been developing on). And to be fair, a game like Dust should have as many particles as we can manage without blowing perf. One of the biggest performance costs of large numbers of particles is overdraw, especially when the particles fill large sections of the screen:

Massive overdraw from dust particles

This is a shot of the Unity editor’s overdraw visualizer. The bright areas are spots where the same pixels are being drawn over and over again. This is a particular issue with particles because a system contains many overlapping quads, so every pixel underneath them gets shaded many times. Keep in mind that the given shot is from the current version of Dust; the systems were originally much, much heavier on overdraw.
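To put a rough number on it, the average overdraw factor is just the total fragments rasterized divided by the number of screen pixels. Here’s a back-of-the-envelope sketch — the particle counts and coverage numbers are made up for illustration, not measurements from Dust:

```python
# Back-of-the-envelope overdraw estimate for a particle system.
# All numbers here are hypothetical, not taken from Dust.

def overdraw_factor(num_particles, avg_particle_coverage, screen_pixels):
    """Average number of times each screen pixel is shaded.

    avg_particle_coverage: fraction of the screen one particle quad covers.
    """
    fragments = num_particles * avg_particle_coverage * screen_pixels
    return fragments / screen_pixels  # == num_particles * avg_particle_coverage

# 500 particles each covering ~2% of a 1280x720 screen:
print(overdraw_factor(500, 0.02, 1280 * 720))  # -> 10.0 shades per pixel on average
```

Even with modest numbers like these, every pixel ends up shaded ten times on average, which is why big, screen-filling particle systems hurt so much.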

Doing it Offscreen

A solution to this particular consequence of particle effects is presented in the fabulous GPU Gems 3, in Iain Cantlay’s chapter “High-Speed, Off-Screen Particles.” The technique boils down to reducing the number of pixels being rasterized by rendering the particles to a lower-resolution texture that is then composited back into the main image. The particles are rendered after the depth buffer has been formed, so pixels that fail the depth test can be discarded properly as they’re drawn. This means the color can be applied directly back into the scene, which is especially easy if your particles are additive like ours were.
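Here’s a minimal sketch of that composite step, assuming purely additive particles: the low-res particle buffer is upsampled back to screen resolution and simply added onto the frame. This is illustrative NumPy, not Unity code, and the function names are mine:

```python
import numpy as np

# Sketch of compositing an additive low-res particle buffer back into the
# full-res frame. Illustrative only -- a real implementation would do this
# with an additive-blend fullscreen quad on the GPU.

def composite_additive(scene, particle_rt, scale):
    """scene: (H, W, 3) float image; particle_rt: (H/scale, W/scale, 3)."""
    # Nearest-neighbor upsample of the particle render target.
    upsampled = np.kron(particle_rt, np.ones((scale, scale, 1)))
    return np.clip(scene + upsampled, 0.0, 1.0)

scene = np.zeros((4, 4, 3))
particles = np.full((2, 2, 3), 0.25)   # half-res particle render target
out = composite_additive(scene, particles, 2)
print(out[0, 0])  # -> [0.25 0.25 0.25]
```

The nice thing about additive particles is that order and alpha don’t matter for the composite; with alpha-blended particles you’d have to track accumulated alpha in the off-screen target as well.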

I started considering it as a possible solution to allow for thicker particles in future iterations of the project when I read through Shawn White’s post on his implementation in Unity for Off Road Raptor Safari HD. It didn’t take too long for me to adapt my own implementation a week or so ago, but several issues come to light very quickly (many of which are discussed in the GPU Gems 3 chapter).

Problems / Solutions

Zoomed in view of cracks from point sampled depth

Some of the issues that I immediately encountered were visible halos between solid objects (notably the boat that the player controls in Dust) and the particles after compositing. One solution is to change how the depth buffer is down-sampled to the low-res target: instead of point sampling, take the minimum or maximum of the full-res depths each low-res texel covers. As noted in the Gems article, this is really just a rule of thumb, but it did indeed fix the cracks.
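A minimal sketch of the “farthest depth” variant of that heuristic, assuming a 2x downsample (NumPy for illustration; in practice this happens in a shader when building the low-res depth target):

```python
import numpy as np

# "Farthest depth" downsampling heuristic: each low-res texel takes the
# max (you could equally take the min) of the 2x2 full-res depths it
# covers, instead of point-sampling one of them.

def downsample_depth_max(depth):
    """depth: (H, W) depth array with H and W even. Returns (H/2, W/2)."""
    h, w = depth.shape
    blocks = depth.reshape(h // 2, 2, w // 2, 2)
    return blocks.max(axis=(1, 3))

d = np.array([[0.1, 0.2],
              [0.9, 0.3]])
print(downsample_depth_max(d))  # -> [[0.9]]
```

Point sampling would have picked an arbitrary one of the four depths, which is what produces the cracks along silhouettes; consistently biasing toward the far (or near) depth makes the depth test err in one predictable direction instead.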

Zoomed in view of depth farthest heuristic

However, it made apparent a much more serious issue: the aliasing that occurs because the buffer is at a lower resolution. It’s very obvious along the edge of the sail in Dust. My goal was to get the particles rendering at quarter res, but right now only half res comes close to acceptable quality. One thing I could do is render the edges at full res in yet another pass, but the big question in my mind is whether two passes for particles would still result in a net performance increase. We already have to create a depth buffer specifically for the off-screen particles; since we don’t do deferred rendering, it isn’t used for much else.
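If I do try that mixed-resolution route, one plausible way to pick the edge pixels is to flag low-res texels whose underlying full-res depths disagree by more than some threshold — those are the texels straddling a silhouette. Again, an illustrative sketch under my own assumptions, not anything from the Dust codebase:

```python
import numpy as np

# Flag low-res texels that straddle a depth discontinuity: those are the
# candidates for a full-res "edge fix-up" pass. Threshold is arbitrary here.

def depth_edge_mask(depth, threshold=0.05):
    """depth: (H, W) full-res depth with H and W even.
    Returns an (H/2, W/2) bool mask of texels crossing a depth edge."""
    h, w = depth.shape
    blocks = depth.reshape(h // 2, 2, w // 2, 2)
    spread = blocks.max(axis=(1, 3)) - blocks.min(axis=(1, 3))
    return spread > threshold

d = np.array([[0.10, 0.10, 0.10, 0.90],
              [0.10, 0.10, 0.10, 0.90]])
print(depth_edge_mask(d))  # -> [[False  True]]
```

The open question from above still applies: stenciling in a full-res pass over these texels only pays off if the edge area is a small fraction of the screen.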

Next Time

Hopefully by the time my next post rolls around (and I’m done with finals), I’ll have some more results (and perf numbers) and can give some insight into whether or not we decide to actually use it in the game. On the plus side, we modified Dust’s design recently to be more separated, so we can potentially enable the technique only where the particles are prevalent enough to warrant its use. Also, I’ll try to set up some fresh test scenes that show the effects better, considering that upsampling the in-game shots went horribly.