[This article can also be read on my development blog].

A good way to get an authentic look for retro pixel art is to simulate the distortion caused by encoding the image into an NTSC signal, decoding it again (as a TV would), and projecting it onto a virtual CRT. This gives you natural-looking artifacts, like fringing and color bleeding.

It also makes copyrighted hedgehogs look even more dashing.

Tube simulator example

Console emulators do this sometimes, and if you’re old enough to have actually played games on a CRT TV, it really helps with the sense of immersion. This post gives a quick overview of the process, in case you’d like to try it for yourself. All of these steps are texture operations performed by pixel shaders.

  1. We start by encoding the low-resolution input image as an NTSC signal. Each input line is converted into voltage over time, in the same format an NTSC signal would be sent across a wire (except for the sync and colorburst stuff). A rough sketch of this step follows the list.
  2. A “cable reflection” shader smears the signal out a little to the right. I’m not sure how much it looks like cable reflection, but it does kind of evoke the streaking artifacts you see on some old TVs.
  3. The luma is split out of the signal, and then used in the NTSC decoding process. This is also where the standard OSD parameters (brightness, contrast, sharpness, etc.) are applied. Now our image is RGB again.
  4. The image is projected onto a curved tube. This step also takes care of tracing the scan lines and applying the phosphor pattern.
  5. The phosphors from the previous frame are decayed, and the new values are accumulated. This allows for ghosting of moving images (there's a small sketch of the decay-and-accumulate idea after the list, too).
  6. A standard post-processing stack is applied (bloom, glare, and tone mapping). This gives users a taste of the eye-burning glow produced by a real CRT. (Do you remember when staying up late to play games caused physical pain? Kids these days are soft.)
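To make step 1 a bit more concrete, here's a rough numpy sketch of the idea (the real thing is a pixel shader; the constants and function names below are mine): convert RGB to YIQ, then build the composite signal as luma plus chroma quadrature-modulated onto the color subcarrier.

```python
import numpy as np

# Approximate NTSC constants; the effect's own values are tuned by hand.
SUBCARRIER_HZ = 3.579545e6   # color subcarrier frequency
LINE_RATE_HZ = 15734.0       # horizontal line rate

def rgb_to_yiq(rgb):
    """Convert an Nx3 array of RGB samples to YIQ."""
    m = np.array([[0.299,  0.587,  0.114],
                  [0.596, -0.274, -0.322],
                  [0.211, -0.523,  0.312]])
    return rgb @ m.T

def encode_scanline(rgb_line, samples_per_pixel=8):
    """Turn one row of pixels into a composite signal: luma plus chroma
    quadrature-modulated onto the subcarrier (no sync, no colorburst)."""
    yiq = np.repeat(rgb_to_yiq(rgb_line), samples_per_pixel, axis=0)
    n = len(yiq)
    # Subcarrier phase across the row (treating it as one full line period).
    t = np.arange(n) / n * (SUBCARRIER_HZ / LINE_RATE_HZ) * 2.0 * np.pi
    y, i, q = yiq[:, 0], yiq[:, 1], yiq[:, 2]
    return y + i * np.cos(t) + q * np.sin(t)
```

Decoding (step 3) is essentially the reverse: once the luma is filtered out, multiplying what's left by the same two carriers recovers I and Q, which then go back to RGB.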
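The decay-and-accumulate part of step 5 is simpler than it sounds; in miniature it's just this (the decay constant here is purely illustrative):

```python
def accumulate_phosphors(previous, excitation, decay=0.35):
    """Fade the previous frame's phosphor energy and add the newly traced
    frame on top. `decay` is the fraction that survives one frame; larger
    values mean longer ghosting trails."""
    return previous * decay + excitation
```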

Naturally, there are a few problems.

Moire

AUGH, TEH MOIRE.

Moire artifacts

The combination of the scanlines and the phosphor texture makes a little bit of moire pretty much unavoidable. Tuning the brightness, contrast, bloom, scaling, or NTSC parameters can produce moire, or move it from one place to another. The good news is that it looks far worse in screen shots than it does in a running game.

(I cranked it up for the screen shot above… it’s not that ghastly under normal circumstances).

Crosstalk

Crosstalk between chroma and luma is a key part of the effect, so it's a feature, not a bug. The problem is getting it to look bad in a “good” way.

A band-stop FIR filter can be used to chop out the chroma signal, but it’s tough to find the right balance between soft (filter too wide) and stripy (filter too narrow).
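For what it's worth, here's roughly what such a filter looks like in scipy terms. The tap count, stop-band width, and sample rate below are illustrative, not the values the effect actually uses:

```python
from scipy.signal import firwin

def chroma_notch(num_taps=15, width_hz=1.0e6, sample_rate_hz=4 * 3.579545e6):
    """Band-stop FIR kernel centered on the NTSC color subcarrier.
    A wide stop band scrubs chroma out of the luma but softens the image;
    a narrow one leaves subcarrier energy behind, which shows up as stripes."""
    fc = 3.579545e6  # subcarrier frequency
    band = [fc - width_hz / 2.0, fc + width_hz / 2.0]
    return firwin(num_taps, band, fs=sample_rate_hz, pass_zero='bandstop')
```

Convolving an encoded scanline with that kernel gives you the luma; subtracting the result from the composite leaves the chroma.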

Tuning crosstalk

Phosphor resolution

My original goal was to have RGB phosphors visible in the image when you examined it up close. That’s really hard, because if you make the phosphors small enough to look realistic, the RGB pattern blurs out and all you can see is a sort of vertical striping. If you don’t mipmap or supersample the phosphors, you again get more moire.
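For reference, supersampling the mask amounts to something like this: average a few jittered taps of the tiling phosphor texture over each output pixel's footprint instead of point-sampling it once (the names and parameters here are made up for illustration):

```python
import numpy as np

def sample_phosphor_mask(mask, x, y, texels_per_pixel, samples=4):
    """Average several jittered taps of the tiling phosphor mask across one
    output pixel's footprint. Point-sampling a pattern whose period is close
    to the pixel grid is what creates the moire; averaging the footprint
    approximates what a mipmap would give you."""
    h, w = mask.shape[:2]
    rng = np.random.default_rng(0)
    taps = rng.random((samples, 2)) * texels_per_pixel  # jitter inside the footprint
    acc = 0.0
    for dx, dy in taps:
        acc += mask[int(y + dy) % h, int(x + dx) % w]
    return acc / samples
```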

Phosphor patterns at different scales

These examples have the phosphor texture enlarged and strengthened to exaggerate the problem. I wasn't able to get individual phosphors to look good at 720p, and they are only barely tolerable at 1080p.

Balance

Cranking all these techniques to the max gives you a delightfully bad video signal, which is also no fun to look at for more than 10 seconds. There is also a fair bit of interplay between the parameters: adjusting the scanline gap changes the brightness, and so on. Tuning all this can be a touchy process.
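As one example of that interplay, a wider scanline gap lowers the beam profile's average, which darkens the whole image unless something compensates. A made-up Gaussian-ish profile (not the one the shader actually uses) shows the effect:

```python
import numpy as np

def scanline_weight(frac, gap=0.3):
    """Beam intensity across one scanline; `frac` is the vertical position
    within the line (0..1), and a larger `gap` widens the dark band."""
    sigma = (1.0 - gap) * 0.25
    return np.exp(-((frac - 0.5) ** 2) / (2.0 * sigma ** 2))

# Widening the gap lowers the profile's mean, so one compensating trick is
# to divide the output by that mean.
fracs = np.linspace(0.0, 1.0, 64)
gain = 1.0 / np.mean(scanline_weight(fracs, gap=0.3))
```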

Below are the parameters that control the effect. As you can see, there are a lot of ways I can screw things up.

Effect tuning parameters

Up next

In future posts I’ll get into the technical details and show you what the shaders look like. If you’re interested, go to pureenergygames.com and subscribe to the RSS feed!