This post is going to be much shorter as I’ve sadly been very pressed for time these last couple of weeks and as a result been unable to finish the other (longer) post I had originally planned to the high standard that you all deserve. This post is about measuring stuff in frames per second and why you shouldn’t do it.

Many people who haven’t got any experience in profiling and optimising a game will approach performance measurements in a frankly scary way. I’m talking, of course, about measuring performance in terms of frame rate. Of course it’s unfair to say the games industry never mentions ‘frame rate’, we do! We have target frame rates for our games, usually 60Hz or 30Hz (increasingly the latter), but for anything beyond this I think it’s fair to say that most sane people take the leap into a different, more logical and intuitive space. It’s called frame time and it’s measured in, you guessed it, units of time! Typical units include ms, µs or even ns. The two have a simple relationship which lets you convert between them: frame time t = 1/r, where r is the frame rate.
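If you want to sanity-check that conversion yourself, here’s a minimal sketch in plain C++; nothing engine-specific, and the helper names are just mine for illustration:

```cpp
#include <cstdio>

// Frame time in milliseconds for a given frame rate (t = 1/r, scaled to ms).
double frameTimeMs(double framesPerSecond) { return 1000.0 / framesPerSecond; }

// Frame rate for a given frame time in milliseconds (r = 1/t).
double frameRate(double frameTimeMilliseconds) { return 1000.0 / frameTimeMilliseconds; }

int main()
{
    std::printf("60 fps  = %.2f ms per frame\n", frameTimeMs(60.0));  // 16.67 ms
    std::printf("30 fps  = %.2f ms per frame\n", frameTimeMs(30.0));  // 33.33 ms
    std::printf("16.67 ms per frame = %.1f fps\n", frameRate(16.67)); // ~60 fps
    return 0;
}
```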

So why am I concerned about this if it’s trivial to convert between them? Well, the reason is that measuring anything in terms of frame rate suffers from some big flaws which measuring in frame time does not, and these can distort your view of the situation:

1. Traditional frame rate measurements are often dependent on synchronisation with a hardware-driven event such as the vertical reset on your display device. If you’ve locked your application to this vertical sync to avoid frame tearing, then no matter how fast your application is actually going it will always appear to run at, say, 60fps. If your application is straddling the 60fps border and locked to vsync, this gives rise to occasions where small fluctuations in the amount of time something takes can make the game appear to run either twice as fast or twice as slow (the first sketch after this list shows this in action)!

2. Frame rate can be very deceptive! To show what I mean, let’s use an example that crops up time and time again among beginners. It goes as follows:

Coder A: “Check this shit out, my game is running at 2000fps! I’m such a ninja!”
Coder B: “Nice one chief, but you’re accidentally frustum culling the player there…”
Coder A: “Ah shit, okay… Let me fix that… Sorted!”
Coder A hits F5 and looks on, truly horrified, at the frame rate counter.
Coder B: “Heh! Not so ninja now, are you? Running at 300fps!”
Coder A slopes off looking dejected.

What happened here? Apparently performance has dropped drastically by a whopping 1700 frames per second to a mere three hundred! However, if this pair of idiots worked in a set of sensible units they’d instantly see that the player mesh only costs roughly 2.8ms to render (the second sketch after this list works through the numbers). Small drops in frame rate in an application with an already low frame rate can point to big performance problems, whereas large drops in an application with a very high frame rate are usually not much to worry about.

3. And perhaps the most compelling reason, which should give any remaining doubters a firm nudge back into reality: all the wonderful profiling tools out there work in time, not rate.
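To make point 1 concrete, here’s a rough sketch of what a vsync lock does to your measurements. It assumes a 60Hz display and simple double buffering (no triple buffering or adaptive sync), and presentedFrameTimeMs is purely an illustrative helper, not any real API:

```cpp
#include <cmath>
#include <cstdio>

// With vsync on, a frame can only be presented on a refresh boundary, so the
// measured frame time snaps up to the next multiple of the refresh interval.
double presentedFrameTimeMs(double workMs, double refreshIntervalMs = 1000.0 / 60.0)
{
    return std::ceil(workMs / refreshIntervalMs) * refreshIntervalMs;
}

int main()
{
    // A frame that just fits the 16.67 ms budget reads as 60 fps...
    std::printf("16.0 ms of work -> %.2f ms (%.0f fps)\n",
                presentedFrameTimeMs(16.0), 1000.0 / presentedFrameTimeMs(16.0));

    // ...but 1 ms more and the counter suddenly reports 30 fps, twice as "slow".
    std::printf("17.0 ms of work -> %.2f ms (%.0f fps)\n",
                presentedFrameTimeMs(17.0), 1000.0 / presentedFrameTimeMs(17.0));
    return 0;
}
```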
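And for point 2, the same trivial conversion shows why the 2000fps panic is misplaced. The first figures are the ones from the exchange above; the 60-to-50fps comparison is just my own extra example of a “small” drop that costs more real time:

```cpp
#include <cstdio>

// Frame time in milliseconds for a given frame rate.
double frameTimeMs(double fps) { return 1000.0 / fps; }

int main()
{
    // Coder A's "catastrophe": 2000 fps down to 300 fps.
    double before = frameTimeMs(2000.0); // 0.50 ms
    double after  = frameTimeMs(300.0);  // 3.33 ms
    std::printf("Player mesh costs %.2f ms\n", after - before); // ~2.83 ms

    // A "small" 10 fps drop at a low frame rate is worse in real time:
    std::printf("60 -> 50 fps costs %.2f ms\n",
                frameTimeMs(50.0) - frameTimeMs(60.0)); // ~3.33 ms
    return 0;
}
```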

At Bizarre, the Stig (among others, such as Hulk Hogan and Austin Powers) was reserved for anyone that broke the build. In honour of Bizarre, which sadly closed last Friday, I offer these final Stig-inspired words to bring this post to a close: next time you even contemplate telling your colleague that your code runs at a particular frame rate, remember that what you’re saying is more backwards than Jeremy Clarkson telling you that the Stig got the Ariel Atom 500 round the Top Gear track at 37.54993342 metres per second!

This post is reproduced over at my personal blog, which you can find here.