Today I want to talk about the baseline load times for games on consoles. I’m going to dive right in with the equations here, so be prepared.

One of the useful metrics for understanding game load times is RAM fill time. It’s more-or-less true that most console games tend to fill the available RAM entirely with very little left over. There are exceptions, of course, but it’s not bad as a general rule.

Similarly, we can more-or-less assume that a game can’t really start until RAM is basically full. There are plenty of exceptions here too which I won’t get into, but again – not a bad first order approximation.

To the math-mobile!

So: For a given hardware platform, what is the minimum time possible to fill RAM? The answer:

t_fill = s_RAM / v_fill

where:

t_fill = minimum time to fill RAM
s_RAM = size of RAM to fill
v_fill = data load speed

In plain English, the time it takes to fill a given amount of RAM is equal to the size of the RAM divided by the speed at which you can fill it.
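As a quick sanity check, here’s that formula in code. (The 448 MiB and 40 MB/s figures are just example values pulled from later in the article.)

```python
MIB = 1024 * 1024   # 1 MiB = 1,048,576 bytes
MB = 1_000_000      # 1 MB  = 1,000,000 bytes (bandwidth convention)

def fill_time_seconds(ram_bytes: float, fill_rate_bytes_per_s: float) -> float:
    """t_fill = s_RAM / v_fill: time to fill RAM at a given effective load rate."""
    return ram_bytes / fill_rate_bytes_per_s

# Example: 448 MiB of RAM filled at 40 MB/s.
t = fill_time_seconds(448 * MIB, 40 * MB)
print(f"{t:.1f} s")  # prints "11.7 s"
```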

You may have noticed that I subtly weaseled a bit there. Why did I use v_fill instead of, say, v_disk? Because of data compression! Many games keep their data compressed on disk and then decompress it into RAM. So the effective rate at which you can load data depends quite a lot on your compression ratio.

v_fill = v_disk / r_compression

where:

v_disk = speed of the disk
r_compression = compression ratio = s_compressed / s_uncompressed
s_compressed = size of data after compression
s_uncompressed = size of data before compression

In plain English, the speed at which you can load data is the speed of the disk divided by the compression ratio of the data.
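In code, the compression adjustment is a one-liner. (The 0.6 ratio is the conservative figure used later in this article.)

```python
def effective_fill_rate(disk_rate: float, compression_ratio: float) -> float:
    """v_fill = v_disk / r_compression, where the ratio is
    compressed size / uncompressed size (so smaller is better)."""
    return disk_rate / compression_ratio

print(effective_fill_rate(40.0, 0.6))  # ~66.7 MB/s effective
print(effective_fill_rate(40.0, 1.0))  # 40.0 MB/s (uncompressed)
```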

By the way, I’m also presuming here that your data-loading subsystem is optimized for fast loading: basically that you’ve chosen a data compression algorithm which is not CPU-bound, and you overlap reading with decompression.

Putting values to some of the unknowns

There are only three real unknowns here: r_compression, v_disk, and s_RAM.

Let’s start with compression. In general, I’ve found that game data of this generation tends to have a compression ratio of about 50% to 60%. There isn’t a recent published corpus of data on this that I’m aware of, so you’ll just have to take my word on it. We’ll be conservative and look at two compression ratios: 60% and 100% (uncompressed).

r_compression = 0.60 or 1.0

The other two parameters vary depending on the console, the drive on which the data resides, and often even the location on the disc.

s_RAM is fairly straightforward: it’s known to be 448 MiB on the PS3 and 480 MiB on the Xbox 360. Thanks, Wikipedia!

v_disk requires a bit of research. For simplicity’s sake we’ll look at the original hardware for each unit, since those are typically the minimum-spec devices that set the standard. Both the first Xbox 360s and the first PS3s used Seagate LD25.1 hard disks (source), and public benchmarks are available for those which place the min/max throughput at about 20–40 MB/s. The PS3 uses a 2x CLV Blu-ray drive, which is defined to be a constant 72 Mbps = 9 MB/s. The Xbox 360 uses a 12x CAV DVD drive, which ranges from 5x DVD speed (about 6.93 MB/s) at the inner diameter to 12x (about 16.6 MB/s) at the outer.

(Know your units! Since we’re talking about bandwidth, in this article I’m using “MB” to mean 1 million bytes, and “MiB” to mean 1024*1024 = 1048576 bytes.)
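That ~5% gap between MB and MiB matters when you divide a MiB-denominated RAM size by a MB/s rate, so it’s worth keeping the constants explicit:

```python
MB = 1_000_000       # decimal megabyte (bandwidth convention)
MIB = 1024 * 1024    # binary mebibyte (RAM size convention)

# 448 MiB expressed in decimal MB:
print(448 * MIB / MB)  # 469.762048
```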

Putting it all together

Let’s take all of that and build a table.

Console   s_RAM    Drive       Data location   v_disk     r_compression  v_fill     t_fill
PS3       448 MiB  HDD         outer diameter  40 MB/s    0.6            67 MB/s    7.0s
PS3       448 MiB  HDD         outer diameter  40 MB/s    1.0            40 MB/s    11s
PS3       448 MiB  HDD         inner diameter  20 MB/s    0.6            33 MB/s    14s
PS3       448 MiB  HDD         inner diameter  20 MB/s    1.0            20 MB/s    23s
PS3       448 MiB  2x Blu-ray  whole disc      9 MB/s     0.6            15 MB/s    30s
PS3       448 MiB  2x Blu-ray  whole disc      9 MB/s     1.0            9 MB/s     51s
Xbox 360  480 MiB  HDD         outer diameter  40 MB/s    0.6            67 MB/s    7.4s
Xbox 360  480 MiB  HDD         outer diameter  40 MB/s    1.0            40 MB/s    12s
Xbox 360  480 MiB  HDD         inner diameter  20 MB/s    0.6            33 MB/s    15s
Xbox 360  480 MiB  HDD         inner diameter  20 MB/s    1.0            20 MB/s    25s
Xbox 360  480 MiB  12x DVD     outer diameter  16.6 MB/s  0.6            28 MB/s    18s
Xbox 360  480 MiB  12x DVD     outer diameter  16.6 MB/s  1.0            16.6 MB/s  30s
Xbox 360  480 MiB  12x DVD     inner diameter  6.93 MB/s  0.6            11.5 MB/s  42s
Xbox 360  480 MiB  12x DVD     inner diameter  6.93 MB/s  1.0            6.93 MB/s  71s

Or, here’s the same data graphically. Shorter bars are better.
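If you want to play with the assumptions, the whole table can be regenerated with a short script. (A sketch; the drive speeds are the ones quoted above, and recomputed times may differ from the table by a second or two due to rounding.)

```python
MIB = 1024 * 1024
MB = 1_000_000

# (console, ram_bytes, drive, location, disk_rate in MB/s)
rows = [
    ("PS3",      448 * MIB, "HDD",        "outer", 40.0),
    ("PS3",      448 * MIB, "HDD",        "inner", 20.0),
    ("PS3",      448 * MIB, "2x Blu-ray", "-",     9.0),
    ("Xbox 360", 480 * MIB, "HDD",        "outer", 40.0),
    ("Xbox 360", 480 * MIB, "HDD",        "inner", 20.0),
    ("Xbox 360", 480 * MIB, "12x DVD",    "outer", 16.6),
    ("Xbox 360", 480 * MIB, "12x DVD",    "inner", 6.93),
]

for console, ram, drive, loc, v_disk in rows:
    for r in (0.6, 1.0):
        v_fill = v_disk / r           # effective load rate, MB/s
        t_fill = ram / (v_fill * MB)  # seconds
        print(f"{console:8} {drive:10} {loc:5} r={r} "
              f"v_fill={v_fill:5.1f} MB/s  t_fill={t_fill:5.1f} s")
```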

Important caveats

The above numbers are for perfect loads, i.e. a linear load of completely sequential data.

In practice, game loads are rarely that perfect… so this is really only a first-order approximation of load times.

To get a more accurate approximation, you’d next want to estimate the average seek cost, and the average number of seeks per load. However, since that starts to creep out of the realm of publicly available data, I’m afraid I’ll have to leave that as an exercise for the reader. :-)
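To give a flavor of what that next step might look like, here’s the simplest possible seek-aware model. Note that the seek count and seek time below are made-up illustrative values, not measurements of any real drive.

```python
MIB, MB = 1024 * 1024, 1_000_000

def load_time_with_seeks(ram_bytes, fill_rate_bytes_per_s,
                         n_seeks, seek_time_s):
    """First-order model: linear fill time plus a flat cost per seek."""
    return ram_bytes / fill_rate_bytes_per_s + n_seeks * seek_time_s

# Hypothetical: 448 MiB at an effective 15 MB/s, with 200 seeks
# at 100 ms each (optical drive seeks are expensive).
t = load_time_with_seeks(448 * MIB, 15 * MB, 200, 0.100)
print(f"{t:.1f} s")  # prints "51.3 s"
```

Even with these toy numbers, 200 seeks add 20 seconds to a 31-second linear load, which is why disc layout matters so much.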

What good is all this, anyway?

So, yeah, that’s a whole bunch of numbers and a bunch of data. What was the point?

Well, this type of analysis can be very useful whenever you’re looking at a console for the first time. Maybe you’re bringing a game over from PC, or porting a game from X360 to PS3, or maybe you’re working on a launch title for a next-gen console of some kind. These kinds of analyses will give you a ballpark idea of how fast your game should be able to load.

It’s also useful when shipping cross-platform console games: you need to understand the strengths and weaknesses of each console, and what you can do to maximize throughput on each.

For example, you can see quite clearly in the data that naive loading of uncompressed data directly from optical disc is really quite hurtful on current consoles. On all consoles you should really be using compression. On Xbox 360 (where you can’t rely on the existence of a HDD) you’ll enjoy a big boost from organizing your DVD layout. On PS3, you should both compress your data, and prefetch it from optical disc to the HDD whenever possible.

Food for thought

For the past ten years or so, RAM costs have dropped much faster than HDD/BD/DVD speeds have increased. As a result, the time to fill RAM on PCs has been creeping upward. However, now we’ve got solid-state drives (SSDs), which are finally bucking the trend.

But the big problem with SSD for game consoles so far is cost: anything but the most basic SSD upgrade probably costs more than your entire game console.

What do you think will happen in the next console generation?