Nearly five years on the market, the PS3 has enjoyed strong success. It’s usually around this time that the internet begins to light up with various tidbits of rumour and speculation as to the nature of its successor. Having read various articles scattered around some of the news sites, I figured it would be cool to add my own flavour of speculation from a developer’s perspective (more in the “this is what I’d like to see” vein than “this is what I expect”).

Now it wouldn’t be fair to discuss a possible PS4 without first considering the hardware of the current system. From a developer’s perspective the PS3 is a rather exotic beast. Housing a Shader Model 3.0 era GPU and split memory pools, its real differentiating factor was the Cell Broadband Engine that made up the CPU. At its introduction to the market, I recall many developers were divided over the architecture inside the box, as it forced those battle-hardened types to break with tradition. Those who had come through two generations of console development, as well as PC developers making new in-roads into the console space, had to radically rethink their coding strategies.
As hardware vendors industry-wide had cottoned on to the same philosophy of increasing performance by leveraging parallelism, none took it as radically and as literally as the Cell. The heterogeneous nature of the PPU versus the SPUs caused a great deal of pain for developers, and many were quite vocal about this, including industry figureheads such as Carmack and Newell.

Over the years, however, as codebases (many rewritten from the ground up) matured, the effects of the “Cell mentality”, the way in which game code could be architected to exploit the new parallelism more effectively, could be seen increasingly across all platforms and areas of development. Following naturally from Data-Oriented Design methodologies, the division of labour into job-based and task-based dispatch systems benefited not only games running on PS3 but other architectures as well, as it brought extra benefits such as better cache coherency and data locality.

This leads me to think about the potential for PS4.

As video game code has grown over the years, so has the demand for greater computational capacity, mostly met through large-scale parallelism. CPUs will find it increasingly hard to offer faster execution simply by bumping clocks, as microprocessors converge on the upper limits of how fast they can actually go. Similarly, in the console space, power consumption and heat management will always be a concern for hardware designers, and shipping ever-larger console systems with bigger, more expensive cooling solutions isn’t a viable long-term answer.
Also, as consoles generally target decade lifespans and typically enter the market as loss leaders, the pressure on vendors to have extensive cost-reduction strategies in place from inception is huge, in order to swing the business model into long-term profitability. And with the advent of the Wii, which proved that you don’t strictly need the most cutting-edge hardware, sold at a loss per unit, to run a successful gaming hardware business, I believe this is something its competitors will be considering much more moving into the next hardware generation.

This means that in order to deliver the kind of revolutionary leap in hardware capability consumers typically expect, piling on the parallelism seems to be the best bet. How far Sony decide to go with this on PS4 is an interesting point to ponder, however, as there seem to be two general camps:

The first typically view the PS3’s Cell as a failure in many ways, touting the PS3 as “difficult to write code for” and generally pointing to some of its weaknesses as the cause: poor single-threaded performance on the PPU (stemming largely from the lack of out-of-order execution), the esoteric SPU programming model and so on. People in this camp would generally opt for a successor to the Cell that directly addressed these shortcomings. If Sony did go this route, one might expect a more full-fat, PC-like CPU on offer: six to eight homogeneous cores, wide SIMD vector units, out-of-order execution and a nice big fat cache hierarchy. This could be anything from another IBM PowerPC-based design to something completely different from the likes of AMD, Intel or even ARM (though that would likely have ramifications for backwards compatibility). Overall this would be generally accepted as an improvement performance-wise over the previous architecture, albeit a somewhat incremental one.

The second camp (in which I stand firmly rooted) see things a little differently. These folk generally view the Cell as a pioneering architectural design: a signpost pointing the way towards a future of video games hardware built for both speed and efficiency, one which (even though it requires more effort on the code side to get the most out of) can provide much greater gains in terms of what’s possible going forward. Given the exact same silicon/transistor budget and the same power envelope as the speculated chip above, this camp would expect PS4 to house a natural progression of the Cell concept, scaling gracefully towards its logical conclusion of massive parallelism and maximum throughput with a high raw-performance-per-watt ratio. One could expect a Cell-like design, again heterogeneous in nature but this time sporting a bigger PPU (or two), a larger cache, possibly out-of-order execution on the main core, and sharing its die with a much larger array of SPUs, 16-32 in total. Similar in nature to the design shown in an early Cell Broadband Engine roadmap by IBM (below):

[Image: IBM Cell Broadband Engine roadmap]

Some of the benefits of this option would include immediate support for backwards compatibility out of the box, a hardware solution both Sony’s internal and external developers are already familiar (and by now much more comfortable) with, and the luxury of letting many development studios re-use vast proportions of current/legacy code, getting up to speed on the new platform very quickly. The last point has become increasingly important in an age where publishers, developers and middleware vendors have invested thousands (and in some cases even millions) of dollars into their technology in order to make the most of PS3’s hardware capabilities, at times just to retain visual and performance parity across all supported platforms; see DICE’s Frostbite Engine 2 and Crytek’s CryEngine 3 presentations for some great examples of this.

It could also be some wider variation on this philosophy, possibly more akin to Intel’s now-cancelled Larrabee project, or even something else entirely.

Now I’ve deliberately left out any discussion of the GPU. This is mainly because GPU technology moves at a much faster pace and is typically more visible, so it wouldn’t be unreasonable to expect any solution from either NVidia or AMD to provide the kind of leap in capability expected of a next-generation GPU; even today’s desktop offerings already stand an order of magnitude more capable than the likes of Xenos and RSX.

Overall, taking these factors into account, it’s clear that the considerations companies such as Sony have to make leading into the next generation of video games console hardware are growing in depth and complexity, and it really isn’t as simple as attempting to ship the most powerful hardware you can at all costs, as it may have been in the past (sorry, Ken).
Considerations such as development costs, hardware cost-reduction strategies (there are only so many process die shrinks they can exploit south of 45nm without some radical shifts in semiconductor manufacturing tech), long-term profitability and technology ramp-up costs all play larger roles on the new stage, and it will be interesting to see which direction Sony ultimately go the day they make that fateful E3 announcement.

I hope these ramblings have been somewhat enlightening…