In the never-ending war between PC and console gamers, one of the PC side’s favorite points is that console hardware remains frustratingly static for years, while PC users can upgrade everything from the RAM to the graphics card as technology improves. So by the end of any given console generation (and sometimes earlier), a price-competitive PC will almost always be able to outperform its aging console competition.
This is true, as far as it goes. But as any console owner can tell you, unchanged hardware doesn’t mean graphics performance remains unchanged over the life of a console. On the contrary, over time, developers are often able to get more out of a console’s limited architecture than anyone ever thought possible when the system launched.
In the early days, new processors and memory chips embedded in the game cartridges themselves contributed to this evolution. More recently, it has been driven by developers who have had the time and experience to extract every last ounce of power from an architecture they know intimately.
As we look back nostalgically at how this intra-generational progression has played out in the past, keep in mind that the same process will most likely play out in the current console generation as well. In a few years, we’ll look back at even the impressive launch titles on the Xbox One and PS4 and wonder how we ever tolerated such low-quality visuals.