One correction here: whilst AMD's Jaguar cores aren't exactly known for their single-threaded performance, they are still significantly more powerful than their Xenon/Cell predecessors.
Both the PS3's and 360's CPUs were actually very forward-thinking, with a strong emphasis on multi-threaded performance. Single-threaded performance was very poor, relying almost entirely on the high clock speed to get anywhere. No out-of-order execution, limited cache and poor branch prediction (with none whatsoever on the PS3's SPE 'DSP' cores) were the highlights among the issues faced here.
Saying they are comparable to a Core Duo is giving them too much credit in many ways; I'd put them more in Pentium 4/D territory personally, and even that could be considered generous.
I'd say you're about right in your estimates of the PS4/Xbone, but don't put so much stock in the raw clock speed of the chip; that hasn't meant much for over a decade now (e.g. compare a top P4 EE, an AMD FX-9590 and an i7-4790K).
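As a rough back-of-the-envelope way to see why: single-threaded performance tracks IPC multiplied by clock, not clock alone. The clock speeds below are the real stock figures, but the relative IPC numbers are made-up placeholders purely to show the shape of the argument:

```python
# Rough single-thread proxy: performance ~ IPC x clock, not clock alone.
# Clock speeds are stock figures; the relative IPC values are hypothetical
# placeholders, only there to show why clock by itself is misleading.
chips = {
    "Pentium 4 EE (3.73 GHz)": {"clock_ghz": 3.73, "relative_ipc": 1.0},
    "AMD FX-9590 (4.7 GHz)":   {"clock_ghz": 4.7,  "relative_ipc": 1.6},
    "Core i7-4790K (4.0 GHz)": {"clock_ghz": 4.0,  "relative_ipc": 2.8},
}

for name, c in chips.items():
    # Higher clock on the P4 EE or FX-9590 doesn't save them once IPC is factored in.
    score = c["relative_ipc"] * c["clock_ghz"]
    print(f"{name}: ~{score:.1f} (arbitrary units)")
```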
This tangent is somewhat beside the point though, as once again I'll stress: the PS3 version of FFXIV was not CPU-limited in my eyes.
The market will be 'demanding' new-generation consoles long before even 8K checkerboard rendering becomes viable at a console-friendly price point.
I also don't really get your comments about 'true HDR'. At an overly basic level, HDR media is mastered to absolute levels, sometimes way beyond what existing displays can deliver. A typical modern HDR screen takes that data and uses tone mapping to display the media in a way that (hopefully) suits that screen's capabilities, the goal being that it doesn't simply clip or blow out everything that's beyond its reach. Rec.2020 is a similar story: even absurdly expensive studio-orientated reference monitors don't hit 99% coverage yet, but that doesn't stop it being used as a target standard despite most high-end HDR TVs falling around 70-80% coverage. Simply outputting it to a screen isn't problematic even if the screen isn't going to be able to show it in its entirety.
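As a toy illustration of the tone mapping side (real displays apply their own vendor-specific curves against the mastering metadata, so this is just the principle, not how any particular TV does it), here's a minimal extended-Reinhard style roll-off in Python:

```python
def tonemap_nits(scene_nits, display_peak=800.0, mastering_peak=10000.0):
    """Map absolute scene luminance (nits) into a display's range, rolling
    highlights off smoothly instead of hard-clipping everything above peak."""
    x = scene_nits / display_peak        # scene luminance relative to the panel's peak
    w = mastering_peak / display_peak    # the mastered maximum, which maps to panel white
    mapped = x * (1.0 + x / (w * w)) / (1.0 + x)
    return display_peak * min(mapped, 1.0)

# A 4000-nit highlight on a hypothetical 800-nit panel comes out below 800 nits
# rather than blowing out, while mid-tones (~100 nits) are only gently compressed.
for nits in (100, 1000, 4000, 10000):
    print(nits, "->", round(tonemap_nits(nits), 1))
```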
You're being too pessimistic here. GPU thermals have been worse in the past; I vividly remember my stock GTX 480 would merrily approach 100°C at full load and had no issues running like that all day long. The real issue with GPU workloads is simply that the game needs to actually use those resources. Compare a 1080 and a 1080 Ti running something easy-going such as CS:GO at 1080p and you'll barely see any improvement; both cards are simply sat around waiting for work. Switch to Far Cry 5 at 4K and it's a completely different story: both cards are GPU-limited, and thus there's enough load for the 1080 Ti's extra muscle to run away with it.
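A crude way to picture it: the frame rate is set by whichever of the CPU or GPU takes longer per frame, so a bigger GPU only shows up when the GPU is the bottleneck. The frame-time numbers here are invented purely for illustration, not benchmarks:

```python
# Toy bottleneck model: per-frame cost (milliseconds) on the CPU vs the GPU.
def fps(cpu_ms, gpu_ms):
    # The slower side dictates the frame rate.
    return 1000.0 / max(cpu_ms, gpu_ms)

# Light 1080p load: both GPUs finish well before the CPU does -> identical FPS.
print(fps(cpu_ms=4.0, gpu_ms=2.0), fps(cpu_ms=4.0, gpu_ms=1.5))    # 250 vs 250

# Heavy 4K load: GPU-bound, so the faster card's extra muscle shows up directly.
print(fps(cpu_ms=4.0, gpu_ms=20.0), fps(cpu_ms=4.0, gpu_ms=15.0))  # 50 vs ~66.7
```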
This is just an unfortunate side effect as we cross over between 1080p, QHD and 4K. The current flagship cards are flat-out overpowered for almost everything at 1080p, but still not quite powerful enough for full-fat 4K in some situations.
Thermals should never be an issue in a desktop; they should only really come into legitimate play in thinner and lighter laptops.