
Originally Posted by
Kirsten_Rev
I'm not sure why we've shifted from discussing HDR, but I want to point out a couple of things here, because your post is a bit misleading.
Firstly, referring generically to "4K" really doesn't mean much. If you're talking about resolution, then XIV is already 4K-capable: native on PC, and upscaled on the PS4 Pro. People aren't asking for it because it's already here. And for rendering at higher resolutions, you're right that a GPU's speed matters more than its memory.
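
To put rough numbers on that last point (purely a back-of-envelope sketch; the single colour target and 32-bit depth buffer are my assumptions, not anything XIV actually does):

# Rough back-of-envelope: why raw 4K rendering leans on GPU speed, not memory.
# Assumes one 32-bit colour target plus a 32-bit depth buffer per frame -
# real engines use more render targets, but the ratio between resolutions holds.

def framebuffer_mb(width, height, bytes_per_pixel=8):  # 4 B colour + 4 B depth
    return width * height * bytes_per_pixel / (1024 ** 2)

for name, (w, h) in [("1080p", (1920, 1080)), ("4K", (3840, 2160))]:
    print(f"{name}: {w * h:,} pixels shaded per frame, "
          f"~{framebuffer_mb(w, h):.0f} MB of framebuffer")

# 4K means ~4x the pixels to shade every frame (so roughly 4x the GPU work),
# but only a few tens of MB of extra framebuffer - trivial on a multi-GB card.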
However, this isn't what most people mean when they ask for 4K. The resolution in and of itself is basically just a resource-intensive, high-quality form of anti-aliasing if it isn't also paired with 4K assets, primarily textures. I can run Half-Life 2 at 4K, but it doesn't look all that improved until I install UHD texture mods. So while textures != resolution, they are absolutely related, and having one without the other is largely useless.
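
A quick way to see why, with made-up but plausible numbers (the 1024x1024 texture size and the screen coverage are assumptions purely for illustration):

# Rough texel-density check: the same texture is stretched over more pixels at 4K.
# The 1024x1024 texture and the assumption that it covers a quarter of the
# screen's width are made up for illustration; only the ratio matters.

TEXTURE_SIZE = 1024  # texels along one edge

def texels_per_pixel(screen_width, coverage=0.25):
    pixels_covered = screen_width * coverage
    return TEXTURE_SIZE / pixels_covered

for name, width in [("1080p", 1920), ("4K", 3840)]:
    print(f"{name}: ~{texels_per_pixel(width):.2f} texels per screen pixel")

# ~2.13 texels per pixel at 1080p, ~1.07 at 4K: the same asset that looked
# sharp at HD is right on the edge of looking soft at 4K, which is why the
# higher resolution mostly buys cleaner edges rather than more detail.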
As to your comment about GPUs, I really can't make sense of that. There aren't cards with 4x the speed and half the memory; speed and memory virtually always scale with each other. You won't find 8GB of memory on a 980 Ti, for example.
Finally, your comment about the model polygon edges also makes little sense. Higher resolutions never, ever result in 'more' polygon edges being visible than lower resolutions do, and that visibility has nothing to do with some claimed upper limit on smoothing. It is worth noting, though, that polygon counts do not upscale with resolution: as with textures, models whose polygon counts were designed primarily around standard HD resolutions will look less impressive at 4K compared to models designed with 4K in mind. So there is that.
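
To illustrate that last point with some made-up numbers (the segment count and silhouette size are assumptions for the sake of the example, not anything taken from XIV's actual models):

# Rough sketch: the same polygon budget reads as more "angular" at 4K.
# Model a round silhouette as a circle approximated by N straight segments;
# the worst-case gap between the true curve and a segment is
# r * (1 - cos(pi / N)). The 24 segments and the silhouette radius of 10% of
# screen height are made-up numbers purely for illustration.

import math

SEGMENTS = 24
RADIUS_FRACTION = 0.10  # silhouette radius as a fraction of screen height

def facet_error_pixels(screen_height):
    radius_px = screen_height * RADIUS_FRACTION
    return radius_px * (1 - math.cos(math.pi / SEGMENTS))

for name, height in [("1080p", 1080), ("4K", 2160)]:
    print(f"{name}: worst-case faceting error ~{facet_error_pixels(height):.2f} px")

# ~0.9 px at 1080p (hidden inside a single pixel) vs ~1.8 px at 4K: the mesh
# hasn't changed, but the same deviation now spans visibly more pixels, so
# models budgeted for HD look less impressive once everything else gets sharper.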