If the PS4 Pro can run 4K at 30fps, I'm not sure that's really a problem. It also indicates that it will happily run at 1080p60. You don't need to be a genius to realize that a PS4 Pro won't compete with a PC containing a graphics card that costs more than a PS4 Pro.
They do and they don't. They can perceive a difference in the fluidity of the image at higher framerates, but the eye cannot distinguish specific differences between frames past 60Hz. The entire argument is pointless because full HD at 60fps is quite sufficient, and 120Hz TVs will interpolate extra frames to smooth things out regardless. 4K resolution is a more debatable benefit. Resolutions above 1080p do improve picture quality, but at the 4K mark you are creating pixels smaller than the human eye can resolve, unless you have a truly massive screen or are sitting 2 feet from it. At a normal viewing distance, a 40-60 inch screen provides little to no appreciable benefit. Forcing hardware to push 4 times more pixels for a negligible gain is kind of pointless; I'd rather have a higher quality 1080p image with a solid 60fps.
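To put a rough number on the "pixels smaller than the eye can resolve" claim, here's a back-of-the-envelope sketch. Everything in it is illustrative: the function name is mine, it assumes a 16:9 panel, and it uses the common rule of thumb that normal visual acuity is around 1 arcminute per detail.

```python
import math

def pixel_arcminutes(diagonal_in, horizontal_px, distance_in, aspect=16/9):
    """Angular size of one pixel, in arcminutes, for a 16:9 screen
    of the given diagonal, viewed from the given distance (inches)."""
    # Screen width derived from the diagonal and aspect ratio.
    width_in = diagonal_in * aspect / math.sqrt(1 + aspect**2)
    pixel_in = width_in / horizontal_px
    radians = 2 * math.atan(pixel_in / (2 * distance_in))
    return math.degrees(radians) * 60

# A 55" 4K screen viewed from 8 feet (96"): one pixel spans about
# 0.45 arcminutes -- below the ~1 arcminute acuity rule of thumb.
print(round(pixel_arcminutes(55, 3840, 96), 2))  # → 0.45
# The same screen at 1080p is about 0.89 arcminutes -- right at the limit.
print(round(pixel_arcminutes(55, 1920, 96), 2))  # → 0.89
```

On those (illustrative) numbers, a 4K pixel at normal living-room distance is indeed finer than the eye can resolve, which is the point being argued above.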
Last edited by Kosmos992k; 11-01-2016 at 06:20 AM.
It won't. You can save my posts for future reference if you want, for when it is released: the PS4 simply doesn't have enough CPU power to run in towns at 60FPS, even at 1080p. That's what matters when there are a lot of people around, the CPU, not the GPU; I mentioned the GPU only in the context of playing at 4K.
I like how you jump on people, yet you are the one who has no clue. Here, let me help you: https://www.youtube.com/watch?v=m6igZbQm75s
The CPU is clocked about 30% higher than the PS4's CPU. Unless you have evidence that a) the framerate in crowded towns is specifically CPU bound, and b) the PS4's CPU pegs out in a crowded town, you're using conjecture instead of fact. Also, you cannot quantify the actual in-game performance improvement of the new APU until it is seen in the wild. Finally, the Jaguar cores had to be reworked for the die shrink; they are not the same cores as in the original CPU, and there are unspecified changes and optimizations in that which can improve performance.
I have quite a large clue, and I don't rely on YouTube videos as evidence. The human eye is indeed an analogue device; however, its ability to detect specific changes to an image at more than 60 frames per second is extremely limited. The eye is very good at seeing fluidity in motion, so you can sense the difference between 30, 60, and 120 frames per second. Eventually, though, the increased number of frames no longer matters because the eye cannot sense the difference.
Here's some actual research for you.
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2826883
'Young' people had a mean sensitivity of about 18ms, which is just under 56 frames per second.
Older people have a mean of about 22ms, which is about 45 frames per second.
The best 'young' person in the study rated at 5ms, which is around 200 frames per second. Fighter pilots have been measured to respond at a speed that would equate to 220 frames per second. For the general population, however, something between 30 and 120 is the limit, regardless of age.
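The frame-rate figures above are just the reciprocal of the frame-time thresholds. A trivial sketch of the conversion (function name is mine, purely illustrative):

```python
def ms_to_fps(frame_time_ms):
    """Convert a frame-time threshold in milliseconds to the equivalent
    frames per second (1000 ms in a second / ms per frame)."""
    return 1000.0 / frame_time_ms

# The study's thresholds, converted:
for label, ms in [("young mean", 18), ("older mean", 22), ("best subject", 5)]:
    print(f"{label}: {ms} ms ≈ {ms_to_fps(ms):.0f} fps")
# young mean: 18 ms ≈ 56 fps
# older mean: 22 ms ≈ 45 fps
# best subject: 5 ms ≈ 200 fps
```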
According to Wikipedia, the PS4 CPU runs at 1.6GHz, which is pretty low; my CPU is overclocked to 4.2GHz, and in MMOs the higher the frequency the better, simply because most MMOs run their main logic in a single thread. If you compare benchmarks of a new CPU at, say, 3.6GHz against a previous-gen CPU at 4.0GHz, the single-thread results are better on the higher-frequency CPU.
Even if PC CPUs were only clocked 30% higher than the PS4's, it would be a significant change; my CPU's default clock is 3.4GHz, and the difference at 4.2GHz is great in every game.
If you want evidence, here's a reply from Trion CEO on Reddit: https://www.reddit.com/r/MMORPG/comm...bound/clj0tyf/
And what stresses the CPU in games: http://www.tomshardware.co.uk/answer....html#11371222
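The clock-scaling claim above can be sketched as a first-order estimate. To be clear, this assumes a *perfectly* single-thread, CPU-bound workload, which is a best case; real games scale less than linearly with clock, and the function name and numbers here are mine, not from any benchmark.

```python
def scaled_fps(base_fps, base_clock_ghz, new_clock_ghz):
    """Upper-bound estimate: fps scales linearly with clock speed for a
    workload that is entirely bound by a single CPU thread."""
    return base_fps * new_clock_ghz / base_clock_ghz

# Hypothetical: 45 fps at a 3.4 GHz default clock; at 4.2 GHz the
# best-case single-thread-bound result would be about 55.6 fps.
print(round(scaled_fps(45, 3.4, 4.2), 1))  # → 55.6
```

Anything GPU-, IO-, or memory-bound will see far less than this, which is exactly what the rest of the thread argues about.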
And?
Seriously?
The first linked post is talking more about accusations, by people who don't understand very much, that coders are lazy because something doesn't work the way they think it should. That's not quite a treatise on why MMOs are CPU bound in crowded areas. A Reddit discussion that's about as deep as a puddle is not exactly compelling evidence that FFXIV is CPU bound.
As for that second link, I'm certainly not taking a Tom's Hardware forum discussion as some form of objective proof either.
You're going to have to work harder than that. Modern games offload a lot of the work that was originally CPU work onto the GPU; the whole GPGPU concept should be a clear indicator of that. The biggest issues in rendering a large, busy area in any multiplayer game come down to two things: loading and rendering the huge number of models and animations needed in the environment. Also, player animations are not like the scripted motions of NPCs: showing player motion and actions needs a constant stream of data about those things so the game can perform the animation and rendering. Depending on the engine design, and therefore what has and has not been offloaded to the GPU, your CPU might get stressed. On PS3, for example, the CPU actually took some of the load that might otherwise be handled by the GPU, because the SPEs were well suited to it. Games that made use of that functionality absolutely ran smoother, with better graphics and effects, than games that did not.
In addition to that, when you have so many player characters milling around, the game is constantly receiving data from the server telling it where each player is, what they are doing, where they are moving, etc. Just handling that incoming data can itself cause problems, because the data arrives asynchronously and has to be handled as it arrives. That doesn't even begin to cover loading the models and textures for all the characters as they arrive or enter the draw distance. You know how it takes more than a few seconds for an HDD-based client to load you into a new zone, but an SSD system is very much quicker? That's not the CPU, that's IO delay, my friend.
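A common way engines cope with that asynchronous stream of player updates is to queue them and drain the queue on the main thread with a per-frame budget, so a burst of arrivals can't stall a frame indefinitely. This is a toy sketch of that general pattern, not FFXIV's actual code; all names and the budget value are made up for illustration.

```python
from collections import deque

class UpdateQueue:
    """Toy sketch: server updates arrive asynchronously and are queued,
    then applied on the main thread a bounded number per frame."""
    def __init__(self, budget_per_frame=64):
        self.pending = deque()
        self.budget = budget_per_frame

    def on_packet(self, update):
        # Called from the network side as data arrives.
        self.pending.append(update)

    def drain(self, apply):
        # Called once per frame; applies at most `budget` updates.
        handled = 0
        while self.pending and handled < self.budget:
            apply(self.pending.popleft())
            handled += 1
        return handled

q = UpdateQueue(budget_per_frame=3)
for i in range(5):
    q.on_packet(("move", i))      # burst of 5 player-position updates
applied = []
print(q.drain(applied.append), len(q.pending))  # → 3 2 (3 applied, 2 deferred)
```

The trade-off is latency for smoothness: deferred updates are applied a frame or two late, but the frame time stays bounded even when a crowd pours in.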
You know when you are in a crowded town and everyone is essentially idle, and the framerate is great? That's because the game isn't CPU bound in that crowded area, and once the models and textures are loaded, it's up to the GPU to keep up. But when everyone lights up their actions and the effects are flying and characters are moving, your GPU is working overtime to keep up, and the frame rate suffers.
Anyone who's participated in a hunt with a lot of players has seen this happen. As people arrive and wait to begin, the frame rate is pretty steady unless some PCs are having a fireworks contest while they wait. Then when battle is joined the frame rate takes a hit as soon as people open up with their attacks. All the animation and effects have to be rendered, and there are a lot of them. Once the mark is dead, and people cease the light show, the frame rate recovers, and obviously recovers even more as people leave the area. Again, part of the problem here is IO speed because the game has to load a lot of models and textures for the GPU to work with.
Want another way to see the impact of this? Enter a really crowded space and walk right up to a wall, or into a corner, and point the camera at the corner. The game doesn't render what is not in view, and the frame rate will recover. The zone is still packed, but because the game isn't stressing the GPU by rendering everything in the zone, only what is in sight, it isn't stuttering. Turn around and face the crowd and it'll drop again.
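The "face the corner" trick works because of view culling: anything outside the camera's field of view is never submitted to the GPU. Here's a deliberately simplified 2D sketch of the idea (real engines use a full 3D frustum test plus occlusion culling; the function and numbers here are illustrative only).

```python
import math

def in_view(camera_pos, camera_dir, fov_deg, point):
    """Toy visibility test: is `point` within the camera's horizontal
    field of view? camera_dir is the facing angle in radians (2D)."""
    dx, dy = point[0] - camera_pos[0], point[1] - camera_pos[1]
    angle_to_point = math.atan2(dy, dx)
    # Smallest signed angular difference, wrapped into [-pi, pi].
    delta = abs((angle_to_point - camera_dir + math.pi) % (2 * math.pi) - math.pi)
    return delta <= math.radians(fov_deg) / 2

# Facing +x (into the corner) with a 90° FOV:
print(in_view((0, 0), 0.0, 90, (10, 1)))   # → True  (in front: gets drawn)
print(in_view((0, 0), 0.0, 90, (-10, 0)))  # → False (the crowd behind: culled)
```

Turning the camera around flips the result, which is exactly the frame-rate swing described above.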
In a crowded area, what is it that you think the CPU is having to do that taxes it? The CPU isn't running the effects, doing the physics or geometry, or rendering the models of every PC, NPC, item, building, background, etc. Even when you are pointed into a corner, the CPU is still loading and handling the textures and models of players in the zone. So what is it that you think is so taxing? It's loading data for the GPU, but that is far more IO bound than CPU bound.
BTW, the PS4's GPU has direct system memory access as well as a shared L2 cache with the CPU, including synchronization. Hand-off between the CPU and GPU is very efficient, and the GPU can independently load things from memory as needed, not just from the memory set aside for it at the time. This reduces the load on the CPU when bringing assets into memory for the GPU.
Another way to see this impact is to turn off every GPU based graphical enhancement and feature and see how your plain Jane vanilla game runs. I bet you your frame rates are much better. What do you suppose that would indicate?