
  1. #1
    Player
    KisaiTenshi's Avatar
    Join Date
    Sep 2013
    Location
    Gridania
    Posts
    2,775
    Character
    Kisa Kisa
    World
    Excalibur
    Main Class
    White Mage Lv 100
    Quote Originally Posted by Kewitt View Post
    I can see why SE would want to do this. But honestly, they are already talking about the PS5, and the Xbox Two will likely be coming by 2020. Why develop for a system that is 5 years into its life cycle, for a 5-year-old game, when most console generations last around 7 years?

    Knowing SE's development cycle, they should start work on their next MMO for the PS6 and Xbox Three.
    One can't develop for what doesn't exist. Hardware isn't getting faster either. The PS4 and Xbox One have slower CPUs than their predecessors, just with more cores (which makes backwards compatibility nearly impossible without a recompile).

    The Xbox 360 CPU was a 3.2GHz PPC with 3 cores; the Xbox One is an x86-64 with 8 cores at 1.75GHz (2.3GHz on the X). The Xbox 360 has 48 (unified) shader cores; the Xbox One has 768 (2560 on the X).

    The PS3 has one 3.2GHz PPC core with 7 DSP cores. The PS4 is an x86-64 with 8 cores at 1.6GHz (2.13GHz on the Pro). The PS3 has 32 (non-unified) shader cores; the PS4 has 1152 shader cores (2304 on the Pro).

    By comparison, the PS3 GPU is a GeForce 7800 GTX, while the Xbox 360 GPU is comparable to an ATI X1900. Their CPU cores are roughly equal to those of "Core Duo" systems from the same era.

    The PS4/Xbox One GPU is roughly equal to an AMD RX 480, while the Pro/X models are closer to an AMD RX 580, at least in raw power. The RX 480/RX 580 is basically direct competition for the GeForce GTX 1060. The CPUs are equivalent to roughly an AMD FX-8120, and have no Intel equivalent unless you count hyperthreading (which cuts the ALU count in half), in which case a quad-core Intel Core i7-4700HQ @ 2.40GHz is probably the closest thing.

    Which means that the PS4 Pro/Xbox One X is roughly on par with a PC that has a 2.4GHz CPU with 8 hyperthreaded cores and a GeForce GTX 1060. Your average laptop falls very far short of this. (For a rough sanity check on the GPU numbers, see the back-of-the-envelope math below.)
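
    Here is that sanity check: a back-of-the-envelope single-precision compute estimate using the common shader cores x clock x 2 (one FMA = two ops) formula. The clock figures are my assumptions from public spec sheets, and boost clocks muddy direct comparisons, so treat the output as ballpark only.

    Code:
        # Ballpark GPU compute: shader cores x clock (GHz) x 2 ops per FMA = GFLOPS.
        # The clock figures below are assumptions taken from public spec sheets.
        gpus = {
            "PS4":        (1152, 0.800),
            "PS4 Pro":    (2304, 0.911),
            "Xbox One":   ( 768, 0.853),
            "Xbox One X": (2560, 1.172),
            "RX 480":     (2304, 1.120),
            "GTX 1060":   (1280, 1.506),
        }

        for name, (cores, clock_ghz) in gpus.items():
            tflops = cores * clock_ghz * 2 / 1000
            print(f"{name:<11} ~{tflops:.2f} TFLOPS")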

    A developer cannot target a PS5 without a devkit, and no devkits exist; instead, the Pro/X models came out as 4K versions of their predecessors, otherwise unchanged. 4K performance is going to be miserable on these devices, as they can't do 4Kp60 without sacrificing everything else, so you're still better off running them at 1080p.

    Likewise, many 4K monitors and televisions don't actually support 4Kp60 because they don't have HDMI 2.0 ports, and at HDR bit depths HDMI 2.0 only allows 4:2:2 pixel encoding at 4Kp60, so you lose half the color information. Rec.2020 still isn't available because no panels actually have true HDR yet. A theoretical PS5 would have to target 8K and Rec.2020, and there just is no hardware that does that yet. To do an 8K render presently you would need at least two GeForce 1080 Tis, and most "gaming" PCs aren't capable of this, as the PCIe bandwidth doesn't exist on anything designed for desktop use.
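
    To put numbers on the HDMI 2.0 point: the check below uses the standard CTA-861 4K60 pixel clock of 594MHz (blanking included) against HDMI 2.0's roughly 14.4Gbps of effective video data (18Gbps TMDS minus 8b/10b overhead). The chroma/bit-depth table is simplified (real HDMI 4:2:2 signalling uses a 12-bit container), so this is illustrative, not a cable spec.

    Code:
        # HDMI 2.0: 600 MHz TMDS x 3 lanes x 10 bits = 18 Gbps raw,
        # ~14.4 Gbps effective video data after 8b/10b encoding.
        HDMI20_EFFECTIVE_GBPS = 14.4
        PIXEL_CLOCK_4K60_HZ = 594e6  # CTA-861 4Kp60 timing, includes blanking

        # (chroma subsampling, bits per component) -> bits per pixel (simplified)
        modes = {
            ("4:4:4", 8):  24,
            ("4:4:4", 10): 30,
            ("4:2:2", 10): 20,
            ("4:2:0", 10): 15,
        }

        for (chroma, depth), bpp in modes.items():
            gbps = PIXEL_CLOCK_4K60_HZ * bpp / 1e9
            verdict = "fits" if gbps <= HDMI20_EFFECTIVE_GBPS else "does not fit"
            print(f"4Kp60 {chroma} {depth}-bit: {gbps:5.2f} Gbps -> {verdict}")

    The takeaway: 8-bit 4:4:4 just squeezes through, but 10-bit HDR at 4Kp60 forces chroma subsampling.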

    No, I think 8K's only real application will be in VR, and it will have a limited shelf life, since people don't want to destroy their eyes. That is a dead end without some drastic change in brain-computer interfaces. HMDs will simply never take off as anything other than a way to privately watch video or play a game without distraction.

    CPUs aren't getting any faster; there's nowhere left to go to shrink the die, and GPUs are tied to the same problem. So expect the PS4 Pro/Xbox One X to stick around longer than the PS3/Xbox 360, by virtue of the fact that CPUs/GPUs won't be getting any faster, just fatter. Instead of trying to make 5GHz CPUs, we will instead see 32-core CPUs that operate at 2.4GHz. Instead of 3GHz GPUs, we will see 1.5GHz GPUs with 8000 shader cores (e.g. two GeForce 1080 Tis glued together). The sketch below shows why that trade is a poor one for games.
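
    Here is the sketch: a minimal Amdahl's-law comparison of "8 fast cores" versus "32 slow cores". The 0.6 parallel fraction is an assumption I picked for a game-like workload, not a measured figure.

    Code:
        # Amdahl's law: speedup(n) = 1 / ((1 - p) + p / n),
        # where p is the fraction of the workload that parallelizes.
        def effective_speed(clock_ghz: float, cores: int, p: float) -> float:
            """Single-core-equivalent throughput: clock x Amdahl speedup."""
            return clock_ghz / ((1 - p) + p / cores)

        P_GAME = 0.6  # assumed parallel fraction for a game engine

        print(effective_speed(5.0, 8, P_GAME))   # ~10.5 "GHz-equivalents"
        print(effective_speed(2.4, 32, P_GAME))  # ~5.7  "GHz-equivalents"

    Under that assumption the fat 32-core part actually loses on game-like workloads, which is exactly why clock-starved many-core chips are a mixed blessing.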
    (2)
    Last edited by KisaiTenshi; 05-06-2018 at 01:50 PM.

  2. #2
    Player
    Tint's Avatar
    Join Date
    Aug 2013
    Location
    In the right-hand attic
    Posts
    4,344
    Character
    Karuru Karu
    World
    Shiva
    Main Class
    Fisher Lv 100
    Quote Originally Posted by KisaiTenshi View Post
    Bahnhof
    I have nooooo idea what you are talking about, but it sounds good ^^
    (0)

  3. #3
    Player
    Kewitt's Avatar
    Join Date
    Nov 2012
    Posts
    1,359
    Character
    Ewitt Rainbow
    World
    Balmung
    Main Class
    Fisher Lv 100
    Quote Originally Posted by KisaiTenshi View Post
    One can't develop for what doesn't exist snip
    Rumors are that PS5 dev kits are out, and 2 years before releasing hardware is the correct timing; it may not be the final version of the dev kit, but close.
    Also, development of FFXV was originally set for the PS3; the PS4 was 5 years away when development started. Sure, it was mostly tossed out, but you don't need hardware to make assets in 4K, which is most of the development time. Ever make 3D wireframes in the '90s? It took a while, with more and more polygons and what people expect from triple-A games.
    https://www.youtube.com/watch?v=Vh9msqaoJZw In a few years' time, when a midrange home gaming PC can do this, consoles will still lag behind, but the next-gen Pro iteration will get close.

    As far as CPUs not getting faster:
    https://www.cpubenchmark.net/compare...-3770K/3098vs2
    That looks like a good difference in synthetic benchmarks, both single-thread and all-core, and that's over 5 years of CPU change.

    In the same time on the GPU side we went from the GTX 680 to the GTX 1080 Ti, and that is a world of difference.
    https://www.videocardbenchmark.net/c...9&cmp%5B%5D=41

    Sure, it's not double the speed every 18 months, but Moore's law is ending, not over. Less power, more speed, and we are still close to doubling the transistors per die. Sure, 7nm will be around for a while, but it is coming shortly. (The quick math below shows how far we actually are from the classic 18-month doubling.)
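
    Here is the quick math: the classic 18-month doubling works out to roughly 59% compound growth per year, while a 2012-to-2017 flagship single-thread comparison lands closer to 5% per year. The benchmark scores below are illustrative stand-ins, not exact Passmark numbers.

    Code:
        # Annual growth implied by doubling every 18 months:
        moore_per_year = 2 ** (12 / 18) - 1
        print(f"18-month doubling = {moore_per_year:.0%}/year")  # ~59%

        # Illustrative single-thread scores (assumed, not exact Passmark data):
        score_2012, score_2017, years = 2100.0, 2700.0, 5.0
        cagr = (score_2017 / score_2012) ** (1 / years) - 1
        print(f"Observed single-thread growth ~ {cagr:.1%}/year")  # ~5%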
    (1)
    Last edited by Kewitt; 05-06-2018 at 06:58 PM.

  4. #4
    Player
    KisaiTenshi's Avatar
    Join Date
    Sep 2013
    Location
    Gridania
    Posts
    2,775
    Character
    Kisa Kisa
    World
    Excalibur
    Main Class
    White Mage Lv 100
    Quote Originally Posted by Kewitt View Post
    Rumors are that PS5 dev kits are out, and 2 years before releasing hardware is the correct timing; it may not be the final version of the dev kit, but close.
    Also, development of FFXV was originally set for the PS3; the PS4 was 5 years away when development started.
    A single tweet on Twitter is not proof; it's speculation.

    I can speculate too:
    The PS5 will be Sony's final attempt at making a VR crown for itself. After it utterly fails to sell consoles (like the Kinect's motion controls on the Xbox 360/One), it will get discounted without the VR kit, and VR will hopefully be buried for another two decades.

    Apple is working on AR/VR too. There are some practical things that can be done with AR (augmented reality), but they're largely specific to STEM fields, not gaming. Gamers don't commit to fads, so you'll see a few more Pokemon Go clones and then we're done with that mess too.

    Microsoft already had to lick its wounds over the Kinect failure. It was a nice try at motion controls, but that is simply NOT what gamers want. Had they combined it with a reasonable HMD for the PC, perhaps it would have been the right control scheme for VR. But I think that's buried now.

    Kinects are actually one way of doing MMD (that's MikuMikuDance) motion input (see Virtual YouTubers), but no other product out there does this except FaceRig, and FaceRig doesn't need a Kinect. However, it's not worth Microsoft's investment if they can't sell millions of Kinects, when the non-gaming market for them is about 50 units. And just like gaming, you need a high-end system to do real-time 3D animation.

    It may sound like I'm being really pessimistic about what will come out next, but that's because CPU performance is flat, even at the high end, and has been for 6 generations. This is why games rarely benefit from a better CPU.
    https://www.cpubenchmark.net/singleThread.html

    There is a much steeper curve on the GPU side, but you will only get the maximum performance under perfect cooling conditions; otherwise the GPU will just throttle itself back, and that extra $400 you spent on the GPU sits there doing nothing. (A toy model of that throttling is sketched below.)
    https://www.videocardbenchmark.net/high_end_gpus.html
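
    Here is that toy throttling model. Every constant in it is made up for illustration (real GPUs use far more sophisticated boost/throttle algorithms), but it shows the basic feedback loop: the clock hunts around whatever level the cooler can actually dissipate.

    Code:
        # Toy GPU boost/throttle loop: heat in scales with clock, heat out is
        # fixed by the cooler; the clock steps down past a temperature limit
        # and steps back up when there is headroom. All constants are invented.
        TEMP_LIMIT_C = 83.0     # assumed throttle point
        COOLING_W = 180.0       # assumed cooler capacity
        MAX_CLOCK_MHZ = 1900.0

        temp_c, clock_mhz = 40.0, MAX_CLOCK_MHZ
        for second in range(300):
            power_w = 0.12 * clock_mhz              # made-up power model
            temp_c += 0.05 * (power_w - COOLING_W)  # made-up thermal mass
            if temp_c > TEMP_LIMIT_C:
                clock_mhz -= 25.0
            elif clock_mhz < MAX_CLOCK_MHZ:
                clock_mhz += 25.0

        # Settles near the cooler-limited clock (~1500 MHz here), not the max.
        print(f"after 5 min: ~{clock_mhz:.0f} MHz at {temp_c:.1f} C")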
    (0)

  5. #5
    Player
    Sebazy's Avatar
    Join Date
    Aug 2013
    Location
    Gridania
    Posts
    3,468
    Character
    Sebazy Spiritwalker
    World
    Ragnarok
    Main Class
    White Mage Lv 90
    Quote Originally Posted by KisaiTenshi View Post
    One can't develop for what doesn't exist. Hardware isn't getting faster either. The PS4, Xbox One have slower CPU's than their predecessors, just with more cores (which makes backwards compatibility nearly impossible without recompile's.)
    One correction here: whilst AMD's Jaguar cores aren't exactly known for their single-threaded performance, they are still significantly more powerful than their Xenon/Cell predecessors.

    Both the PS3's and 360's CPUs were actually very forward-thinking, with a strong emphasis on multi-threaded performance. Single-threaded performance was very poor, relying almost entirely on the high clock speed to get anywhere. No out-of-order execution, limited cache, and poor branch prediction (with none whatsoever on the PS3's 'DSP' cores) were highlights amongst the issues faced here.

    Saying they are comparable to a Core Duo is giving them too much credit in many ways; I'd put them more in the territory of a Pentium 4/D personally, and even that could be considered generous.

    I'd say you're about right in your estimates of the PS4/Xbone, but don't put so much stock in the raw clock speed of the chip; that hasn't meant much for over a decade now (e.g. compare a top P4 EE, an AMD FX-9590, and an i7-4790K).

    This tangent is kind of out of context, though; once again I'll stress that the PS3 version of FFXIV was not CPU-limited, in my eyes.

    Quote Originally Posted by KisaiTenshi View Post
    A theoretical PS5 would have to target 8K, Rec.2020, and there just is no hardware that does that yet.
    The market will be 'demanding' new-generation consoles long before even 8K checkerboard rendering becomes viable at a console-friendly price point.

    I also don't really get your comments about 'true HDR'. At an overly basic level, HDR media is mastered to absolute levels, sometimes way beyond what existing displays can deliver. A typical modern HDR screen takes that data and uses tone mapping to display the media in a way that (hopefully) suits that screen's capabilities, the goal being that it doesn't simply clip or blow out everything beyond its reach.

    Rec.2020 is a similar story: even absurdly expensive studio-orientated reference monitors don't hit 99% coverage yet, but that doesn't stop it being used as a target standard, despite most high-end HDR TVs falling around 70-80% coverage. Simply outputting it to a screen isn't problematic, even if the screen can't show it in its entirety.
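
    A minimal sketch of that tone-mapping idea, in code: content mastered in absolute nits passes through below a knee point, and everything above it is rolled off softly into the display's remaining headroom instead of hard-clipping. The 600-nit peak and knee position are assumptions for illustration, not any real TV's algorithm.

    Code:
        def tone_map(nits: float, display_peak: float = 600.0,
                     knee_frac: float = 0.75) -> float:
            """Soft-knee rolloff: pass-through below the knee, then compress
            everything above it into the remaining headroom (never clips)."""
            knee = knee_frac * display_peak
            if nits <= knee:
                return nits
            overshoot = nits - knee
            headroom = display_peak - knee
            return knee + headroom * overshoot / (overshoot + headroom)

        # A 4000-nit mastered highlight still lands on a 600-nit panel:
        for level in (100, 450, 1000, 4000):
            print(f"{level:>5} nits -> {tone_map(level):6.1f} nits")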

    Quote Originally Posted by KisaiTenshi View Post
    There is a much steeper curve on the GPU side, but you will only get the maximum performance under perfect cooling conditions; otherwise the GPU will just throttle itself back, and that extra $400 you spent on the GPU sits there doing nothing.
    https://www.videocardbenchmark.net/high_end_gpus.html
    You're being too pessimistic here. GPU thermals have been worse in the past; I vividly remember my stock GTX 480 would merrily approach 100°C at full load and had no issues running like that all day long. Rather, the issue with GPU workloads is simply that the game needs to actually use those resources. Compare a 1080 and a 1080 Ti running something easy-going such as CS:GO at 1080p and you'll barely see any improvement; both cards are simply sat around waiting for work. Switch to Far Cry 5 at 4K and it's a completely different story: both cases are GPU-limited, and thus there's enough load for the 1080 Ti's extra muscle to run away with it.
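
    That CPU-bound vs GPU-bound behaviour fits a tiny frame-time model: the frame rate is set by whichever of the CPU or GPU takes longer per frame. All the millisecond costs below are invented for illustration, not benchmark data.

    Code:
        # fps = 1000 / max(cpu_ms, gpu_ms), with gpu_ms scaling by megapixels.
        def fps(cpu_ms: float, gpu_ms_per_mpix: float, mpix: float) -> float:
            return 1000.0 / max(cpu_ms, gpu_ms_per_mpix * mpix)

        CPU_MS = 6.0                     # assumed per-frame CPU cost
        GTX_1080, GTX_1080TI = 2.4, 1.8  # assumed GPU ms per megapixel
        MPIX_1080P, MPIX_4K = 2.07, 8.29

        for name, cost in (("1080", GTX_1080), ("1080 Ti", GTX_1080TI)):
            print(f"{name}: {fps(CPU_MS, cost, MPIX_1080P):5.1f} fps @ 1080p, "
                  f"{fps(CPU_MS, cost, MPIX_4K):5.1f} fps @ 4K")

    At 1080p both cards hit the same CPU-imposed ceiling; at 4K the model goes GPU-bound and the Ti pulls ahead.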

    This is just an unfortunate side effect as we cross over between 1080p, QHD, and 4K. The current flagship cards are flat-out overpowered for almost everything at 1080p, but still not quite powerful enough for full-fat 4K in some situations.

    Thermals should never be an issue in a desktop; they should only really come into legitimate play in thinner and lighter laptops.
    (3)
    Last edited by Sebazy; 05-06-2018 at 08:39 PM.
    ~ WHM / badSCH / Snob ~ http://eu.finalfantasyxiv.com/lodestone/character/871132/ ~

  6. #6
    Player
    KisaiTenshi's Avatar
    Join Date
    Sep 2013
    Location
    Gridania
    Posts
    2,775
    Character
    Kisa Kisa
    World
    Excalibur
    Main Class
    White Mage Lv 100
    Quote Originally Posted by Sebazy View Post
    I'd say you're about right in your estimates of the PS4/Xbone, but don't put so much stock in the raw clock speed of the chip; that hasn't meant much for over a decade now (e.g. compare a top P4 EE, an AMD FX-9590, and an i7-4790K).

    This tangent is kind of out of context, though; once again I'll stress that the PS3 version of FFXIV was not CPU-limited, in my eyes.
    This is really the wrong site for tech nitpicks, hence I was trying not to post everything I know about these things; I know the average person is not impressed by specs, only visuals.

    Hence, 8K and Rec.2020 would "blow someone away", but the hardware isn't there, and isn't going to be there for quite a while.


    Quote Originally Posted by Sebazy View Post
    Thermals should never be an issue in a desktop, rather they should only really come into legitimate play in thinner and lighter laptops.
    That's why there is potential for a Nintendo Switch FFXIV. Even that device is more powerful than most "thin and light" laptops, because the thermal limits are hit really quickly in those.


    I linked to the CPU single-threaded Passmark to illustrate how flat CPU performance is; it's only "bumped" by die shrinks, nothing else. And the last 6 generations haven't actually put that extra die space into CPU power, but into a mediocre iGPU, built-in video decoders/encoders, and other fixed-function logic that most people don't use and that a PCIe GPU runs circles around. These fixed-logic blocks are primarily there so the CPUs can go into SoCs soldered to the motherboard; hence every desktop Intel chip since Haswell has been essentially identical to the laptop parts, with the laptop versions just clocked lower to reduce TDP. If you use the fixed-function blocks in a laptop, you magically get more battery life, but only if you use the system-native browsers/media players.

    At any rate, PS4/Xbox One crossplay is basically just software-developer politics, and there is no technical reason for it not to happen. The Switch is technically capable. The fact that there was a PS3 version, and thus PS3-tier assets already, means SE already has the tools necessary to scale graphics quality down for such a device, but it would be the inferior experience regardless.
    (0)
