  1. #221
    Player
    KisaiTenshi's Avatar
    Join Date
    Sep 2013
    Location
    Gridania
    Posts
    2,775
    Character
    Kisa Kisa
    World
    Excalibur
    Main Class
    White Mage Lv 100
    Quote Originally Posted by Kewitt View Post
    I can see why SE would want to do this. But honestly, they are already talking about the PS5 or an Xbox Two, likely coming by 2020. Why develop for a system five years into its life cycle, for a five-year-old game, when most console generations last around seven years?

    Knowing SE's development cycle, they should start work on their next MMO for the PS6 and Xbox Three.
    One can't develop for what doesn't exist. Hardware isn't getting faster, either. The PS4 and Xbox One have slower CPUs than their predecessors, just with more cores (which makes backwards compatibility nearly impossible without a recompile).

    The Xbox 360 CPU was a 3.2 GHz PPC with 3 cores; the Xbox One is an x86-64 with 8 cores at 1.75 GHz (2.3 GHz on the X). The Xbox 360 had 48 (unified) shader cores; the Xbox One has 768 shader cores (2560 on the X).

    The PS3 has one 3.2 GHz PPC core plus 7 DSP cores. The PS4 is an x86-64 with 8 cores at 1.6 GHz (2.13 GHz on the Pro). The PS3 has 32 (non-unified) shader cores; the PS4 has 1152 shader cores (2304 on the Pro).

    By comparison, the PS3 GPU is a GeForce 7800 GTX, while the Xbox 360 GPU is comparable to an ATI X1900. Their CPU cores are roughly equal to those of "Core Duo" systems from the same era.

    The PS4/Xbox One GPU is roughly equal to an AMD RX 480, while the Pro/X models are closer to an AMD RX 580, at least in "power". The RX 480/RX 580 is basically direct competition for the GeForce GTX 1060. The CPUs are equivalent to about an AMD FX-8120, and have no Intel equivalent unless you count Hyper-Threading (which cuts the per-thread ALU resources in half), in which case a quad-core Intel Core i7-4700HQ @ 2.40 GHz is probably the closest thing.
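
    To put rough numbers on "power": peak shader throughput is just shader cores × clock × 2 FLOPs (one fused multiply-add per core per cycle). A quick back-of-the-envelope sketch in Python (clock speeds are approximate, from memory, so treat the results as ballpark figures only):

    [CODE]
    # Rough peak FP32 throughput: shader cores * clock * 2 (one FMA per cycle).
    # Clock speeds are approximate; treat the results as ballpark figures.
    def tflops(shader_cores, clock_ghz):
        return shader_cores * clock_ghz * 2 / 1000.0

    for name, cores, ghz in [
        ("PS4", 1152, 0.80), ("Xbox One", 768, 0.85),
        ("PS4 Pro", 2304, 0.91), ("Xbox One X", 2560, 1.17),
        ("RX 480", 2304, 1.27),
    ]:
        print(f"{name:<10} {tflops(cores, ghz):5.2f} TFLOPs")
    # PS4 ~1.8, Xbox One ~1.3, Pro ~4.2, X ~6.0, RX 480 ~5.9
    [/CODE]

    That arithmetic is why the Pro/X land in RX 480/580 territory on paper.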

    Which means the PS4 Pro/Xbox One X is roughly on par with a PC that has a 2.4 GHz CPU with 8 hyper-threaded cores and a GeForce GTX 1060. Your average laptop falls very far short of this.

    A developer cannot target a PS5 without a devkit, and no devkits exist; instead, the "Pro/X" models came out as 4K versions of their previous models, otherwise unchanged. 4K performance is going to be miserable on these devices, as they can't do 4Kp60 without sacrificing everything else, so you're still better off running them at 1080p. Likewise, many 4K monitors and televisions don't actually support "4Kp60" because they don't have HDMI 2.0 ports, and even HDMI 2.0 can only carry 4Kp60 at HDR bit depths with 4:2:2 chroma subsampling, so you lose half the color information. Rec.2020 still isn't available yet because no panels actually have true HDR yet.

    A theoretical PS5 would have to target 8K and Rec.2020, and there just is no hardware that does that yet. To do an 8K render at present you would need at least two GeForce GTX 1080 Tis, and most "gaming" PCs aren't capable of this, as the PCIe bandwidth doesn't exist on anything designed for desktop use.
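
    The 4Kp60 limit is simple arithmetic. A sketch (this ignores blanking intervals, so real requirements are slightly higher; HDMI 2.0's usable payload after 8b/10b encoding is roughly 14.4 Gbit/s of its 18 Gbit/s raw rate):

    [CODE]
    # Why 4Kp60 HDR needs chroma subsampling on HDMI 2.0 (rough sketch).
    # Ignores blanking intervals; usable rate ~14.4 Gbit/s after 8b/10b coding.
    USABLE_GBPS = 14.4

    def gbps(w, h, fps, bits_per_sample, samples_per_pixel):
        return w * h * fps * bits_per_sample * samples_per_pixel / 1e9

    for bits, samples, label in [(8, 3, "4:4:4  8-bit"),
                                 (10, 3, "4:4:4 10-bit"),
                                 (10, 2, "4:2:2 10-bit")]:
        need = gbps(3840, 2160, 60, bits, samples)
        fits = "fits" if need <= USABLE_GBPS else "does not fit"
        print(f"{label}: {need:5.2f} Gbit/s -> {fits}")
    # ~11.9 fits, ~14.9 does not fit, ~10.0 fits (at half the chroma)
    [/CODE]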

    No, I think 8K's only real application will be in VR, and it will have a limited shelf life, since people don't want to destroy their eyes. That is a dead end without some drastic change in brain-computer interfaces. HMDs simply will never take off as anything other than a way to privately watch video or play a game without distraction.

    CPUs aren't getting any faster; there is nowhere left to go to shrink the die. GPUs are tied to the same problem. So expect the PS4 Pro/Xbox One X to stick around longer than the PS3/Xbox 360, by virtue of the fact that CPUs and GPUs won't be getting any faster, just fatter. Instead of trying to build 5 GHz CPUs, we will see 32-core CPUs that operate at 2.4 GHz; instead of 3 GHz GPUs, we will see 1.5 GHz GPUs with 8000 shader cores (e.g. two GeForce 1080 Tis glued together).
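
    The catch with "fatter, not faster" is Amdahl's law: whatever fraction of a frame is serial doesn't speed up no matter how many cores you add. A quick illustration (the 70% parallel figure is an invented example, not a measurement of any real game):

    [CODE]
    # Amdahl's law: speedup on n cores when a fraction p of the work parallelizes.
    # p below is an invented example, not a measurement of any real game.
    def amdahl(p, n):
        return 1.0 / ((1.0 - p) + p / n)

    p = 0.70
    for n in (4, 8, 32, 1000):
        print(f"{n:>4} cores: {amdahl(p, n):.2f}x")
    # 4: 2.11x, 8: 2.58x, 32: 3.11x, 1000: 3.32x -- capped at 1/(1-p) = 3.33x
    [/CODE]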
    (2)
    Last edited by KisaiTenshi; 05-06-2018 at 01:50 PM.

  2. #222
    Player
    Tint's Avatar
    Join Date
    Aug 2013
    Location
    In the right-hand attic
    Posts
    4,332
    Character
    Karuru Karu
    World
    Shiva
    Main Class
    Fisher Lv 100
    Quote Originally Posted by KisaiTenshi View Post
    Bahnhof ("train station", from the German idiom for "this is all Greek to me")
    i have nooooo idea what you are talking about, but it sounds good ^^
    (0)

  3. #223
    Player
    Kewitt's Avatar
    Join Date
    Nov 2012
    Posts
    1,351
    Character
    Ewitt Rainbow
    World
    Balmung
    Main Class
    Fisher Lv 100
    Quote Originally Posted by KisaiTenshi View Post
    One can't develop for what doesn't exist [snip]
    Rumor is that PS5 dev kits are already out, and two years before release is the correct timing for hardware. It may not be the final version of the dev kit, but it's close.
    Also, development of FFXV was originally targeted at the PS3; the PS4 was five years away when development started. Sure, most of that work was tossed out, but you don't need the hardware to make assets in 4K, and assets are where most of the development time goes. Ever make 3D wireframes in the '90s? It took a while, and both polygon counts and what people expect from triple-A games have only grown since.
    https://www.youtube.com/watch?v=Vh9msqaoJZw - in a few years' time, when a midrange home gaming PC can do this, consoles will still lag behind, but the next generation's Pro iteration will get close.

    As for CPUs not getting faster:
    https://www.cpubenchmark.net/compare...-3770K/3098vs2
    That looks like a good difference in synthetic benchmarks, both single-thread and all-core. That's over five years of CPU change.

    In the same period on the GPU side, we went from the GTX 680 to the GTX 1080 Ti on Nvidia's side, and that is a world of difference.
    https://www.videocardbenchmark.net/c...9&cmp%5B%5D=41

    Sure, it's not double the speed every 18 months, but Moore's law is ending, not over. Less power, more speed; we are still close to doubling the transistors per die. Sure, 7 nm will be around for a while, but it is coming shortly.
    (1)
    Last edited by Kewitt; 05-06-2018 at 06:58 PM.

  4. #224
    Player
    KisaiTenshi's Avatar
    Join Date
    Sep 2013
    Location
    Gridania
    Posts
    2,775
    Character
    Kisa Kisa
    World
    Excalibur
    Main Class
    White Mage Lv 100
    Quote Originally Posted by Kewitt View Post
    Rumor is that PS5 dev kits are already out, and two years before release is the correct timing for hardware. It may not be the final version of the dev kit, but it's close.
    Also, development of FFXV was originally targeted at the PS3; the PS4 was five years away when development started.
    A single tweet on Twitter is not proof; it's speculation.

    I can speculate too:
    The PS5 will be Sony's final attempt at claiming a VR crown for itself. After it utterly fails to sell consoles (like Kinect's motion controls on the Xbox 360/One), it will get discounted without the VR kit, and VR will hopefully be buried for another two decades.

    Apple is working on AR/VR stuff too. There are some practical things that can be done with AR (augmented reality), but they're largely specific to STEM fields, not gaming. Gamers don't commit to fads, so you'll see a few more Pokémon Go clones and then we're done with that mess too.

    Microsoft already had to lick its wounds over the Kinect failure. It was a nice try at motion controls, but that is simply NOT what gamers want to do. Had they combined it with a reasonable HMD for the PC, perhaps it would have been the right control scheme for VR. But I think that's buried now.

    Kinects are actually one way of doing MMD (that's MikuMikuDance) motion input (see Virtual YouTubers), but no other product out there does this except FaceRig, and FaceRig doesn't need a Kinect. However, it's not worth Microsoft's investment if they can't sell millions of Kinects, when the non-gaming market for them is about 50 units. And just like gaming, you need a high-end system to do real-time 3D animation.

    It may sound like I'm being really pessimistic about what will come out next, but that's because CPU performance is flat, even at the high end, and has been for six generations. This is why games never benefit from a better CPU:
    https://www.cpubenchmark.net/singleThread.html

    There is a much steeper curve on the GPU side, but you will only get the maximum performance under perfect cooling conditions; otherwise the GPU will just throttle itself back, and that extra $400 you spent on the GPU just sits there doing nothing.
    https://www.videocardbenchmark.net/high_end_gpus.html
    (0)

  5. #225
    Player
    Sebazy's Avatar
    Join Date
    Aug 2013
    Location
    Gridania
    Posts
    3,468
    Character
    Sebazy Spiritwalker
    World
    Ragnarok
    Main Class
    White Mage Lv 90
    Quote Originally Posted by KisaiTenshi View Post
    One can't develop for what doesn't exist. Hardware isn't getting faster either. The PS4, Xbox One have slower CPU's than their predecessors, just with more cores (which makes backwards compatibility nearly impossible without recompile's.)
    One correction here: whilst AMD's Jaguar cores aren't exactly known for their single-threaded performance, they are still significantly more powerful than their Xenon/Cell predecessors.

    Both the PS3's and 360's CPUs were actually very forward-thinking, with a strong emphasis on multi-threaded performance. Single-threaded performance was very poor, relying almost entirely on the high clock speed to get anywhere: no out-of-order execution, limited cache, and poor branch prediction (with none whatsoever on the PS3's 'DSP' cores) were highlights among the issues faced here.

    Saying they are comparable to a Core Duo is giving them too much credit in many ways; I'd have them more in Pentium 4/D territory personally, and even that could be considered generous.

    I'd say you're about right in your estimates of the PS4/Xbone, but don't put so much stock in the raw clock speed of the chip; that hasn't meant much for over a decade now (e.g. compare a top P4 EE, an AMD FX-9590 and an i7-4790K).

    This tangent is kind of out of context, though, as once again I'll stress: the PS3 version of FFXIV was not CPU-limited in my eyes.

    Quote Originally Posted by KisaiTenshi View Post
    A theoretical PS5 would have to target 8K, Rec.2020, and there just is no hardware that does that yet.
    The market will be 'demanding' new-generation consoles long before even 8K checkerboard rendering becomes viable at a console-friendly price point.

    I also don't really get your comments about 'true HDR'. At an overly basic level, HDR media is mastered to absolute levels, sometimes way beyond what existing displays can deliver. A typical modern HDR screen takes that data and uses tone mapping to display the media in a way that (hopefully) suits that screen's capabilities, the goal being that it doesn't simply clip or blow out everything that's beyond its reach. Rec.2020 is a similar story: even absurdly expensive studio-oriented reference monitors don't hit 99% coverage yet, but that doesn't stop it being used as a target standard, despite most high-end HDR TVs falling around 70-80% coverage. Simply outputting it to a screen isn't problematic, even if the screen isn't going to be able to show it in its entirety.
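
    For anyone wondering what tone mapping means concretely: it's just a curve that compresses mastered brightness into what the panel can actually do. A toy sketch (an extended Reinhard curve; real TVs use fancier roll-offs such as BT.2390, and the 600/4000-nit figures are illustrative, not any specific set):

    [CODE]
    # Toy HDR tone mapping: compress mastered luminance (nits) to a panel's peak
    # instead of hard-clipping. Extended Reinhard; real TVs use fancier curves.
    def tonemap_nits(nits, display_peak=600.0, content_peak=4000.0):
        l = nits / display_peak              # luminance, display-relative
        lw = content_peak / display_peak     # brightest mastered value, ditto
        out = l * (1.0 + l / (lw * lw)) / (1.0 + l)
        return out * display_peak            # content_peak maps to display_peak

    for n in (100, 600, 1000, 4000):
        print(f"{n:>5} nits mastered -> {tonemap_nits(n):6.1f} nits displayed")
    # ~86, ~307, ~389, 600 -- compressed smoothly rather than clipped at 600
    [/CODE]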

    Quote Originally Posted by KisaiTenshi View Post
    There is a much steeper curve on the GPU side, but you will only get the maximum performance under perfect cooling conditions, otherwise the GPU will just throttle itself back and that extra $400 you spend on the GPU just sits there doing nothing.
    https://www.videocardbenchmark.net/high_end_gpus.html
    You're being too pessimistic here. GPU thermals have been worse in the past; I vividly remember my stock GTX 480 would merrily approach 100°C at full load and had no issues running like that all day long. Rather, the issue with GPU workloads is simply that the game needs to actually use those resources. Compare a 1080 and a 1080 Ti running something easygoing such as CS:GO at 1080p and you'll barely see any improvement; both cards are simply sat around waiting for work. Switch to Far Cry 5 at 4K and it's a completely different story: both cases are GPU-limited, and thus there's enough load there for the 1080 Ti's extra muscle to run away with it.
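
    That "waiting for work" point reduces to a simple model: frame rate is capped by whichever of the CPU or GPU takes longer per frame, so a faster GPU only helps when the GPU is the long pole. A minimal sketch (all millisecond figures below are invented for illustration, not measurements):

    [CODE]
    # Minimal bottleneck model: fps = 1000 / max(cpu_ms, gpu_ms).
    # All frame timings below are invented for illustration, not measurements.
    def fps(cpu_ms, gpu_ms):
        return 1000.0 / max(cpu_ms, gpu_ms)

    # Easy game at 1080p: CPU-bound, so the bigger GPU changes nothing.
    print(fps(cpu_ms=4.0, gpu_ms=2.0))   # "1080":    250 fps
    print(fps(cpu_ms=4.0, gpu_ms=1.5))   # "1080 Ti": 250 fps (identical)
    # Heavy game at 4K: GPU-bound, so the bigger GPU pulls ahead.
    print(fps(cpu_ms=4.0, gpu_ms=22.0))  # "1080":    ~45 fps
    print(fps(cpu_ms=4.0, gpu_ms=16.0))  # "1080 Ti": ~63 fps
    [/CODE]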

    This is just an unfortunate side effect of crossing over between 1080p, QHD and 4K. The current flagship cards are flat-out overpowered for almost everything at 1080p, but still not quite powerful enough for full-fat 4K in some situations.

    Thermals should never be an issue in a desktop; they should only really come into legitimate play in thinner and lighter laptops.
    (3)
    Last edited by Sebazy; 05-06-2018 at 08:39 PM.
    ~ WHM / badSCH / Snob ~ http://eu.finalfantasyxiv.com/lodestone/character/871132/ ~

  6. #226
    Player
    KisaiTenshi's Avatar
    Join Date
    Sep 2013
    Location
    Gridania
    Posts
    2,775
    Character
    Kisa Kisa
    World
    Excalibur
    Main Class
    White Mage Lv 100
    Quote Originally Posted by Sebazy View Post
    I'd say you're about right in your estimates of the PS4/Xbone, but don't put so much stock in the raw clock speed of the chip; that hasn't meant much for over a decade now (e.g. compare a top P4 EE, an AMD FX-9590 and an i7-4790K).

    This tangent is kind of out of context, though, as once again I'll stress: the PS3 version of FFXIV was not CPU-limited in my eyes.
    This is really the wrong site for tech nitpicks, hence I was trying not to post everything I know about these things; I know that the average person is not impressed by specs, only visuals.

    Hence, 8K with Rec.2020 would "blow someone away", but the hardware isn't there, and isn't going to be there for quite a while.


    Quote Originally Posted by Sebazy View Post
    Thermals should never be an issue in a desktop, rather they should only really come into legitimate play in thinner and lighter laptops.
    That's why there is potential for a Nintendo Switch version of FFXIV. Even that device is more powerful than most "thin and light" laptops, because those hit their thermal limits really quickly.


    I linked the CPU single-threaded PassMark chart to illustrate how flat CPU performance is; it's only "bumped" by die shrinks, nothing else. And the last six generations haven't actually put that extra die space into CPU power, but into a trash iGPU, built-in video decoders/encoders, and other fixed-function logic that most people don't use and that a PCIe GPU runs circles around. These fixed logic blocks exist primarily so the CPUs can go into SoCs soldered to the motherboard; hence every desktop Intel chip since Haswell has been identical to the laptop parts, which are just clocked lower to reduce TDP. If you use the fixed-function blocks in a laptop, you magically get more battery life, but only if you use the system's native browsers and media players.

    At any rate, PS4/Xbox One crossplay is basically just a software-developer politics thing, and there is no technical reason for it not to happen. The Switch is technically capable. The fact that there was a PS3 version, and thus PS3-tier assets already, means that SE already has the tools necessary to scale graphics quality down for such a device, but it would be the inferior experience regardless.
    (0)

  7. #227
    Player
    Zeonx's Avatar
    Join Date
    Mar 2014
    Posts
    957
    Character
    Zeon Darksol
    World
    Cactuar
    Main Class
    Lancer Lv 100
    I wouldn't want it.

    Did you all see what happened when FFXI couldn't be upgraded anymore?

    FFXI was beautiful. I remember watching my cousin play it: it was shiny and sunny out in Bibiki Bay and the Dunes, the rain and fog on the Selbina ferry and the weather effects were so nice, and everything was so good.

    Because of limitations they started tearing graphics out of other places in the world, colors and other things. The sky used to look good; then you'd look up and see square grids all over, the sunny shine effects were gone, they removed the really cool fireworks that spelled things out in the sky, and everything just got really ugly as time went on.

    You want that to happen to FFXIV? Once crossplay happens, they'll start saying "oh, limitations, we can't do this or that unless we do this or that".

    Then you feel bad, because they say "oh, we're dropping Switch and other console support" (I don't think they will) and we will be stuck.

    Then FFXIV will become FFXI, until down the road they decide "OK, well, maybe we should drop it", but by that time it will be too late.
    (1)

  8. #228
    Player
    ReplicaX's Avatar
    Join Date
    Mar 2012
    Location
    Gridania
    Posts
    1,020
    Character
    Methos Ranperre
    World
    Jenova
    Main Class
    Ninja Lv 70
    Quote Originally Posted by Lynart View Post
    Look at Fortnite and Rocket League.
    Quote Originally Posted by alimdia View Post
    SE needs Sony to stop being greedy and allow crossplay with XB1.
    I'll put both your comments to rest: Sony is not being greedy.

    Both Rocket League's and Fortnite's devs designed their games around using a console user's data. Sony does not want player PSN data cross-shared, period. It's more than just a PSN ID; it's network access. If you actually read into both Fortnite's and Rocket League's development you would understand this.

    FFXIV, just like FFXI, is not in the same boat, as SE has its own servers and user accounts, and nothing is shared between any platform's network and user information. You turn on your console, log in to PSN, and you are handed over to SE, logging in with an SE account.

    The PS2 had the highest install base for XI for years. When the 360 attempted to attract JP players to XI, Sony didn't go "our install base is bigger, so no". MS waved Gold around, and failed in the JP market.
    (2)

  9. #229
    Player
    Lynart's Avatar
    Join Date
    May 2011
    Posts
    84
    Character
    Machiko Lienwyn
    World
    Goblin
    Main Class
    Paladin Lv 51
    ^Do you know why Sony doesn't want their network shared? lol I do

    We can speculate further, but take a look at the numerous cyberattacks on, and the lack of security across, Sony's entire network infrastructure. Guess what fixing that costs? Money.

    Also, nothing in your post shows that I was wrong about Sony not wanting to play ball... especially when Epic Games accidentally turned on crossplay for a few days (and it worked).

    Quote Originally Posted by KisaiTenshi View Post
    No, I think we will see 8K's only real application in VR, and it will have a limited shelf-life, since people don't want to destroy their eyes. That is a dead end without some drastic change in brain-computer interfaces. HMD's simply will never take off as anything other than a way to privately watch video or play a game without distraction.
    Do you have a Rift, PSVR, or VIVE? I don't think you'd be saying this if you did.

    While I don't use my Rift much, the entire VR market still being essentially at the prototype stage... VR will either become the future of gaming for commercial use, or people with houses will dedicate entire rooms to it. If you don't have a VR setup, I understand why you wouldn't understand this, because I was skeptical at first as well. The best analogy I've heard: advertising a VR system on a 2D screen is like advertising color on a black-and-white screen.

    VR is A LOT MORE than just 3D btw.

    Oh... forgot to mention: even if VR and AR fail in the entertainment sector, the commercial sector has already adopted them. Think anything CAD-related.
    (1)
    Last edited by Lynart; 05-08-2018 at 05:44 AM.

  10. #230
    Player
    KisaiTenshi's Avatar
    Join Date
    Sep 2013
    Location
    Gridania
    Posts
    2,775
    Character
    Kisa Kisa
    World
    Excalibur
    Main Class
    White Mage Lv 100
    Quote Originally Posted by Lynart View Post
    Do you have a Rift, PSVR, or VIVE? I don't think you'd be saying this if you did.
    I have eyeglasses, and thus far no HMD accommodates them.

    But I was really alluding to this:
    https://www.cnn.com/2018/04/27/healt...udy/index.html

    Blue LEDs are found in the backlights of everything that doesn't use a CCFL, and CCFLs contain mercury, which is primarily why we moved away from those; likewise, we moved away from CRTs due to the leaded glass required to reduce X-ray exposure. It's kind of bizarre how every advancement in "television"-type screens just moves the problem instead of eliminating it entirely.

    Quote Originally Posted by Lynart View Post
    While I don't use my Rift much due to the entire VR market being pretty much prototypical development...VR will either become the future of gaming for commercial use, or people with houses will dedicate entire rooms to them. If you don't have a VR setup, I understand why you wouldn't understand this because I was skeptical at first as well. The best analogy I heard was: advertising a VR system on a 2D screen is like advertising color on a black and white screen.

    VR is A LOT MORE than just 3D btw.

    Oh...forgot to mention: even if VR or AR fail the entertainment sector, the commercial sector has already adopted it. Think anything CAD related.
    Yeah, but commercial applications often don't have to contain costs. To get people to actually buy "VR" kit, someone needs to figure out how to create a VR system that doesn't require more physical space than a conventional game console. Half the draw of VR's potential is actually using your hands and legs, and that just is not practical.

    BCIs need to evolve to the point where they can actually figure out what movement the person is thinking about, and there has been research on this:
    https://research.utwente.nl/en/publi...-in-bci-gaming

    But until there is a way to simply "think" about moving in the game without an invasive implant, most VR stuff is just not going to have mass-market appeal. And solving the motion-sickness problem is going to require higher-resolution, higher-frame-rate devices that simply don't exist, and won't exist as long as the developers of these products keep trying to use low-quality cell-phone screens for them.
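
    To see the scale of that problem, pixel throughput is just resolution × refresh rate × eyes. A rough comparison (the "retinal" panel figures below are illustrative targets, not any specific headset):

    [CODE]
    # Pixels per second the GPU must shade: width * height * refresh * eyes.
    # The "retinal" VR figures are illustrative targets, not a real headset.
    def px_per_sec(w, h, hz, eyes=1):
        return w * h * hz * eyes

    flat_4k60 = px_per_sec(3840, 2160, 60)           # a 4Kp60 monitor
    vr_today  = px_per_sec(1080, 1200, 90, eyes=2)   # Rift/Vive-class panels
    vr_next   = px_per_sec(3840, 2160, 120, eyes=2)  # "4K per eye" at 120 Hz
    for label, v in [("4Kp60 flat", flat_4k60), ("VR today", vr_today),
                     ("VR 'retinal'", vr_next)]:
        print(f"{label:<13} {v / 1e9:.2f} Gpx/s")    # 0.50 / 0.23 / 1.99
    [/CODE]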
    (0)
