
  1. #1
    Player
    Tiraelina's Avatar
    Join Date
    Mar 2011
    Posts
    476
    Character
    Tiraelina Kyara
    World
    Sargatanas
    Main Class
    Pugilist Lv 70
    Quote Originally Posted by SilvertearRen View Post
    Then, be a dear and explain why ArmA II, which is data-intensive and set to maximum graphics settings, runs my CPU and GPU at 50% load and runs great, while FFXIV runs my GPU at 99% load and causes my video card to reach borderline overheating temperatures unless I set the fan manually to full speed. Both of these games are DirectX 9 games.

    It is obvious here that the client was not fully optimized for the PC platform, and is causing these performance issues.

    You're trying to make a straw-man argument out of insufficient cooling and poorly-built graphics cards. Programmers know that even the best hardware can be destroyed by poorly written software. A game that is actually using the CPU/GPU may be functional, but there's a marked difference between using the CPU/GPU efficiently and using it carelessly.

    Edit: My point will be proven when Guild Wars 2 comes out and uses my Radeon 5870's GPU workload up to only 75%.
    No, it really is only poorly built GPUs that burn out, or cards being stress-tested with FurMark. Everything from VSync to having AA/AF enabled will change the load, and a CPU/GPU bottleneck will change it as well.

    All you are doing is making a big fuss over nothing. My old 4850s ran at 95-100°C under load all the time and are still working after several years. Want to guess the thermal limit before hardware failure? Around 120°C. A 5870 will throttle itself back if it even gets near the danger zone of 100-105°C, which it won't do unless you have very bad cooling, an aggressive overclock, or FurMark running.

    The Witcher 2 is DX9 and uses 99% of my own 5870; that doesn't mean its lifespan is being shortened. It's working as designed.
    Last edited by Tiraelina; 05-28-2011 at 04:35 AM.

  2. #2
    Player
    Elkwood's Avatar
    Join Date
    Mar 2011
    Posts
    591
    Character
    Elkwood Davidson
    World
    Excalibur
    Main Class
    Thaumaturge Lv 50
    Quote Originally Posted by Tiraelina View Post
    No, it really is only poorly built GPUs that burn out, or cards being stress-tested with FurMark. Everything from VSync to having AA/AF enabled will change the load, and a CPU/GPU bottleneck will change it as well.

    All you are doing is making a big fuss over nothing. My old 4850s ran at 95-100°C under load all the time and are still working after several years. Want to guess the thermal limit before hardware failure? Around 120°C. A 5870 will throttle itself back if it even gets near the danger zone of 100-105°C, which it won't do unless you have very bad cooling, an aggressive overclock, or FurMark running.

    The Witcher 2 is DX9 and uses 99% of my own 5870; that doesn't mean its lifespan is being shortened. It's working as designed.
    Except SE has come out and said that the game is not running right on our PCs because it is not optimized, and that's why they're heating up. What part of them admitting the game is potentially doing damage was not understood?