Resolution mostly stresses the TMUs, but this game usually bottlenecks at vertex transformation, so lowering the resolution doesn't always mean the game will run faster. Overclocking the GPU can give a few extra frames, though.
Bring up your task manager and a GPU monitor, see what they're doing, and post some screenshots, because I've well exceeded 44 fps on high settings at 1920x1200 and my CPU never went past 20-30%. I have a feeling either something is wrong with your processor or you have some other limiting factor; there's no way the CPU gets bogged down like that.
I beg to differ. As far as I can tell, the slowdowns when there are a lot of characters on screen aren't really due to GPU/CPU limitations (or even the HDD, for that matter, if you're not moving).
Case in point: right now I'm hovering between 49-51 fps at 1920x1200 in windowed mode, standing in front of a very populated area in Ul'dah. My GPU load? 60-65%, and CPU load is around 25%, with no core exceeding 50%. My guess is that there are some network-related slowdowns caused by the brilliant design that is the FFXIV server/engine.
My system is an i7 920 and a GTX 580, which is decidedly less capable than a 2600K.
http://imageshack.us/f/38/semttulorgk.jpg/
There you go, HT off so you can see it better. And there's nothing wrong with this 2600K; it's actually my third, since I do a lot of OC/benchmarking, and it took me three CPUs to find a good bin. It does 5.6 GHz with two cores on air.
@ zorlin
I think we're just disagreeing on what's bogging the game down. You pin it on a CPU bottleneck caused by the game engine; I say it's network related, as the slowdowns for me scale directly with how many other players are on screen, as opposed to what's actually rendered on screen, and have zero correlation with my CPU/GPU load.
"as the slowdowns for me scale directly with how many other players are on screen" You know player models have a lot more polys than the world itself, right? Honestly, I really doubt the network has anything to do with FPS; it makes no sense at all, especially since if I OC to 4.5 GHz I get 60 fps in the same place. It's pretty clear to me that one of FFXIV's threads taxes one of your cores, making single-core speed important here, since that thread can't be spread across multiple cores due to engine issues.
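For what it's worth, the pattern being described (one core pegged while the rest sit low) is easy to codify. This is a minimal sketch; the 80% and 50% thresholds are illustrative picks of mine, not anything measured from the game:

```python
def single_thread_bound(per_core_loads, hot=80.0, quiet=50.0):
    """Heuristic: True if one core is pegged while all others idle along,
    the classic signature of a single-threaded engine bottleneck."""
    loads = sorted(per_core_loads, reverse=True)
    return loads[0] >= hot and all(x < quiet for x in loads[1:])

# The Natalan example from this thread: one core in the 90s, others in the 30s.
print(single_thread_bound([92, 35, 33, 30]))  # True
# The Ul'dah reading with the hottest core at only 60-65% would not trip it:
print(single_thread_bound([65, 40, 25, 20]))  # False
```

Feed it whatever per-core percentages your monitoring tool of choice reports.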
It's just a guess, as high server load in EVE Online can turn the client into a slideshow (though admittedly I don't remember if the game was still running at normal fps while the screen was frozen due to server lag). I just can't see how it's CPU-bound when the load barely reaches 60% on any core, with overall utilization in the low-to-mid 20s. (My 920 is at 4.0 GHz.)
Now that I think about it, I wonder if it's something specific to the cities that's causing the slowdowns. Right now I'm out here at Halatali, with more characters on screen than I had in Ul'dah during a dust storm, and it's at a smooth 60 fps.
Last time I went to Natalan with around 40 people, quite a lot of enemies around us, and windy weather, my fps came down to the low 20s while my GPU load was hitting 60-70%. That's when I decided to look at the core graphs for the first time: one of them was hitting the 90s while the others were in the 30s. Then I ran those tests in Ul'dah. By the way, this doesn't happen in Gridania or Limsa, since there's pretty much no one there and that core barely hits 50% load.
Oddly, something about Gridania always gives me the worst FPS. Character models don't make much of a difference in GPU load, if any at all, just HDD, but since giving FFXIV a dedicated SSD that problem has gone away. I don't hit the 60 fps cap; I tend to be in the 40s and 50s, meaning I'm hitting some limitation; however, it's never been close to the CPU, always the GPU or HDD.
Also, you have only FFXIV running; I have FFXIV at higher settings and higher resolution, three monitors, a million Chrome tabs, video playback, Excel, and a dozen other things, and yet my CPU usage is lower than yours... And I don't overclock anything. Hmm...
What are you using to monitor your GPU?
I use MSI Afterburner and GPU-Z.
i7 2600k Stock: http://imageshack.us/f/88/semttulo2kg.jpg/
i7 2600k OC no HT: http://imageshack.us/f/692/finalula.jpg/
As you can see, that core isn't taxed anymore if I OC the CPU, and therefore I get capped FPS.
Hope this helps make my point that the stock 2600K bottlenecks FFXIV because the engine sucks.
The "pegged" core was only at 60-70%, though; it still had nearly a third of its processing capacity free. I wouldn't call that a limiting factor.
Even though it wasn't at 100%, it was probably using all its cache. It's like when I export video: the cores don't go to 100%, they just hang at 80-90%. Actually, I've never seen 100% load unless I'm using Prime95 or Linpack to stress the cores to the max.
But the point is, if you OC the CPU and the fps goes up, that means there was a bottleneck.
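That reasoning can be sanity-checked with simple arithmetic. Assuming fps scales roughly linearly with core clock when the game is CPU-bound (a simplification; real workloads rarely scale perfectly):

```python
def expected_fps(base_fps, base_clock_ghz, new_clock_ghz):
    """Upper-bound estimate for a perfectly CPU-bound game:
    fps scales linearly with core clock."""
    return base_fps * new_clock_ghz / base_clock_ghz

# The ~44 fps figure mentioned earlier, at the 2600K's 3.4 GHz stock clock,
# projected to a 4.5 GHz overclock:
print(round(expected_fps(44, 3.4, 4.5), 1))  # 58.2
```

That prediction lands in the same ballpark as the reported 60 fps cap at 4.5 GHz, which is consistent with a CPU-side bottleneck at stock clocks.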
lol you guys are really technical lol
Anyway, the next question is this: Nvidia or Radeon?
Good point, though this means FFXIV is REALLY inefficient at utilizing resources... even more so than I thought.
@ neorei
The GTX 580 is still the king as far as single-GPU, single-card setups go, but other than that, both Nvidia and AMD have competitive offerings across all price ranges. It really comes down to your budget and what deals you can find, IMO.
AMD = more bang for the buck
NVIDIA = best high-end cards, therefore a lot more expensive
What other games do you play besides FFXIV? Because if it's just for FFXIV, go with an AMD 6950 or an Nvidia GTX 560 Ti; both perform about the same.
If you go with the 560, make sure it's the Ti version; the non-Ti is a lot weaker. Also, if you can wait a month, the AMD 7xxx series is being released in January.
Yeah, just wait for the 7000 generation from AMD. It's a gamble, but definitely better than the 6000 generation.
By the way, stop talking about GPU usage percentages... no software can show that accurately anyway. But if you have a debugging tool for the hardware, go ahead ^_^
I play DC Universe Online and WoW (though you can play WoW at ultra with my graphics card), but FFXIV is my main game on the computer for sure; I play FPS games on consoles.
I'm also interested in Rift and the new Star Wars game.
But again, thank you all for the suggestions. I'm getting a lot of knowledge (especially from the two guys arguing lol).
Hands down, Nvidia has AMD beat card for card. However, the 6950 is less than half the cost of the 580, and two 6950s outperform the 580, so for less than the cost of a 580 you can exceed its performance. If you needed more performance than two 6950s and only had two card slots, your only option would be two 580s, but seldom would someone need that. So 580s are pretty much left to the perfectionists, and those who don't need the best of the best would most likely go with the 6950 if cost were a factor.
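The cost argument above can be made concrete as performance per dollar. All of the prices and performance indices below are placeholder assumptions of mine; the thread only claims that a 6950 costs less than half a 580 and that two 6950s beat one:

```python
# Hypothetical prices (USD) and an arbitrary relative-performance index,
# chosen only to match the thread's qualitative claims.
cards = {
    "GTX 580":    {"price": 500, "perf": 100},
    "HD 6950":    {"price": 240, "perf": 70},
    "2x HD 6950": {"price": 480, "perf": 115},  # assumes decent CrossFire scaling
}

for name, c in cards.items():
    print(f"{name}: {c['perf'] / c['price']:.3f} perf per dollar")
```

Under these assumptions the single 6950 wins on value, and the pair still edges out the single 580 on both total performance and perf per dollar.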
Yeah, I've been hearing a lot of praise for the 6850; online sources say it's more bang for your buck.
Ya, that or a 6870 is all I imagine I'd ever need for current games.
This game relies heavily on the processor.
If you have the cash to switch to Intel, the i5 will do you some justice.
You can make the switch for as little as 300 dollars. I would spend possibly 400-500, adding an SSD and 8 gigs of RAM.
8 gigs is overkill, but it's so cheap today; why the hell not?
I had the 6870 before the dual 580s. The difference is like night and day. The 6870 can play on medium-high; the 580s run everything maxed, 16x AA, no AO or DoF. However, I freaking LOVE eye candy, so it's a bit of a factor for me. I do all the stuff, but I love it to look freaking amazing while I'm doing it. A lot of people get eye candy at first, then it fades with content. Not me; I love both equally.
I don't consider replacing that Phenom a good choice at the moment, since it's still a pretty decent processor, although if you can sell it for a decent price, you could get an i5 2500K + Z68 motherboard + 6950 for about $500. The 2500K easily OCs to 4.5 GHz with a cheap cooler, for example a Cooler Master Hyper 212+ for about $30. That rig would likely max any of those games. And don't forget most 6950s can be modded into 6970s, unlocking extra shaders.
Any quad core can play this game maxed with a decent graphics card, but for a solid 60 fps anywhere, yeah, you do need a high-end CPU.
That I can agree with. Hoping that won't be the case come 2.0, as it's just ridiculous that a 2600K can't get 60 fps maxed out anywhere without OC'ing.
Another question!!! What is the difference between GDDR3 and GDDR5?
The same difference as between DDR1, DDR2, and DDR3 memory in your computer: one is faster than the other.
At the moment there's really no question: Sandy Bridge trounces AMD; even the new Bulldozers get easily owned. But for gaming, Phenoms are just fine. My brother has a Phenom X4 965; we both do a lot of video editing, and you have no idea how fast the 2600K is compared to his Phenom, especially if you OC. Sometimes he even asks me to render video for him so it doesn't take all day. The 2500K is just like the 2600K without HT; actually, it's easier to get higher OCs on the 2500K, since HT is vcore-hungry and I always end up disabling it to achieve max OC.
Oh, so DDR = the RAM of a CPU?
DDR is RAM memory, and so is GDDR, but GDDR was built for graphics cards and is a lot faster. DDR is also used in graphics cards, but these days only budget cards have DDR memory. On the other side, CPUs can only use DDR, not GDDR.
For current-generation cards the best RAM is GDDR5, so if you're looking for a card, get one that has it.
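The practical difference shows up as memory bandwidth: GDDR5 moves four data transfers per clock where GDDR3 moves two, so at similar clocks and bus widths it roughly doubles throughput. A quick back-of-the-envelope, using the GTX 580's published 4008 MT/s effective rate on its 384-bit bus, plus a generic GDDR3 configuration I picked for comparison:

```python
def bandwidth_gbps(effective_mtps, bus_width_bits):
    """Peak memory bandwidth in GB/s: transfers per second times bytes per transfer."""
    return effective_mtps * 1e6 * (bus_width_bits / 8) / 1e9

# GTX 580: GDDR5 at 4008 MT/s effective on a 384-bit bus (published specs).
print(round(bandwidth_gbps(4008, 384), 1))  # 192.4

# A hypothetical GDDR3 card: 2000 MT/s effective on a 256-bit bus.
print(round(bandwidth_gbps(2000, 256), 1))  # 64.0
```

Roughly a 3x gap in this example, which is why nobody recommends GDDR3 cards for gaming anymore.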
To the OP -
You can see why looking for advice on the Internet about computer parts can be a pain. People mostly speak from personal experience with the setup they have, because most of us aren't rich enough to own multiple setups. For example, my current setup is an i7 965 Nehalem (one I was lucky enough to get used!) with an EVGA GTX 460. Clocks are 800/1600/1850 (Core, Shader, Memory) with 1050 mV core voltage. I cannot get this game to run at a fluid 60 FPS in 1280 x 720 windowed mode. General Drawing Quality 8, Background Drawing Quality 3, highest Shadow Detail, no AA, AO off, Depth of Field off, standard Texture and Filtering Quality.
From your budget of $500, I think you have two options, as I think running a dual-SLI GTX 580 setup is overkill for this game. That's already $1000, which I think is absolutely insane unless you're going to use that setup for some other crazy combo, like running a game at 2560 x 1440 with everything maxed out. You can either go the single GTX 580 route, which will probably allow you to run this game at 60 FPS if you turn down a few options, or a dual GTX 560 Ti setup if there's room on your motherboard for it. I've heard Fermi-based cards scale well (meaning if you run the pair in SLI, the second card will actually be utilized correctly). Both will cost you about $500. A single GTX 560 Ti will be sufficient for most other games, but if you really want to run this game at max settings, I think that's the route you should take.
If you only run in windowed mode like me, you'll have to go the single GPU route, as Crossfire and SLI do not work in windowed mode.
Yes, actually, this is honestly helping me out, seeing other points of view on how people set up their computers. I may go for a GTX 580 right now, but I'm still looking at a Radeon because I have never owned one, so that is still an option. It also helps with overclocking and understanding how CPU optimization can help me play this game better.
I really do want to push for 60 FPS at least on high, so seeing people's setups is helping me understand my own.
It really boils down to: what's your budget? Keep in mind that after 2.0 it won't take as much to max the game at 60 fps.