People should reread this just to be clear: the frame limiter actually is scaled based on your monitor's maximum refresh rate. If you look closer at the existing options, there's a 1/1 rate, a 1/2 rate, and a 1/4 rate of your monitor's default rate.
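To put numbers on that, here's a minimal sketch (my own illustration, not game code; the function name and the 144Hz example are assumptions) of what those limiter settings work out to:

```python
# Minimal sketch of the limiter settings described above: each option
# caps FPS at a fixed fraction of the monitor's maximum refresh rate.
# The function name and structure are illustrative, not game code.

def limiter_options(max_refresh_hz: float) -> dict:
    return {
        "1/1 rate": max_refresh_hz,      # full refresh rate
        "1/2 rate": max_refresh_hz / 2,  # half refresh rate
        "1/4 rate": max_refresh_hz / 4,  # quarter refresh rate
    }

# e.g. on a 144Hz monitor:
print(limiter_options(144.0))
# {'1/1 rate': 144.0, '1/2 rate': 72.0, '1/4 rate': 36.0}
```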
You would hope!
I'm probably gonna get banned for posting this, but I am prepared. Someone did a study showing that while the GCD is time-based, the queued action does not execute until the next frame is available.
https://www.reddit.com/r/ffxiv/comme...ort=confidence
Essentially, the fewer frames you get, the less DPS you will do over a long fight. While that is probably negligible between 90 and 60 FPS, what isn't negligible is how having the in-game frame limiter on at all causes my 1080 Ti, which consistently puts out 100-150 frames uncapped, to stutter at a 60 cap and even randomly drop to 30-40. How is that possible if I never go below 100 without the limiter? I don't know.

Then imagine getting a micro-stutter mid-mudra. Mudras do not queue up the same way GCDs do, so you end up casting the wrong ninjutsu or having to babysit the mudra casts, which takes longer than frame-perfect execution and is still a DPS loss. I do not want to use the in-game frame limiter no matter what the limit is, even if they had it at 300 FPS. I have G-Sync for a reason, so I don't have to rely on crappy software vsync technology.
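As a back-of-the-envelope illustration of the study's claim, here's a short sketch (the 2.5s GCD, the 10-minute fight, and the perfectly even frame pacing are all my assumptions, not datamined values) of the worst-case time lost when each queued action has to wait for the next rendered frame:

```python
# Sketch of the claim above: the GCD timer is time-based, but the queued
# action can only fire on the next rendered frame, so each GCD can slip
# by up to one frame. All numbers here are illustrative assumptions.

GCD_S = 2.5      # nominal recast time in seconds
FIGHT_S = 600.0  # 10-minute fight
GCDS = int(FIGHT_S / GCD_S)  # ~240 GCDs per fight

for fps in (30, 60, 90, 144):
    frame_ms = 1000.0 / fps
    # Worst case: the recast finishes just after a frame renders, so the
    # queued action waits almost one full frame before it executes.
    worst_total_s = GCDS * frame_ms / 1000.0
    print(f"{fps:>3} fps: up to {frame_ms:5.1f} ms lost per GCD, "
          f"~{worst_total_s:4.1f} s over a 10-minute fight")
```

Under those assumptions the worst case is roughly 8 seconds of lost uptime per fight at 30 FPS versus under 2 seconds at 144, which is why the loss only shows up over a long, long time.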
My point is that there are those of us who want to play the game at the absolute highest performance possible and will seek out every single nook and cranny to optimize our gameplay. It just happens that graphical performance (among many, many other things) is one of those avenues of min-maxing, so having this limit removes that facet of optimization for us. Is 90 still acceptable, and can players still perform extremely well at it? Yes. Should they be limited if their hardware can handle more? I don't think so. What hardware the user has is their choice, and so should be their FPS limit, if any at all.
I'm running at 60Hz, so this isn't applicable to me, but I understand the sentiment and I share your rage. If this is due to something tied to physics, then let those of us who want to exceed 90 FPS deal with that issue. If it's due to the botting issue whereby bots can use high framerates to somehow get OOB, then git gud and write tighter code, hire more GMs with all that sub money we gives ya, and deal wit the issue proper like.
Don't touch me there
It had better still run at a 1/1 ratio of your monitor's native refresh rate, as someone stated above. I understand some of you don't have 144Hz+ monitors or have never played at anything above 60Hz, and that's fine, but many of us have and stay at those refresh rates for a reason. If it doesn't apply to you, I don't even see the reason for you posting here; this is an issue for those of us who do play at 144Hz+, and a serious one.
I don't know how to feel about this. I play the game in 4K at full settings with two cards in SLI. I do notice some zones give me weirdly high framerates and really push my fans, but we will see what this does. I honestly don't see much difference in my game when I play above 100, so this may not be an issue, but again, we shall see.
It's not 100% garbage when it works fine for most people...
This change is clearly to stop unintended things (exploits) from happening. It's worth making a small % of players have to "suffer" by playing at 90 FPS. No human can see the difference between 90 and 144hz anyway.
The real problem with this is how it would affect G-Sync/FreeSync.
That's literally not how "code" works. Some things are due to the engine, and this game was originally made in 2010; there's only so much that can be "fixed", especially without breaking other things. This is a simple fix that hurts like 5% of players.
That one's still being researched. There are parts of the eye which can't detect specific things above 60Hz, and a chunk where your brain can't process any differences past 90 even if they're visible. There are studies that say the optimal frequency of objects for rendering is somewhere between 7 and 24Hz, and that while you CAN detect significantly higher, you can't process them any differently, while others say 90Hz is the sweet spot and anything past that has a significant drop-off. Real life basically has >5*10^44 "fps", with only our eyes (changes at >200fps are still noticeable) being the limit.
We still need a lot of information on that topic.
People trying to defend the capped FPS because they don't personally notice the difference are sort of funny and sad.