Just use DLSSTweaks or NvidiaInspector and force DLAA + preset C or D. Then you will never have blurry images.
Cheers
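For anyone who wants to try this, here is roughly what the relevant overrides look like in DLSSTweaks' dlsstweaks.ini. I'm going from memory, so treat the exact section and key names as approximate - they can differ between DLSSTweaks versions, and the comments in the ini that ships with the tool document the current ones:

```ini
; Render at native resolution and use DLSS purely as anti-aliasing (DLAA)
[DLSS]
ForceDLAA = true

; Pin the render preset; C and D are the ones usually recommended for
; reducing blur and ghosting (key names may vary between versions)
[DLSSPresets]
DLAA = C
Quality = C
Balanced = C
Performance = C
UltraPerformance = C
```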
DLSS is not a blanket "everything is improved" setting. It sacrifices visual fidelity for better performance, trying to give people with lower-spec hardware an image they otherwise wouldn't have. It just happens to sacrifice much less visual fidelity than its contemporaries, such as FSR. If you have a low-end PC, you may see a better image with DLSS on because it is trying to compensate for your bad hardware. But if you have good hardware, it may produce a worse image because your hardware is inherently better.
With DLSS always on, it will likely give you a worse image if your hardware can produce a better result on its own. With DLSS only activating to help compensate, you may be finding that it isn't activating to compensate at all. So I'm in agreement that "always on" makes it look worse on your rig because your rig is capable of producing better results without it.
That is incorrect. The latency increases from DLSS/FSR apply only to frame generation. FF14 is not adding frame generation, it's adding DLSS/FSR1 upscaling. With upscaling, latency has the same relationship to framerate as any other framerate change: as long as upscaling increases your framerate, it decreases your latency.

Quote:
"The nvidia models are better, but they add 1-2 frames of latency at the GPU level, whereas FSR adds the latency at the CPU level. So one 'looks laggier' and the other 'feels laggy'. Fine for people with lower-spec GPUs if that means they get 30% more framerate without lowering the resolution. But it will be distracting every time the image has high motion."

And the "looks laggier/feels laggier" thing is complete nonsense.
This is the issue I'm running into. My rig blows this game out of the water and DLSS makes the game look blurry, so I would prefer for it to just be off and run the game at my native resolution. Is the only solution to pick the AMD option and set it to 100% resolution?
DLSS's AI image reconstruction intentionally attempts to resolve detail that would otherwise be lost to aliasing. In a significant number of games with DLSS, it adds back detail lost to aliasing, as opposed to TAA, which only blurs the final image using information from the previous frame.
Even if you are running at 4K, Quality mode for DLSS should result in an overall better final image than the aliased look of native, unless you *prefer* the sharpness of native, or perhaps you are very sensitive to ghosting when your framerate is low. Geometry at native will be very clean, yes, but foliage is a sore spot that will not be resolved nearly as well, even with the new TSCMAA addition. TAA and/or DLSS handles the foliage, so there is no aliasing to be found.
Nope, just select the DLSS option that only activates when your framerate drops below 30/60. If your framerate stays above that target, DLSS won't activate and you'll run at native using the new AA options they added. Another option is to install DLSSTweaks and use it to force DLAA. That also runs the game at native resolution, but uses DLAA for anti-aliasing instead of the new options - though DLAA is more demanding to run, so it will incur a slight performance hit.
Ideally they will add more DLSS options for the full release, because at present they are using the equivalent of the Performance preset for everyone, which halves your native resolution before upscaling. So if you're running at 1440p it is lowering it all the way down to 720p. A lot of people have graphics cards capable of running this game on the Quality preset (or even Ultra Quality) while still getting good performance, so allowing users to select which preset they wish to use would probably solve at least some of the issues people are having.
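To illustrate the preset difference with rough numbers (the per-axis scale factors below are the commonly cited DLSS ones; Ultra Quality is rarely exposed, so take that one in particular as approximate):

```python
# Per-axis render scale for each DLSS mode (approximate, commonly cited values)
PRESET_SCALE = {
    "DLAA":              1.0,
    "Ultra Quality":     0.77,   # rarely exposed; approximate
    "Quality":           2 / 3,
    "Balanced":          0.58,
    "Performance":       0.50,
    "Ultra Performance": 1 / 3,
}

def render_resolution(width: int, height: int, preset: str) -> tuple[int, int]:
    """Internal resolution the game renders at before DLSS upscales to native."""
    scale = PRESET_SCALE[preset]
    return round(width * scale), round(height * scale)

# What the current fixed 50% scale does at 1440p:
print(render_resolution(2560, 1440, "Performance"))  # (1280, 720)
# What a selectable Quality preset would render at instead:
print(render_resolution(2560, 1440, "Quality"))      # (1707, 960)
```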
It's weird.
Sadly no. I don't have a way to force a PNG upload on any image host I'm willing to sign up for.
FSR on an Nvidia 4090 is clearly the better option of the two, even with the JPG artifacting. The rocks in particular are fairly immune to the kind of compression JPG inflicts on hair or grass, and they are the best place to see just how bad DLSS is for people with better video cards.
This bit of the rocks:
[image: crop of the rocks from the comparison screenshots]