Quote Originally Posted by Naraku_Diabolos View Post
I will mention that now with DirectX 12 and Windows 10, I did the benchmark for 1.0 and got a score slightly over 6,000. I made a thread on it awhile ago saying that 1.0 needed Windows 10/DirectX 12 to really shine.
Not this silly stuff again.

1. Windows 10 has nothing to do with it
2. DirectX 12 has nothing to do with it
3. There ARE changes to multithreaded display-list rendering in DirectX 12... but taking advantage of them requires the game itself to be written against DirectX 12.

Any perceived improvement is tiny, and mostly the result of less cruft running in the background on the updated OS. As a second point of reference, display drivers are rebuilt for every version of Windows, so nVidia, AMD and the rest spend a lot of time producing WHQL drivers for each OS release. Those drivers are otherwise functionally identical to the Win8.1 and Win7 ones; what changes is how AMD and nVidia choose to optimize them on each OS. A Windows XP-era driver had to be optimized against a Pentium III, a Windows 7 driver against a Core 2, Win 8/10 drivers against the Core i3/i5/i7 line, and so forth. When they change those optimization paths you get some small bumps in driver performance, but you're not going to get anything phenomenal.

It IS true that there were SOME higher-resolution textures in V1.0. That doesn't mean higher-resolution textures = better graphics. Higher-polygon models mean better graphics, and polygon count is what's limited by video card performance. Higher-resolution textures are largely irrelevant when the average player doesn't play at maximum zoom. It's more important for the game to perform adequately with 100 players on screen than for 10 characters in a cutscene to be detailed enough that you can see their eyelashes.