I'm not sure what Norack is saying. Instead I looked up 64 bit on Wikipedia.
Cool. Do you have any benchmarks showing this huge difference in game performance between 32-bit and 64-bit games?
I mean, if it's as gigantic as you make it sound, surely there must be some gaming enthusiasts out there making use of it. No one in their right mind would say no to a potential 2x performance boost.
Maybe you are right, technically. I don't know enough about processor microarchitectures to say you're dead wrong, but I can't understand how this wouldn't be on every hardcore PC gamer's lips if they could really squeeze that much extra performance out of their games in 64-bit mode. That's assuming they're actually CPU-limited in the first place, which is extremely unlikely in this day and age unless graphics settings are turned down to a minimum.
So if you really think that moving a game to 64-bit is going to increase actual performance, and not just theoretical performance under certain circumstances, I'd say some unbiased benchmarks are needed.
I also really think something is wrong with your computer's setup if FFXIV never uses more than 20% of your GPU's power. I'm pretty sure you could turn on a lot more antialiasing, ambient occlusion, and various other effects if that were the case. My 460 (and the same goes for people I know with 560s) runs at 95% load or more most of the time, even with my old, inferior Q9450.
Last edited by Mirage; 07-09-2012 at 01:05 PM.

Nope, but I'm sure some people have benchmarked it. I'm only talking from motherboard-to-CPU architecture theory.
A quick one-second Google search turned up this, out of millions of results:
http://www.iinuu.eu/en/it-guru/windo...ance-benchmark
Remember, it's theoretical, but the results do show significant improvement. Frankly I'm surprised 64-bit improved as much as it did, because if you look at the start of the test, they used hardware that an x86 processor can fully take advantage of, and when they switched over and tested on x64 they didn't change the hardware at all. I don't really see the point of benchmarking x86 versus x64 on identical hardware. For example, why not upgrade the RAM? I can understand why their machine only had 4 GB of RAM while running x86, since that's all it can utilize, but x64 can address up to 192 GB depending on which version you have. So why not upgrade the RAM? It's like saying "on x86 we're using 100% of our RAM, so let's switch to x64 with the exact same RAM and see if we can use more than 4 GB when there's only 4 GB in there.............. by way of magic?" XD That's a bit overboard, but hopefully you understand what I mean.
EDIT:
------------------------------------------------------------------------------------------------------------------------
I have everything in FFXIV maxed out and it's only using 20%, which is why I'm assuming it's bottlenecked. Hopefully the 2.0 architectural changes let it utilize more of my GPU. Right now I think the biggest difference between x86 and x64 in this game would just be RAM usage. I'd love to see how much more fun this game could be going from the 4 GB of RAM it uses now to suddenly running on 36 GB. Lol, I know that's never going to happen and it would never need that much, but........ son of a, just thinking about it XD.
Last edited by Norack; 07-09-2012 at 01:13 PM.
That benchmark is showing negative numbers in several of the tests, and doesn't even mention gaming performance.
The few tests that involved GPU usage showed a small decrease in performance in some cases, no change in others, and a slight increase in a few. As you can see, the performance increase for the GPGPU tests was under 5% across the board. That's not very impressive, really.
A few specific tests that heavily utilized a small number of the new 64-bit registers on 64-bit CPUs saw a huge increase. I don't see any tests there that replicate the type of workload a game would commonly have.
Last edited by Mirage; 07-09-2012 at 01:19 PM.

I know, like I said, I don't really see the point in benchmarking that stuff when you keep the same hardware. It's pointless, but apparently the person who wrote that article wanted to see something. Say the x86 setup can already use, I don't know, 98% of its resources; I don't see the point of upgrading just to chase the remaining 2% as your ROI. I don't know the actual numbers there, but with computers a good rule of thumb is that you'll never get more than 70% of what your hardware is theoretically capable of. At least that's what my colleagues and professors always go by.

Look no further than here, Crysis 64-bit vs Crysis 32-bit: http://blog.tune-up.com/windows-insi...e-performance/
You have to scroll down a bit.
You really can't rely on benchmarks like this to tell you much. For example, we know nothing about the host system, or how well the software has been tuned to use the additional registers x64 provides, and so forth. A benchmark like this may look very different in a year, running different software. Another comparison would be QuickTime 7 versus QuickTime X on Mac OS X: QuickTime X is *much* faster than QuickTime 7, but how much of that speed increase is due to fine-tuning for the 64-bit architecture is unknown.
Ideally, one should run 32-bit software on a 32-bit OS. This will always yield better performance than running 32-bit software within the confines of a 64-bit kernel, or 64-bit software on top of a 32-bit kernel. It may not be as noticeable as one would expect, but there can be a considerable performance decrease (5~10%, perhaps? I pulled that figure out of thin air). Of course, this is often impractical for today's desktops and laptops, so don't take it as a recommendation.
Last edited by Laraul; 07-19-2012 at 10:41 AM.
Thank you.



To the OP: a new GPU will give the best "bang for the buck".
Last edited by Judge_Xero; 07-10-2012 at 02:21 AM.
"I don't always drink beer, but when I do, it's often."
Temp Forum Ban - July 7th 2016 *** I promise to never call out scrub players again due to it causing a toxic community



