You do realize developers don't add an FPS cap just to fuck with you, right?
I see it more as a discussion than an argument. We have different preferences in what we like and look for.
But this puts my opinion into better words than I did, and explains it better. Thanks for the link.
I agree with this. We desperately need optimization, SE. I have dual 580s and I shouldn't have a problem rendering the game at all, but I need to tone things down so my GPUs don't go crazy. FFXI was limited because of the PS2's specs, which was a bummer, but it kept the game running smooth.
Yeah, it goes into much more detail than what I was hoping to find, actually. In short: FPS = digital; eyeball = analog. 24 fps is plenty with motion blur; 100 or more could be needed without it if you want the game to be viewed 'reality smooth'. But there are other factors as well; you may need a better mouse, for example.
I've cut out the parts that I don't have a direct disagreement with. It was not to make your post seem less coherent.
You are, however, missing the point here. I'm not talking about videos that are recorded at 24 (or 30) FPS and then "upscaled" to 60 fps; I'm talking about videos that are actually recorded at 60 fps and have stayed at 60 fps all the way from the camera to your display. This is what looks really good.
The amount of "frames" you can process is not something that is limited by your eyes. It depends on the type of data you need to process as well as how good you are at processing visual data. This is a skill that can, to a certain extent, be trained. A prime example is a Formula 1 driver, who typically can process many times more visual information per second than most people can. If you fed 15 frames per second to an F1 driver, I can assure you that this driver would be missing a *ton* of information that his (or her) brain was accustomed to having available for processing.
Because eyes do not work in clean, separate frames, those faint visual changes that an extremely quick movement would give you can also be processed as visual data in your brain. These faint registrations that would not normally qualify as a "whole frame" can give you enough information for your brain to decide if further investigation of the object is worth doing. The eye can then "lock on" to the object for just enough time to properly identify it.
If this rapidly moving object was recorded with a 24fps camera, the entire object's movement would be just a blurred mess, and no amount of movement tracking done by your eyes would be able to stabilize it and get a clear image of what you're looking at. If it was captured with let's say something really crazy like 200 fps and displayed to you at that framerate, the same object would still have been a blurry mess as it was moving around, but your eyes could lock on to it and give you a clear image just like you would be able to do in real life, while everything that wasn't moving would become a blurry mess instead.
Your eyes do this all the time in real life. They stabilize everything that moves that you want to detect more accurately. Your eyes can't do this if you're watching a low-fps recording of the real world, where motion blur distorts the image data permanently.
Playing action games at high framerates lets you do the same thing. Of course, it is not as important to the actual gameplay of FF14, because it's relatively slow paced and inhibited by latency, but you can still notice a much smoother experience if you move the camera around fast, or if another character runs past you really close to your camera. At a low framerate, an object moving past you close to the camera would be harder for the eyes to track if the entire run-by was just a total of 10 frames (1/3 of a second at 30fps), rather than 40 (1/3 of a second at 120fps). The eyes (or rather, your brain) would have a lot more information to detect speed and direction with, and therefore be able to more easily lock on to the moving object to stabilize it.
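The run-by numbers above are just duration times framerate; a quick sketch (the 1/3-second pass-by is the example figure from this post, not a measured value):

```python
def frames_visible(duration_s, fps):
    """Whole frames an object stays on screen during a pass-by."""
    return round(duration_s * fps)

# An object crossing close to the camera in 1/3 of a second:
assert frames_visible(1/3, 30) == 10    # only 10 frames to track it at 30 fps
assert frames_visible(1/3, 120) == 40   # 40 frames to track it at 120 fps
```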
-edit-
Changed values in examples to reflect more practical situations.
Also, don't worry. I know exactly which specifications to look for when I buy my visual equipment. As you might have noticed, I have a more than average interest in how these things work :p.
I see where you are coming from now.
Using a high FPS camera to record is a completely different scenario, and in my experience this is only done so you can slow a fast-moving object down (i.e. record at 200FPS so you can slow it down to see detail of movement).
For commercial purposes, high FPS cameras are a waste. Recording an IMAX movie at 60FPS would be a monumental waste of resources. Most cameras are in the 25-30 FPS range.
For pro gaming, especially in first person shooters, I understand the need for high FPS, but for most people this is not needed. We can't take advantage of it; our brains can't process that fast to make flash decisions.
Our brains are also capable of filling in gaps, much like we can read a sentence even if all the letters aren't exactly in the right place. Like many people have said, latency in games like this is more important than FPS.
Of course, I also understand the argument that many of us want our games to look really good. And FFXIV looks really good, so seeing frames drop can be... disappointing.
Things in FFXIV aren't really moving fast enough, I think, for 60+ frames to be needed. I don't think much above 30 is needed when looking at the entire population, especially seeing as there are a lot of people whose PCs aren't capable of this. It's better to optimize for a lower framerate and make the game more enjoyable to a wider audience.
Actually, a few movie directors really want to film at a higher framerate than what they are working with now. The Hobbit is actually filmed at 48fps, which I guess is in order to be able to re-use existing 3D cinema equipment, making it much cheaper than having to replace everything with equipment able to do, for example, 50 or 60 fps.
Using 48FPS, you can effectively just feed the video directly into existing 3D projectors, because 24FPS 3D movies are in fact 48FPS as far as the projector is concerned. You just have to show the same "angle" to both eyes, and then we can take off our 3D glasses and view the movie at 48FPS with no playback equipment change needed, except perhaps a firmware update.
At the same time, the 48fps recording can just be down-sampled to 24FPS in post-processing in order to create a video compatible with non-3D cinemas.
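Conceptually, that down-sample is just keeping every other frame; a toy sketch with frame indices standing in for actual images (real post-production pipelines may blend frames instead of dropping them):

```python
def downsample_48_to_24(frames):
    """Drop every other frame to turn a 48 fps master into 24 fps."""
    return frames[::2]

one_second_master = list(range(48))   # 48 frames = one second at 48 fps
cinema_cut = downsample_48_to_24(one_second_master)
assert len(cinema_cut) == 24          # one second at 24 fps
assert cinema_cut[:3] == [0, 2, 4]    # every other frame kept
```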
I'm really looking forward to it.
Traditionally, movies have been kept at 24fps because they were shot with actual film, and increasing the framerate would mean double the cost from film alone, as well as doubling the weight and physical size of all the film they had to move around. In the old days, film was really fucking expensive, so this was a big no-no. As time moved on, film became cheaper, but it had sort of become tradition, and that "cinema feel" is kind of tied to the use of film and 24fps.
Now, however, if you shoot the movies with digital cameras instead of film based ones, transitioning from 24fps to something else isn't nearly as problematic on the recording side of things. Moving the recorded videos around afterwards isn't nearly as much trouble either, because digital storage is relatively cheap and takes a lot less space. Not to mention, it's reusable.
Anyway back to game stuff.
I'll agree with you that more than 60fps isn't really needed, but the way I see it, there's no reason to hard-cap the framerate at 60. Hard-capped framerates are typically something you see in console games, and are really uncommon in PC games. Unless something is really screwy in their new engine, I don't see a reason to enforce a cap of 60fps when just about every other PC game can run at as many fps as your PC can possibly handle and still make the animations play back at the right speed.
When I hear Yoshida talk about the game being optimized for 30fps, I'm more inclined to take that as meaning 30 is the target fps that you should be able to get on most computers in decently crowded areas, not that the game won't render things efficiently at a higher framerate than that. I'll take another example, just 'cause I love examples.
I currently use settings that make my game run at 45-60 FPS in 90% of the areas in the game. In Ul'dah I dip down into the 30s as a result of being CPU limited when there is a ton of characters on screen. However, with the same settings, if I go to Turning Leaf (hatehateHATE!), I will drop down to 10-25 fps if there is more than one treant on screen at the same time. At worst, I've been down to 7. This is with the same settings that give me 45-60 FPS for almost the entire rest of the game. So, what I would call this is a game that is not properly optimized for running at 30fps. This is on a GTX460 1GB.
Now we can try imagining how it would be on a PS3, which will not be running at more than 30fps in most areas. With the current huge gap in graphics requirements from one area to another, they would be getting something like 5 fps in Turning Leaf. Optimizing the game for 30FPS to me just sounds like making sure that no zone leads to a framerate below 30fps when your settings let you achieve 30FPS most other places. Of course, you will have variations in framerate based on area and player count, but you should try to avoid situations where one area drops your framerate to one fifth of what you normally have. You shouldn't need to reduce graphics settings for one single area, especially when you're solo in that area.
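The earlier point about PC games running at any framerate while keeping animation speed correct is normally handled with a delta-time game loop; here's a minimal sketch (the function names and the 5-units/second mover are made up for illustration, this is not FFXIV engine code):

```python
import time

def run_uncapped(update, render, duration_s=0.1):
    """Render as fast as the hardware allows, but scale all movement by
    the real time elapsed (dt) so game speed is the same at any fps."""
    last = time.perf_counter()
    deadline = last + duration_s
    while time.perf_counter() < deadline:
        now = time.perf_counter()
        dt = now - last          # seconds since the previous frame
        last = now
        update(dt)               # e.g. position += velocity * dt
        render()                 # draw; no sleep, no cap

# Toy usage: an object moving at 5 units/second travels ~0.5 units in
# 0.1 s of wall time, no matter how many frames were actually rendered.
state = {"pos": 0.0}
run_uncapped(lambda dt: state.update(pos=state["pos"] + 5.0 * dt),
             lambda: None)
```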
Well, I figured this would happen. Regardless I still find the discussion very interesting. I have learned a lot from many of the posters here! :-)
As an owner of a true 120hz monitor (BenQ XL2420T), I have noticed the difference. 30fps vs 60fps vs 60+fps is not comparable in any gaming environment. I'm not talking about watching TV or going to the movies. I see (or should I say feel?) a huge difference in fluidity when gaming. For those saying it's senseless to run an MMORPG above 30 or 60fps, I would have to disagree as well. My experience with MMORPGs has always been (with my older 60hz monitor) that the closer to 60fps, the better, for both PVE and PVP. Any game in which my computer can achieve 60+ fps makes this monitor worth every penny.
My OP was simply to ask whether SE would consider giving players the "uncapped option," be it for future-proofing or for anyone who can currently run the game at a constant 60 fps in various areas. My logic was simply that the new engine would be both beautiful and well optimized, enabling a variety of rigs to run this game with smooth frames. The more options the better.
60fps is all you need.
Trying to push the game to 60fps+ would be a waste of time when the effort would be better spent increasing the quality of the motion blur effect.
All right, so first I get to reply to someone conflating network latency with framerate, and now a video game gets compared to a movie. What's next?
"Upper limit of human eyes" + "fps" = complete nonsense. You can't measure that in "fps".
As Mirage was saying, 30fps isn't smooth; 60fps is OK.
Sorry, go see a doctor. You are a lost cause if you are unable to notice the difference between 30fps and 60fps. And I am actually serious when saying that. Record some gameplay footage (it could be XIV or most other games) at 30fps, then at 60fps. Do an ABX test. If you fail to identify which is which, you have issues.
Quote:
either way you don't really notice a difference between 30FPS and 60FPS.
Edit:
Below are two links to two videos (sorry it's not XIV; I couldn't be bothered to launch XIV now, fraps it and encode twice, so I just re-used something I already had).
Both videos come from the very same source. The same video codec, bitrate, resolution, etc. are used (x264). File sizes are roughly the same (62MB). One video is @30fps; the other is @60fps. Try to guess which is which. Of course, you can easily "cheat" and check the file details within Windows or with your media player, and you will see which file is 30fps and which is 60fps.
Go and tell me you can't notice a difference now. While in an MMORPG it's less of an issue to have 30fps instead of 60fps, you can still notice the difference. In certain types of games, 30fps is just... horrible. (shmups...)
Video 1
Video 2
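For scoring an ABX test like the one proposed above: under 50/50 guessing, the chance of getting at least k of n trials right is a binomial tail, so even a handful of trials separates real perception from luck. A small sketch (the trial counts are hypothetical):

```python
from math import comb

def abx_p_value(correct, trials):
    """Chance of getting at least `correct` of `trials` ABX answers
    right by pure 50/50 guessing (one-sided binomial tail)."""
    return sum(comb(trials, k) for k in range(correct, trials + 1)) / 2 ** trials

# 9/10 correct happens by blind luck only ~1% of the time, so a score
# like that is strong evidence the 30fps/60fps difference was seen.
assert abx_p_value(9, 10) < 0.02
assert abx_p_value(5, 10) > 0.5   # 5/10 is what guessing looks like
```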
Here is a question posted by a player who did not know the difference between capping at 30 fps vs 60 fps. He stated an interesting observation.
This is the same effect people experience when they enter a crowded area like Ul'dah. The brain perceives a drop from 60 fps to 25 fps as a bigger decrease in responsiveness than a drop from 30 fps to 25 fps. The reason, of course, is that you subconsciously adjust your responses to the rate at which you interpret changes in visual information. In other words, a consistent FPS appears more fluid than a constantly fluctuating one.
From a developer's perspective, your goal is to keep the frame rate within about 20% of the target fps. If you exceed that 20%, you either need to tweak the environment to get the frame rate up, or lower the target fps. And since the developers of this game want higher detail at the expense of frame rate, the cap should be set at 30. As time passes and hardware improves, environments within the game will become more complex, but the frame rate need not change.
A cap of 60 FPS or above would require keeping the frame rate above the mid 40s, and few players can achieve that. Maybe a small few will be able to, but they are in the minority, and the majority comes first.
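The 20%-of-target rule of thumb from this post can be written down directly (the 20% figure is the poster's heuristic, not an official guideline):

```python
def within_target(measured_fps, target_fps, tolerance=0.20):
    """True if the measured framerate stays within `tolerance` of target."""
    return measured_fps >= target_fps * (1.0 - tolerance)

# With a 30 fps target, dips to 25 fps are tolerable, while dips to 20
# mean the scene (or the target) needs to change.
assert within_target(25, 30)          # 25 >= 24, acceptable
assert not within_target(20, 30)      # 20 < 24, needs tweaking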
An unthrottled render loop is nice,
for burning out PC components.
120Hz isn't for frame rate, it's for 3D.
Eh, I didn't buy a true 120hz monitor for 3D.
If FFXIV burns up a PC, that's SE's fault. Bad optimization. Yes, the 60 fps and 30 fps caps were placed to prevent this from happening since beta, but that was because the original engine and overall server design stunk. The developers have already admitted to this. It's no secret.
I mean, FFXIV's graphics are nice and all... but they aren't that nice. It's terrible optimization bringing your computer to its knees. Like you said, rendering objects far above and below what you even see on your screen, hogging resources pointlessly. I turn on ambient occlusion and I run FFXIV at 2 fps. I run BF3 on ultra with no problems: 50-90 fps and temps no higher than 55 degrees Celsius. Even on my GTX 460 768MB I could run far more demanding games than FFXIV well into 60+ fps (medium-high settings) under 60 degrees Celsius. It's not like we need an fps limit for other games on the market that are way more demanding than FFXIV (DX11 games). Why aren't those games frying machines?
The new engine that they are working on is supposed to alleviate these issues. If they need to cap your fps at 60 in order to prevent your computer from melting, then they are simply putting a band-aid on a horrible mess of an engine to begin with. This is why they are building a new engine from the ground up for 2.0. We are playing on PC, not consoles. I expect to see an fps cap on consoles simply because there is no workaround for those systems. You are playing with what you have on a PS3: no upgrades, and everyone's machine is the same. But when it comes to PC, there are going to be a wide variety of setups, which means there should be a wide variety of graphical options for both lower-end and higher-end machines. Lower-end machines can crank the graphics down to maintain playable frames, while higher-end machines can crank up the graphics to make use of the hardware.
At the end of the day, there's no reason to enforce an FPS cap. PERIOD!
The average person can see up to the equivalent of 60FPS, and even someone with eyesight as bad as mine can often notice the difference between the equivalent of 60FPS and 90FPS. The ONLY reason people have not noticed the differences until recently is because SDTV (esp. analog) sets never worked beyond maybe an equivalent of 35~40FPS. Period. (Not to mention, every American film - until Avatar, I think it was - was filmed at a 24.9~29.9 FPS rate.)
DirectX 11 is less taxing than both 9 and 10 when using comparable features. As for lowering features, I've run this game on an ATI X650 with everything on low; granted, it looks like Wolfenstein 3D, but it is playable. Here's a screenie if you don't believe me: < http://cloud.steampowered.com/ugc/57...0679616BF27B6/ >
Oh my lord. Please explain to me how 120fps will burn my video card out faster than 60fps, when if capped to 60 I would just increase the image quality enough for the game to max out my video card anyway.
And if FF14 somehow manages to do this, it's a game that should be pulled from the market, 'cause 120FPS has been possible for a long time on PCs and was almost the norm for PC gamers before LCDs became mainstream. Tons of CRTs were capable of outputting 120Hz 10 years ago, and competitive gamers would almost always play at 100fps or more.
He's quite right; if there were no artificial limits, your GPU would literally fry itself from overload. There will always be a cap.
Current generation cards can reproduce up to 400+ frames a second without reaching the GPU's maximum limit; at around 500 frames the GPU will be under a 100% load. No GPU can last more than a few minutes under a 100% load; they will always burn themselves out, usually by way of voltage/current leakage or solder balls melting. Also keep in mind that 400+ frames a second can only be achieved by artificial means with no overhead (textures, AA, AF, tessellation, etc.).
A few months ago, every time I would start FFXIV, after a few minutes my system would spontaneously power completely down. Turns out some wood dust was preventing the GPU fan from spinning up. At idle, the temperature was 95C. A fail-safe* exists to prevent irreversible damage to the hardware components.
How did I fix it? I gave the fan a light tap to get it spinning. It's been working fine since. And this was after SEVERAL DAYS with the GPU running at 95C. So I don't believe FFXIV is capable of "frying" your graphics card, nor is any other game.
And somewhat related, just because your GPU gets a little warm does not mean the game is poorly optimized. There is no correlation between how optimized a game engine is and how hot your GPU gets.
* "Fail-safe" is a term I coined; it may not necessarily be the official technical term.
I don't even know where to begin.
The artificial limits you're talking about to prevent overheating are something the drivers and GPU firmware take care of. Looking at just framerate when it comes to a graphics card's power output is just plain wrong. It is a combination of graphical detail, resolution, framerate, and a bunch of other things. The total amount of processing a card needs to do is what determines how much heat it generates. If I lower my graphics settings so that I'm able to do 120fps, that makes fuck-all of a difference to the graphics card as far as heat generation is concerned.
Quake 3 Arena was frame capped because a consistent framerate is desirable for the players, and because at the time the game was released, most computers could not deliver a consistent 120fps at good graphical settings.
There is no limit in the Q3A graphics engine on how many frames the game can render per second. I actually downloaded the game just now and ran a timedemo averaging 864 FPS, and I was CPU limited, not GPU limited. With an i7, I probably would have broken 1000 fps. There is no automatic link between high framerates and graphics card overheating. Just because I could, I also tested Crysis and got around 300 fps there, without putting a higher load on my GPU than FF14 gives me at 50fps. Just drop the entire framerate-overheat argument and admit you were wrong.
There is absolutely no hardware-related reason why FF14 can't run at 120fps, or even more.
I ran a looping timedemo in UT'99 at 600fps+ for hours; my card generated less heat than XIV does at 60 FPS.
Thank you.
Yeah, that's because the engine is diabolical. Let's see: no other game I play on my Alienware laptop heats it up so much, yet I honestly can't play XIV on my laptop because it has burnt my legs in the past, LOL.
I was just saying I do remember the game being smoother in general to me. I would like it to be uncapped, or at least 100fps, for 2.0. 60 I can cope with, but dropouts to 45 fps are no joy for me; it just feels lackluster and generally meh with regards to movement and navigating the GUI.
Maybe not, but you'd have to be rich or stupid not to put an fps cap on any game you're playing. Your computer has a limit on how much fps it can keep up with consistently. Any drops in fps will be very noticeable, even at 120+. Smoothness is a function of consistency, not necessarily how high the fps count is. If you don't put a cap on, your gameplay will suffer terribly due to the constant spikes in fps. Ideally, you want a level your computer can run at without dropping significantly. Any higher and it will actually appear worse.
Of course, if you're running a supercomputer, you don't need a cap for any game out right now. But enforcing a cap prevents fps noobs from experiencing crappy gameplay due to bad game settings. Most people assume more fps = good, when it's only good if your computer can stay up there consistently. And given that the vast majority of computer owners are ignorant about their hardware, an fps cap keeps people from being dissatisfied with your product. It's all about the benjamins.
EDIT: This isn't even talking about how much optimization a game would have to go through in order to account for the lack of an fps cap. I imagine getting any game to run at 120+fps without any significant drops would be quite costly in terms of hardware, equipment, and man-hours. Especially with an MMO, which has to account for mass amounts of people congested into relatively small maps. I don't think any company is going to drop that kind of coin without charging you more for the game. And I'd rather not pay for that, considering I can't even use it.
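The "smoothness is a function of consistency" claim is easiest to see in frame times rather than fps; a quick sketch with made-up sample readings:

```python
from statistics import mean, pstdev

def frame_times_ms(fps_samples):
    """Convert instantaneous fps readings into frame durations in ms."""
    return [1000.0 / fps for fps in fps_samples]

capped = frame_times_ms([60, 60, 60, 60, 60])       # steady 60 fps cap
uncapped = frame_times_ms([120, 45, 110, 50, 100])  # higher average fps

# The uncapped run delivers more frames on average, but its frame times
# swing by many milliseconds, which the eye reads as stutter.
assert pstdev(capped) == 0.0
assert pstdev(uncapped) > 5.0
assert mean(uncapped) < mean(capped)  # faster on average, yet feels worse
```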
If a person's computer cannot run a game in 1080p past 60FPS, an enforced FPS cap is still useless. There is nothing in game code that can force a video card to run a game at a higher FPS than it can manage to begin with, and the lack of an FPS cap isn't going to make a game run at more than what the video card can handle. It simply does not happen, period.
Now, if people want an option to turn on an FPS cap, that's different. But don't you dare try to sit there and tell me that I should not be allowed to run it above xyzFPS just because someone else can't. That's absolutely stupid, and not to mention that because of fail-safe software & hardware design, it's actually impossible for the average user to burn their video card (it takes someone that knows how to bypass certain coding).
FYI, I do use a 60FPS cap on all games I play on my PC when and where there is a choice to. Even when emulating old games, unless I want to play in turbo-mode, I use no more than 60FPS.
I'm not talking about "handling", I'm talking about consistency. Handling 60 fps is a lot different from running a smooth 60 fps. For example, you can probably run higher than 60 fps, but you optionally choose to cap it at 60... because you know you can run it silky smooth at that cap. The average person would have no idea. An FPS cap just prevents the average person from becoming dissatisfied with gameplay due to poor settings. I'm not saying it *should* be enforced; I'm just taking a stab in the dark as to why they enforce it.
Well, you should be allowed to; I wasn't trying to say you shouldn't. Just some theorycrafting about why companies cap them at all. They should just link fps settings to auto-detect; maybe fewer people would choose poor caps that way.
Quote:
Now, if people want an option to turn on an FPS cap, that's different. But don't you dare try to sit there and tell me that I should not be allowed to run it above xyzFPS just because someone else can't. That's absolutely stupid, and not to mention that because of fail-safe software & hardware design, it's actually impossible for the average user to burn their video card (it takes someone that knows how to bypass certain coding).
I do too, actually, though I've never tried with a higher cap. I'm 99% certain I couldn't run a smooth 120, though, so I won't even bother.
Quote:
FYI, I do use a 60FPS cap on all games I play on my PC when and where there is a choice to. Even when emulating old games, unless I want to play in turbo-mode, I use no more than 60FPS.
Well if that's why they want to enforce it, they should still give the ability to turn it off. I really don't mind the default settings themselves not being to my liking, just the inability to change them, and SE has done good so far on that end I'll admit.
And/or use auto-detect to give info on which settings are going to work at what FPS rating.
Quote:
They should just link fps settings to auto-detect; maybe fewer people would choose poor caps that way.
Nah, I probably could get it to run smoothly at 120FPS, but that'd be such a huge graphical downgrade, probably worse than ever getting FFXI to run at 999FPS.
Quote:
I do too, actually, though I've never tried with a higher cap. I'm 99% certain I couldn't run a smooth 120, though, so I won't even bother.
Heya guys!
While the details have not yet been decided, we are planning to make it possible to select from 30fps, 60fps and uncapped settings. ^_^
PRAISE BE TO YOSHI!
Uncapped? people be going ZOOOm ZOOOM ZOOOOOOOOOOM
If we're getting an uncapped setting, I hope the game is optimized so people's GPUs don't melt.
It's like SE has discovered the modern age of computing!
Drivers allow overclocking, and overclocking can fry a GPU. I don't see why attempting to process more info than the GPU can realistically handle without overclocking would be treated any differently. TBH, I'd be careful when using the uncapped setting. But maybe that's just me; I always lean towards the conservative side with my graphics settings. I'd rather have my hardware last for a long time than be amazed at shiny objects for a few months before my components blow the hell up.