Then how does 3D television work?
Last edited by Laraul; 07-25-2012 at 11:48 PM.
Sounds like a false dilemma, and looking at them in the stores won't give you the whole picture. You can probably find several TVs with great picture quality, and among those, TVs with different refresh rates. In that case, a TV with great picture quality as well as the capability to do 600Hz will be able to display inputs of different framerates better than one that's "just" 100Hz. However, if the input in question is 25fps or 50fps, there will be no discernible difference from the refresh rate alone, as both divide evenly into both 100 and 600.
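Here's a rough little Python sketch of that divisibility point, nothing more than the arithmetic (the framerates and refresh rates are just the ones from this discussion):

```python
# Check whether a panel refresh rate is an even multiple of a source framerate.
# If it is, every source frame is held for the same number of refresh cycles,
# so the refresh rate itself adds no judder.

def refreshes_per_frame(source_fps, panel_hz):
    if panel_hz % source_fps == 0:
        return panel_hz // source_fps  # constant hold time per frame
    return None  # uneven cadence -> judder

for fps in (24, 25, 30, 50, 60):
    for hz in (100, 600):
        result = refreshes_per_frame(fps, hz)
        status = f"{result} refreshes per frame" if result else "uneven cadence"
        print(f"{fps:>2} fps on {hz} Hz: {status}")
```

25 and 50 divide evenly into both 100 and 600, which is why you won't see a difference there, while 24, 30 and 60 only divide evenly into 600.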
At the same time, you won't be able to reliably compare them at the store, considering they'll be sending one single video source to all TVs. If you ask a clerk there "hey, can you switch to a 30fps source instead of a 24fps source so I can look for panning jitter", you'll probably not get anywhere.
And really though, it *is* easy to notice 24fps video jitter on a 60Hz display, which is what most PC monitors are.
You're right though, I won't be able to say "this display is exactly 60Hz", but I will be able to say "this display is giving me a lot of jittering, I'd rather have one where I didn't get that".
Personally, I have a Panasonic plasma TV with fantastic response time, contrast, and picture quality in general. It is, however, only 100Hz, and I do actually get jittering when watching 24fps movies. There is some motion smoothing technology in place, but it would have been better if my TV used 120Hz instead of 100, because that would give me a constant rate of 5 refresh cycles per video frame.
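To put numbers on that, here's a quick sketch of the cadence (purely the arithmetic, not whatever processing the TV itself does on top):

```python
# How many refresh cycles each video frame gets held for when the refresh rate
# is not an even multiple of the framerate. 24 fps on 100 Hz gives a mix of
# 4- and 5-refresh holds (visible jitter); on 120 Hz every frame gets exactly 5.

def cadence(fps, hz, frames=12):
    holds = []
    for i in range(frames):
        start = round(i * hz / fps)
        end = round((i + 1) * hz / fps)
        holds.append(end - start)
    return holds

print("24 fps on 100 Hz:", cadence(24, 100))  # mix of 4s and 5s
print("24 fps on 120 Hz:", cadence(24, 120))  # all 5s
```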
For console gaming, however, I'm not getting any of that. It'll run at 60Hz while receiving a 60Hz signal from my PS3, making every frame my console outputs align nicely with the refresh cycles of my TV, as long as the game stays close to either 60 or 30 fps. Many games do.
Most fighting games run at exactly 60 fps, and other games often limit the framerate to 30 fps even if the console could output anywhere from 30 to 45, because this leads to less screen tearing and also more stable input lag. The only thing worse than high input lag is highly variable input lag. 30 fps is two refresh cycles per frame on a display that's in 60Hz mode, giving you the same amount of time for each frame most of the time.
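A 30 fps cap is basically just a loop like this (a bare-bones sketch, not how any actual game or the PS3 implements it; update_and_render is just a stand-in):

```python
import time

TARGET_FPS = 30
FRAME_BUDGET = 1.0 / TARGET_FPS  # ~33.3 ms, i.e. two 16.7 ms refreshes at 60 Hz

def update_and_render():
    pass  # stand-in for the real game logic and rendering

def run_capped(frames=300):
    next_deadline = time.perf_counter()
    for _ in range(frames):
        update_and_render()
        next_deadline += FRAME_BUDGET
        sleep_for = next_deadline - time.perf_counter()
        if sleep_for > 0:
            time.sleep(sleep_for)  # wait out the rest of the 33.3 ms budget
        else:
            next_deadline = time.perf_counter()  # missed the budget, resync

run_capped()
```

The point is that every frame gets the same 33.3 ms slot, which is exactly two refreshes at 60Hz, so both the picture and the input lag stay consistent.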
Last edited by Mirage; 07-26-2012 at 12:08 AM.
http://en.wikipedia.org/wiki/Refresh_rate
When LCD shutter glasses are used for stereo 3D displays, the effective refresh rate is halved, because each eye needs a separate picture. For this reason, it is usually recommended to use a display capable of at least 120 Hz, because divided in half this rate is again 60 Hz.
Yoshi-P already said he is optimizing the game for 30fps, i.e. as much screen detail as your system can support while still giving 30 fps.
60 fps is a waste on MMOs even if you can tell the difference. Network latency is much, much higher. You're more likely to get lag from that than from FPS.
I'd like to understand how network latency is related to framerate. This forum will never cease to amaze me.
30fps is NOT smooth, period. Even with 500ms latency it doesn't matter, it is not smooth. 60fps is not a waste for an MMO when you are playing in a low-population area or during a cut-scene. Of course, in town or during massive events, I don't expect any graphics engine to be able to render tons of players while maintaining a solid 60fps framerate, even with high-end hardware.
Antipika.
Deathsmiles II-X - Difficulty Lv.2+ (1CC/2LC ALL clear) : http://youtu.be/pjRuwv_-MlI?hd=1
Touhou 13 - Ten Desires (all clear) : http://www.youtube.com/view_play_list?p=PL194872B2BBA7CA67
Touhou 12.5 - Double Spoiler (all clear) : http://www.youtube.com/view_play_list?p=BD180E7054F3C1A2
Touhou 9.5 - Shoot the Bullet (all clear) : http://www.youtube.com/view_play_list?p=53B01AAE8A03BDD1
Touhou 8 - Imperishable Night (all clear) : http://www.youtube.com/view_play_list?p=7A5C1FF6BDAD1C1B
If you can tell the difference, it's not a waste.
*rolls eyes*
You do realize 'real time' cameras record at 25-30 FPS, right? An IMAX camera records at 24 FPS.
Saying 60FPS is not smooth is ridiculous. If it appears as not smooth, then it's not displaying at 60FPS. The upper limit of the human eye is around 35FPS. There is no reason to go over 30 for most people.
--------
The main reason you want higher frame rates is for micro-instances where your system is lagging and FPS drops. If you're capped at 30FPS, any time your system lags you will notice, because your eye can detect it.
At 60FPS you can lag and lose up to 50% of your cap and still not notice. This is the only reason it's better: it adds a buffer to what you can and cannot notice as far as rendering lag goes.
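In frame-time terms (back-of-the-envelope numbers, taking the "~30FPS is where you notice" premise of this post as given):

```python
# Frame-time budgets and how much headroom you have before dropping under a
# 30 fps "noticeable" floor. At a 60 fps cap you can lose half your frames
# before hitting that floor; at a 30 fps cap there's no headroom at all.

NOTICE_FLOOR_FPS = 30  # premise taken from the post above, not a hard fact

for cap in (30, 60, 120):
    frame_time_ms = 1000 / cap
    headroom = 1 - NOTICE_FLOOR_FPS / cap
    print(f"{cap} fps cap: {frame_time_ms:.1f} ms per frame, "
          f"{headroom:.0%} headroom before dropping under {NOTICE_FLOOR_FPS} fps")
```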
The engine in FFXIV is terrible. I've got a pretty decent rig and can run the game at 60FPS fine, but I leave it at 30FPS. Know why? Because I can increase the texture quality. When I set a 60FPS cap, my system will continue to pull resources in an attempt to hit 60FPS. This bottlenecks my system when set to the highest textures and causes it to drop below 30FPS, because it's trying to render at 60 and bottlenecking.
When I cap my system at 30FPS, it stops pulling resources once it hits 30FPS. It can then put additional resources towards other things, like rendering textures.
The only time you're going to want to pay attention to this is when you're on the borderline spec for the game you're trying to render. When I play League of Legends I set the FPS cap to 'unlimited', simply because my system is powerful enough that I don't see a performance difference between 30, 60, 120, and unlimited FPS.
But for FFXIV, there is a big difference in performance between 30 and 60, just because my system can't render 60FPS on high settings. (It can on lower settings, but I'd rather have pretty textures than higher-than-needed FPS.)
EDIT:
I should mention the problem gets worse if you're running Crossfire/SLI. I personally run two GTX 580s, and due to inconsistencies/variability the two just won't cooperate with certain games (FFXIV, for instance) even though I'm WAY over the minimum spec line. My problem is compounded because I purchased a refurbished system, and the HDD is crap. It came with two low-tier HDDs in RAID, which has caused me nothing but problems. I should spend the time to fix it, get a decent HDD for running my OS and most-played games, and use the other two crap disks as storage, but it would take about a week to get everything re-installed and back up and running.
It comes down to optimization and what you're willing to deal with.
Last edited by Onisake; 07-26-2012 at 02:28 AM.
24fps only looks smooth in videos because of the inherent motion blur of cameras. Last time I checked, FF14 wasn't shot with a real camera. If you want motion blur to cover up the low framerate in a game, you have to spend GPU resources on adding a motion blur effect. I'd rather use that GPU power to get another 5 fps, personally.
Additionally, have you even watched a 60fps movie? It looks so, so fantastic.
And he was saying 30fps wasn't smooth, not 60.
As for me, my card is good enough to get *both* the highest-res textures and 60 fps whenever I'm not around the fronds in Ul'dah. Maxing out textures isn't really *that* taxing on a video card, as long as you have the VRAM for it.
But why am I even replying? I should have stopped reading when I got to the "human eye can't blahblah" bullshit.
Human eyes are not digital. They do not have a "framerate". If you have a 100fps video of white with one black frame, you're going to notice it every time. How is that even possible, if your eye doesn't register more than "at most" 35 frames? With that logic, 65 frames would be lost, meaning you'd miss the black frame almost two-thirds of the time.
Eyes work in energy levels. Whenever enough photons have hit a light-receptive cell to raise its energy level sufficiently, the cell will send a signal. The minimum amount of energy required to activate one single cell is roughly what you get from *four* photons. Human perception is a constant stream of signals from cells that respond independently of other cells. How many "frames" we can see is highly dependent on the contrast and movement in the frame. How many "frames" a brain can process also depends highly on the content of each frame, and on the person's training.
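And to put a number on that black-frame example, here's a quick simulation of what a literal "35 fps eye" sampling one second of 100fps video would miss (just to show why the fixed-framerate model breaks down, not a model of how vision actually works):

```python
import random

# If the eye literally sampled 35 discrete instants per second, how often would
# it land on the single black frame in one second of 100 fps white video?

VIDEO_FPS = 100
EYE_SAMPLES_PER_SEC = 35  # the "upper limit" figure quoted earlier in the thread

def catches_black_frame(black_frame_index):
    frame_duration = 1.0 / VIDEO_FPS
    black_start = black_frame_index * frame_duration
    black_end = black_start + frame_duration
    sample_times = (i / EYE_SAMPLES_PER_SEC for i in range(EYE_SAMPLES_PER_SEC))
    return any(black_start <= t < black_end for t in sample_times)

trials = 10_000
hits = sum(catches_black_frame(random.randrange(VIDEO_FPS)) for _ in range(trials))
print(f"Caught the black frame in {hits / trials:.0%} of trials")  # roughly 35%, missed ~65%
```

In reality you notice the black frame every single time, which is exactly why the "eye framerate" idea doesn't hold up.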
Last edited by Mirage; 07-26-2012 at 02:41 AM.
Oops, I misread that. Either way, you don't really notice a difference between 30FPS and 60FPS. Saying 30FPS isn't smooth is the same as saying 60FPS isn't smooth. If it doesn't appear smooth, either 1) it's not displaying at 30FPS and you're getting lag in your rendering, or 2) you're on the far end of the visual spectrum and capable of viewing things at greater than 30FPS.
I've seen movies in 60FPS. I've seen the same movie, on the same projector, at 30FPS and not noticed a difference. 60FPS projectors typically have better color capabilities and far higher resolution. This improves quality FAR more than FPS. Also keep in mind these movies are recorded at 25-30 FPS; displaying them at 60FPS doesn't improve anything, you're just seeing more 'blur'. What really sets them apart is the resolution, not the FPS.
A lot of people tend to forget that for some reason. It's not the FPS you're noticing; it's the improved resolution/color display. A dirty little secret: a lot of TV/projector companies will tell you FPS is important so they can jack up the price. You should really be looking at resolution more than FPS.
If you go to TigerDirect and look at TVs on sale, you may see a 42" 1080i that's cheaper than a 38" 720p. Most people would think 'Hey, what a deal!', but the 1080i is 1680x1050 resolution at 10000:1 contrast and the 720p is 1920x1080 at 20000:1. I personally would go for the 720p for the higher resolution and contrast.
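If you want to compare them on paper, pixel density is easy to work out (taking the sizes and resolutions in this example at face value, whatever the actual retail labels would say):

```python
import math

# Pixels per inch for the two example TVs above.
def ppi(width_px, height_px, diagonal_inches):
    return math.hypot(width_px, height_px) / diagonal_inches

print(f'42" at 1680x1050: {ppi(1680, 1050, 42):.0f} ppi')  # ~47 ppi
print(f'38" at 1920x1080: {ppi(1920, 1080, 38):.0f} ppi')  # ~58 ppi
```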
-------------
EDIT:
You updated your post, so I will too.
Eyes do work in energy. At 1000FPS, a single frame is displayed for 1ms. You will see it, but you won't really know what you saw.
The human eye can differentiate up to 15 images per second; most people are below this. This doesn't mean we are capped at 15 FPS, it's just that anything above that our brain won't process. It's too much information.
So yes, technically speaking the eye is capable of seeing more than that, but the brain can't keep up. It's like playing sounds above/below the spectrum our brain can process: our ears pick it up, but it's junk data to our brain, so it just throws it out.
Last edited by Onisake; 07-26-2012 at 02:55 AM.
just to keep the arguments to a minimum:
http://www.100fps.com/how_many_frame...humans_see.htm