It was a mixture of things. Take a look at this for example: the scene is very bright. That's a trick used in 1.23 all the time; the brighter the object, the more detail you can trick into it, especially for a lower-poly object (see Kingdom Hearts as a great example). The normal maps used to create this scene are what make the detail you see:
What is a normal map?
A normal map is essentially an RGB texture that the engine reads to fake surface detail: each pixel encodes a surface direction, and lighting is calculated against those directions rather than against the actual geometry, giving the illusion of extra detail.
The object is still a low-poly object, but with the texture applied it gives the illusion of being far more detailed:
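For anyone curious about the mechanics, here's a tiny sketch (purely illustrative, not the engine's actual code) of decoding a normal-map texel and lighting it with a simple Lambert/diffuse model:

# Purely illustrative; not the game's engine code.
def decode_normal(r, g, b):
    """Map an RGB texel (0-255 per channel) back to a unit surface normal (-1..1)."""
    n = [c / 255.0 * 2.0 - 1.0 for c in (r, g, b)]
    length = sum(x * x for x in n) ** 0.5
    return [x / length for x in n]

def lambert(normal, light_dir):
    """Diffuse lighting: brightness follows the faked normal, not the flat mesh."""
    return max(sum(n * l for n, l in zip(normal, light_dir)), 0.0)

# A flat, low-poly surface can still shade as if it were bumpy:
texel_normal = decode_normal(128, 160, 230)    # a normal leaning slightly away from flat
print(lambert(texel_normal, [0.0, 0.0, 1.0]))  # light pointing straight at the surface

The mesh never changes; only the lighting response does, which is why it's so cheap compared to adding real polygons.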
We have to start by saying that a little bit of understanding and maturity goes a very long way; we can't always have what we want, and sometimes we need to make compromises.
Not everyone has a beastly PC; very few do, in fact. The average person playing this MMO still doesn't have a top-of-the-line PC that could have run 1.23, and that's why you're seeing the overall design decisions you're greeted with daily now.
Marketing
1.23 failed. It was a financial scare and sent ripples through the entire company.
They didn't want to make the same mistake twice. When it came to planning 2.0, after deciding to rebuild the game, choosing the common denominator was the safest option, instead of going for the smaller market of people with better PCs in the hope that PC hardware would eventually catch up with 1.23's beastly rendering costs. They soon learnt that people weren't going to stick around for that (even when they could run the game).
They had to consider: "What is our target market? What do the average PC user, Final Fantasy fan and MMO player have in common for hardware?" The answers came back with an average PC specification alongside the PlayStation 3. So a target was set, and the development of models, textures, rendering, area size, UI size/information and memory limitations were all set in stone before development even began.
Necessity vs Luxury
The problem with MMOs
There are a lot of unique characters on screen at once. This poses a problem in that rendering many of them at once drastically increases the rendering cost, so keeping this down helps improve performance and also allows more room for a higher 'default number of characters on screen', which is currently quite high compared to 1.23.
On top of all this, you still have the entire environment to render around the NPCs throughout the world, and even more so during battle, where you have a firework display of animations, sounds, particles and effects left, right and centre to render.
It's very difficult to balance this, so designing around the average market's hardware can work best here.
Lighting & Shadows
The particular method used for casting shadows from light sources in 1.23 was very scalable and could produce fantastic-quality shadows, but at a very high rendering cost; it was a resource hog and, for the most part, still gave the overall lighting a rather crappy look and feel.
2.0 uses deferred rendering, which gives them a fast and very scalable approach to shadow/lighting sources. You'll notice a green glowing bush will have its own light source that competes with the other light sources it comes into contact with, as well as producing its own shadows should you walk near it.
This particular type of rendering can be adjusted to produce low- or high-quality output, making it the perfect option for future scaling.
(The ambient occlusion method used was nice too, yet intensive.)
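As a rough idea of what deferred shading means in practice (a minimal sketch with made-up values; this is not Square Enix's renderer): the scene is drawn once into a "G-buffer" of per-pixel surface data, then each light is accumulated on top, which is why small extra light sources like the glowing bush stay relatively cheap:

# Illustrative deferred shading, not Square Enix's renderer: the scene is first
# rasterised into a G-buffer (position, normal, albedo per pixel), then every
# light is added on top of each pixel in a separate pass.
def shade_pixel(position, normal, albedo, lights):
    colour = [0.0, 0.0, 0.0]
    for light_pos, light_colour in lights:
        to_light = [l - p for l, p in zip(light_pos, position)]
        dist = sum(d * d for d in to_light) ** 0.5 or 1.0
        to_light = [d / dist for d in to_light]
        diffuse = max(sum(n * d for n, d in zip(normal, to_light)), 0.0)
        falloff = 1.0 / (1.0 + dist * dist)
        for i in range(3):
            colour[i] += albedo[i] * light_colour[i] * diffuse * falloff
    return colour

# One pixel of a pretend G-buffer (ground facing up), lit by a sun-like light
# and a glowing green bush -- each light simply accumulates.
gbuffer_pixel = ((0.0, 0.0, 0.0), (0.0, 1.0, 0.0), (0.8, 0.8, 0.8))
lights = [((0.0, 10.0, 0.0), (1.0, 0.95, 0.9)),   # "sun"
          ((1.0, 0.5, 0.0), (0.1, 1.0, 0.1))]     # the glowing bush
print(shade_pixel(*gbuffer_pixel, lights))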
Textures:
Textures take up quite a lot of data. Essentially you have a sub-set of textures, shader data and materials for various things: the base texture, normal maps, specular maps, parallax maps, environment reflections and more to consider.
The higher the resolution, the more data is required. Keeping the client small for the PC and PS3 release was an initial way of keeping the download and weight of the entire client down, with the potential to expand it later. (Updating textures in the future is no easy task, but a doable one.)
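Some back-of-the-envelope arithmetic (illustrative numbers, assuming uncompressed RGBA; real game assets are compressed, but the scaling behaves the same way) shows why resolution matters so much for client size:

# Rough, illustrative texture-size arithmetic (uncompressed RGBA, plus mipmaps).
def texture_bytes(width, height, bytes_per_pixel=4, mipmaps=True):
    size = width * height * bytes_per_pixel
    return int(size * 4 / 3) if mipmaps else size  # a full mip chain adds ~33%

for res in (512, 1024, 2048):
    print(f"{res}x{res}: ~{texture_bytes(res, res) / (1024 * 1024):.1f} MB")
# 512x512 ~1.3 MB, 1024x1024 ~5.3 MB, 2048x2048 ~21.3 MB -- and a single piece
# of gear can need several of these maps (diffuse, normal, specular, ...).

Doubling the resolution quadruples the footprint, multiplied across every map type and every asset in the game.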
1.23 had beautiful textures in that very few of them were ever heavily compressed. Gear, hair and skin all had high-resolution diffuse and normal maps and a wide set of shaders to make the gear look very appealing, at a high rendering cost.
Animations:
Sadly, 1.23's animations, although vastly superior and very beautifully hand-crafted, had the issue of inertia: your character had weight and momentum going into and out of each animation. To achieve this they added an animation lock to abilities which, if you recall, locked you in place until the animation was done. In an MMO that usually revolves around dodging/avoiding mechanics this was a nightmare, and it wouldn't have worked in 2.0.
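If it helps, here's a toy sketch (assumed behaviour for illustration, not Square Enix's actual implementation) of what an animation lock boils down to:

# Purely illustrative: how a 1.23-style animation lock might gate player input.
import time

class Character:
    def __init__(self):
        self.locked_until = 0.0

    def use_ability(self, animation_seconds):
        # Nothing else is accepted until the full animation has played out.
        self.locked_until = time.monotonic() + animation_seconds

    def try_move(self):
        if time.monotonic() < self.locked_until:
            return "rooted in place -- can't dodge the incoming attack"
        return "moved"

c = Character()
c.use_ability(2.5)   # a long, weighty 1.23-style animation
print(c.try_move())  # rooted in place -- can't dodge the incoming attack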
1.23 also had the issue of non-unique animations for things such as casting spells, which is something the 2.0 team wanted to change. In some ways this was an improvement, in others not so much.
Cutscenes:
These take a long time to make. Making them very elaborate, as in 1.23, where a lot of work and unique coding took place, took more time than it was worth, which led to its own problems and costs. 2.0 set a smaller time frame and budget for creating cutscenes, and therefore the quality was reduced; this was unavoidable for the most part.
Was it all worth it, though, to sacrifice some of the great assets from 1.23?
Yes. It saved the game, saved the company and has breathed new life into the game at the expense of a few luxuries. Some of these can be corrected, some cannot, but we're often too quick to say "FFXIV isn't the same as before" when in reality it's a very detailed and beautiful MMO.
EDIT/ADDITION:
To the "Uber Quality" comment;
Oversampling
In 1.0 you could oversample the game, drawing the entire scene at double the resolution for a minor increase in overall drawing quality at a huge processing expense. The problem with this setting was that it was never explained, so people mistakenly put everything on high, including 'Drawing Quality', which led to a HUGE FPS drop. (Example: Standard would have left it at your monitor's resolution, with the next two settings drawing the scene at 1x or 2x the base monitor resolution.)
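Some quick illustrative arithmetic (assuming a 1920x1080 monitor; these aren't the actual 1.0 presets) shows why that setting hurt so much:

# Illustrative: pixels shaded per frame at different oversampling factors.
base_width, base_height = 1920, 1080
for factor in (1, 2):
    pixels = (base_width * factor) * (base_height * factor)
    print(f"{factor}x oversampling: {pixels:,} pixels shaded per frame")
# 1x: 2,073,600 pixels; 2x: 8,294,400 pixels -- four times the shading work,
# squeezed back down onto the same screen.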
As for textures, it's just a matter of doing it. Yoshida mentioned at one point that he and his development team felt the resolution of textures didn't matter on a high-definition screen. (This is ridiculous in my opinion; a low-resolution texture will look awful no matter what screen you're viewing it on.) Although textures can have a large impact on disk space, having an optional high-resolution package would drastically improve the look of the game.
Anti-Aliasing
FFXIV on release (1.0) had multiple options for hardware-based anti-aliasing (AA) to combat aliasing on the 3D models and textures in the scene; luckily, the way lighting and shadows were rendered in that particular engine allowed for hardware-based AA in DirectX 9.
The problem with FFXIV:ARR (2.0) is that hardware-based AA can't be used with deferred rendering (the method used for real-time dynamic lighting/shadows) due to a limitation in the DirectX 9 API. This was fixed in DX10, 10.1 and 11 and is no longer an issue; sadly, ARR currently only uses DX9, meaning it can only use FXAA (fast approximate anti-aliasing).
Why does this matter?
There are many types of anti-aliasing; some require more processing time than others, some less so but at a lesser degree of quality.
There are many hardware options, such as:
Multi-Sampled Anti-Aliasing (MSAA): This method doesn't work with deferred rendering, but it's still commonly used (although less so now); pretty much every GPU ships with this capability nowadays.
Super-Sampled Anti-Aliasing (SSAA): Very common and works wonders for textures, alphas and sharp edges in general; however, it requires re-rendering the scene at a much higher resolution to achieve the effect.
Coverage Sample Anti-Aliasing (CSAA): Essentially a more powerful MSAA, running the same 'hack' at a higher rate to produce a better result at a higher performance cost.
Morphological Anti-Aliasing (MLAA): This was a GPU-based solution from AMD using DirectCompute to try to speed up the AA process, seeing as it is such a resource hog. MLAA runs passes over the entire scene, giving a very considerable increase in image quality (see below). As mentioned on AMD's information page, it could inject itself into almost any DX9 (and newer) title, which made it a nice option for people to turn on externally.
(Image comparison: without MLAA vs. with MLAA)
Almost all hardware-based AA detects the polygonal edges, alphas and shaders in the scene and smooths ONLY the jagged edges, ignoring the rest. This is the best option as it doesn't blur the entire image or produce undesired results on textures, UI and other objects in the scene that don't require AA. (See the trees in 1.0's Coerthas for a great example of hardware AA working on the alpha textures for the leaves whilst retaining detail in the original texture.)
Fast Approximate Anti-Aliasing (FXAA): This is simply a pass over your entire screen, and it is extremely efficient, costing virtually nothing in processing time. Compared to hardware-based AA solutions it doesn't detect geometric or alpha edge data; it simply runs the anti-aliasing pass on every pixel being displayed. The problem ARR has with this is the user interface: luckily, the built-in FXAA pass in ARR doesn't affect the UI, but sadly external programs you can set up will affect the UI.
FXAA generally doesn't achieve as nice an effect as hardware-based AA, but sadly only a DX11 client would be able to support hardware AA alongside deferred rendering; hopefully Square Enix will add this in an update.
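To give a feel for the difference, here's a toy post-process in the spirit of FXAA (heavily simplified, not the real algorithm): it only looks at the finished image and blends any pixel whose brightness contrasts sharply with its neighbours, which is exactly why, applied externally, it can't tell a jagged polygon edge from crisp UI text:

# Toy post-process AA in the spirit of FXAA: operate only on the final image,
# blending a pixel toward its neighbours when local luminance contrast is high.
def luminance(rgb):
    r, g, b = rgb
    return 0.299 * r + 0.587 * g + 0.114 * b

def smooth(image, threshold=0.2):
    h, w = len(image), len(image[0])
    out = [row[:] for row in image]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            neighbours = [image[y-1][x], image[y+1][x], image[y][x-1], image[y][x+1]]
            contrast = max(abs(luminance(image[y][x]) - luminance(n)) for n in neighbours)
            if contrast > threshold:  # likely a jagged edge: blend it
                out[y][x] = tuple(
                    sum(n[i] for n in neighbours) / 4 * 0.5 + image[y][x][i] * 0.5
                    for i in range(3)
                )
    return out

# A hard black/white vertical edge gets softened; flat areas are left alone.
img = [[(0.0,)*3]*3 + [(1.0,)*3]*3 for _ in range(6)]
print(smooth(img)[2])

Hardware AA instead works from the geometry itself, which is why it only touches the edges that actually need it.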