ROTFLMAO.
Throw popcorn kernels at the nVidia GPU (RSX) in the PS3 all you like, but the Cell BE CPU is/was a tremendous piece of work. It took x86 architectures many years to catch up in pure performance terms, so please, let's not make any ill-educated absolute statements about it. Cell BE uses a PowerPC architecture for its main processing element; it's hardly custom. The SPEs are the only thing that's 'non-standard', and they're non-standard only if you're stuck on x86 being the only way to go in computing, labeling everything else non-standard or custom.
Oh, BTW, how old is the x86 architecture? Before you claim Cell BE is 'really old and long out of date' (a priceless statement in and of itself), maybe you should take a quick look at the 40+ year history of x86, which is surely a bit long in the tooth by now....
The SPEs were a bank of very fast processing units that could work individually or in groups, process data streams in series or in parallel, and deliver tremendously high single-precision floating-point throughput. It was an architecture designed around the needs of media processing, video and gaming. I'm honestly sick of the uneducated assertions people make about the Cell BE. The CPU in the Xbox 360, for example, used an almost identical PowerPC core, although it was a tri-core unit with no SPEs.
People seem to be hung up on the extra dev time needed to learn an architecture. Guys, that's what developers do for a living: they learn architectures and learn to exploit them as best they can. Cell BE was a different architecture from x86, but so was the Xbox 360's Xenon. For that matter, every new generation of GPU represents a new architecture that must be learned to take full advantage of it.
All the whining about a custom architecture being so hard to use comes from people's inherent bias against Sony more than from any technical issues. The fact is, when Cell BE hit, it was the first time many developers had to think about how to parallelize their code to make good use of the SPEs. Initially most devs simply used them as extra cores and didn't get creative, until they learned to. If there were never any innovation in microprocessor design, our technology would stagnate. Just as there has been innovation, evolution and new development in software engineering, new languages and so forth, the same is true for processor architectures. I honestly hope I never work with a developer who displays the kind of attitude towards new architectures that your post contains.
P.S. consider that the PS2 had vastly less computational power and a tiny amount of memory compared even to the PS3 and yet it was perfectly capable of running the FFXI client. In a lot of ways, it's not how big your tool is, it's how well you use it. PS2 and PS3 are great illustrations of that truism in the world of technology and gaming.
This is in fact completely wrong. The PS2 architecture uses a MIPS III/IV design and features two completely custom vector processing units. The PS2 also used the GS GPU, which was very powerful considering its age, but that's a different discussion. The PS2 did not use Cell BE, nor did it use anything that developed into Cell BE. The PS3 used the Cell BE, which was a joint design effort by IBM, Toshiba and Sony. The main CPU element was a standard PowerPC core; the SPEs and the (extremely high-performance) on-chip data bus were new to the party. The GPU was the RSX, a reworking of an existing nVidia design whose core elements were the same as products nVidia already shipped. The PS3 did not use anything from the PS2 architecture (except for those early systems that included a PS2 chipset for backwards compatibility).
The PS4 can't run PS3 games because a) it's an x86 system, not PowerPC, and b) emulating the SPEs and their interconnect bus is beyond the capability of the APUs in current consoles. Emulating the SPEs is hard because of the extremely high bandwidth of the on-chip bus that connects them. Few if any CPUs around can match the speed needed, and that's just the raw speed, never mind under emulation. So PS3 games will not run under emulation any time soon.
"Harder to develop for" if you're not familiar with either architecture. If you were a developer working on iPad or Android products and switched up to working on PS4 or PC games, you'd be in the same boat. It's a new architecture to learn, and new GPUs too. "Harder to develop for" is a lazy excuse.
I played on PS3 from beta phase 3 until the PS4 landed with the PS4 version of FFXIV. It wasn't horrible. It fares poorly in a side-by-side comparison with the PS4, but on its own merits it's actually pretty good.
This game was designed to run on PS3, low-end PCs, high-end gaming rigs, Macs and PS4. It was not clearly designed with PS4 in mind, since the PS4 runs a modified version of the PC code base, not the PS3 client.
Honestly, delays summoning retainers only occur when there is a server-side issue. Dodging was a latency issue that affected everyone, and latency was improved at about the same time as people started moving to PS4. A PS3 can't render as many characters in crowded areas as a PS4 can, but then a PS4 and most low-to-mid-range PCs can't match the number of characters a high-end gaming rig can show. It's all relative.
The PS3 client is far from being a mess.