Quote Originally Posted by Argyle_Darkheart
To be clear, I am talking specifically about the growth rate of the scaling.

For an extreme example, let's compare a character with a native 100% critical hit chance (character A) to a normal character (character B).

A CHR stat of 2855 gives 20% critical hit chance and 55% critical hit damage.
At a baseline damage of 100, character A would always crit for a final damage of 155, while character B would have an expected damage of 111 (100 × (1 + 0.20 × 0.55)).

Now, let's increase the CHR stat to 3680, which gives 25% critical hit chance and 60% critical hit damage.
Character A would have a final damage of 160 and character B an expected damage of 115.
In this step, character A gained 5 damage (a 3.226% increase) and character B gained 4 damage (a 3.604% increase).

Now, let's take an absurd jump all the way to 7805 CHR, which gives 50% critical hit chance and 85% critical hit damage.
Character A gains 25 damage (160 → 185) for an increase of 15.625%, and character B gains 27.5 damage (115 → 142.5) for an increase of 23.913%.

Which character is scaling better?
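Before answering, here's a quick Python sketch that reproduces the quoted arithmetic. The CHR-to-crit conversions are the numbers from the quote itself, not a verified game formula, and `expected_damage` is just an illustrative helper:

```python
# Expected-damage sketch for the quoted example.
# CHR -> (crit chance, crit damage bonus) pairs come from the quote,
# not from any in-game formula.

BASE = 100  # baseline damage used in the example

tiers = [
    (2855, 0.20, 0.55),
    (3680, 0.25, 0.60),
    (7805, 0.50, 0.85),
]

def expected_damage(crit_chance: float, crit_bonus: float) -> float:
    """Each hit crits with probability crit_chance for (1 + crit_bonus) x base."""
    return BASE * (1 + crit_chance * crit_bonus)

prev_a = prev_b = None
for chr_stat, chance, bonus in tiers:
    a = expected_damage(1.0, bonus)     # character A: guaranteed crits
    b = expected_damage(chance, bonus)  # character B: crit chance from CHR only
    line = f"CHR {chr_stat}: A = {a:.1f}, B = {b:.1f}"
    if prev_a is not None:
        line += (f" | A +{a - prev_a:.1f} ({(a - prev_a) / prev_a:.3%})"
                 f", B +{b - prev_b:.1f} ({(b - prev_b) / prev_b:.3%})")
    print(line)
    prev_a, prev_b = a, b
```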
Obviously character B is scaling better, but this ignores the reality that no one will feasibly reach 50% crit rate from the crit stat alone. With buffs, it's possible for another job to briefly reach or even exceed 50% crit, but no one has on-demand direct crits like WAR has, so no one benefits from crit the way a WAR does.

Yes, another job has the potential for better relative scaling, but the absolute gains are drastically different: with 50% crit and an 85% crit damage bonus, character B is still only gaining half as much damage from crit as character A (an expected +42.5 versus a guaranteed +85 on the 100 baseline).
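To make that absolute gap concrete, under the same assumptions as the sketch above:

```python
# Absolute crit contribution at the 7805 CHR tier
# (100 baseline damage, 85% crit damage bonus, per the quoted example).
BASE, CRIT_BONUS = 100, 0.85

a_bonus = 1.00 * BASE * CRIT_BONUS  # A: guaranteed crit -> +85.0
b_bonus = 0.50 * BASE * CRIT_BONUS  # B: 50% crit chance -> +42.5 expected

print(f"A gains +{a_bonus}, B gains +{b_bonus} "
      f"({b_bonus / a_bonus:.0%} of A's gain)")  # prints 50%
```

Under this model, B's crit contribution is always (crit chance) × (A's crit contribution), so better relative scaling never closes the absolute gap while B's crit chance stays below 100%.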