It's a fact that the server only checks your XYZ every 300 milliseconds. But that alone isn't enough to make you get hit unless you left the AoE with less than 300 ms to spare since your last XYZ check, and that last check placed you inside it (rare).
But when you add up the whole chain:
300 ms - built-in server check interval
200 ms - client latency
250 ms - lag from visual processing in the brain to the motor reflex response on the keyboard
You end up with 750 ms, nearly a whole second. Most people get out of the plumes and similar AoEs with less than a second to spare, and they get hit every single time.
SE can't do squat about client latency or human reflexes, but they can fix the 300 ms server check interval. The server should check our XYZ every time the client sends it: when your client sends your position, it should go into the hit calculation immediately. Why does it sit for up to 300 ms? Bad design imo, bad architecting imo.
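To make the arithmetic above concrete, here's a toy model of that latency chain. The numbers are the illustrative figures from this post, not measured values, and the function names are made up for the sketch:

```python
def worst_case_dodge_ms(server_poll_ms=300, client_latency_ms=200, reflex_ms=250):
    """Worst-case delay between a telegraph appearing on screen and the
    server actually seeing the player's updated XYZ outside the AoE.
    Defaults are the post's illustrative numbers, not measured values."""
    return reflex_ms + client_latency_ms + server_poll_ms

def gets_hit(telegraph_window_ms, **chain):
    """True if the AoE resolves before the server can register the dodge."""
    return telegraph_window_ms < worst_case_dodge_ms(**chain)

if __name__ == "__main__":
    print(worst_case_dodge_ms())   # 750 ms with the post's numbers
    print(gets_hit(700))           # True: a sub-second window means a guaranteed hit
    print(gets_hit(700, server_poll_ms=0))  # False: drop the poll delay and the dodge lands
```

The last line is the whole argument in one call: reflexes and client latency are fixed costs, so shrinking the server's poll interval is the only lever SE actually controls.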
