
Originally Posted by Sindele
Decided to take a deeper dive. Turns out it's a real pain to get millisecond precision from chat messages, but the result I got was kind of a no-brainer in retrospect: macros execute one line per frame. This held consistently across every test at every fps breakpoint I tried.
As a quick refresher, frametimes at different common fps breakpoints, plus the resulting 'macro queue' window for a macro with a micon line, an error-off line, and 13 cast lines (the arithmetic is sketched out below the list):
144fps = ~6.94ms frametime = ~90ms window after ~14ms
120fps = ~8.33ms frametime = ~108ms window after ~17ms
60fps = ~16.67ms frametime = ~217ms window after ~33ms
30fps = ~33.33ms frametime = ~433ms window after ~67ms
15fps = ~66.67ms frametime = ~867ms window after ~133ms(!)
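If anyone wants to double-check those numbers, here's a throwaway sketch of the arithmetic. It only encodes the assumptions above - one macro line per frame, two setup lines (micon + error off) running before the 13 cast lines - and the names in it are mine, not anything from the game or a real API:

```python
# Sketch of the arithmetic above, assuming one macro line per frame:
# the micon + error-off lines burn 2 frames, then the 13 cast lines
# make up the usable "macro queue" window.
SETUP_LINES = 2   # micon line + error-off line (assumed to run first)
CAST_LINES = 13   # cast lines forming the window

def macro_window(fps: float) -> tuple[float, float]:
    """Return (delay before the window opens, window length), both in ms."""
    frametime_ms = 1000.0 / fps
    return SETUP_LINES * frametime_ms, CAST_LINES * frametime_ms

for fps in (144, 120, 60, 30, 15):
    delay_ms, window_ms = macro_window(fps)
    print(f"{fps}fps: ~{1000 / fps:.2f}ms frametime -> "
          f"~{window_ms:.0f}ms window after ~{delay_ms:.0f}ms")
```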
In my opinion, this isn't a good thing for macros at either end of the curve, and shatters some of the initial claims outright.
To get an acceptable "macro queue" time, you need 60fps or lower - but worse fps also means that multi-purpose macros with things like oGCDs get delayed, potentially for long enough to cause undesirable clipping. With better fps, your "macro queue" time narrows - at 144fps the window is less than 1/5th the length of the standard queue's half-second - which can lead to oGCDs that aren't on waits firing so close to the cast that they'll likely fail on animation locks. Worse, because fps in this game isn't exactly predictable or reliably controllable, it's going to screw with muscle memory, too.
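To put that "less than 1/5th" comparison in concrete numbers - and assuming the standard queue window really is a flat half-second, which is the usual community figure rather than something I measured here:

```python
# Macro "queue" window vs. the standard queue, assuming the standard
# queue window is a flat 500ms (community figure, not measured here).
STANDARD_QUEUE_MS = 500.0

for fps in (144, 120, 60, 30, 15):
    window_ms = 13 * (1000.0 / fps)
    print(f"{fps}fps: ~{window_ms:.0f}ms macro window = "
          f"{window_ms / STANDARD_QUEUE_MS:.0%} of the standard queue")
```

Which lines up with the point above: only at 60fps or below does the macro window get anywhere near the standard queue's length, and at 15fps it overshoots it entirely.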
One factor I'm not able to test for at present is how this interfaces with ping. I have my theories from what I know and what I've heard, but... I prefer data.