Perhaps just the scoring metric / weighting was changed (lowered) to more accurately reflect true in-game performance, e.g. taking into account FATEs and other high-population areas of the game. I've not compared the two benchmarks in terms of min, avg, and max FPS, so I can't comment on whether there are actual engine differences between the two ... yet it wouldn't matter either way (see below).

"what is the difference between the two? How do I know which one is the correct one with the right score?"
They're both "technically correct" in that any benchmark software only tells you how THAT specific benchmark performs (far as its scoring). They're intended to give you a 'jist' of if your machine is capable of running the game playable, or at all, and if your machine's performance is up to par compared to other people who have similar setups (eg, whether or not something is wrong with your configuration)... That said, neither will give you a very accurate idea of how the game itself will run (in all ingame situations), as the game is constantly moving / not a fixed target ... with population density in zones changing, engine changes each patch, and so on.
To be honest, towns, outposts, FATEs, etc. can bring even the highest-end machines to their knees (machines which score ludicrously high on the fixed benchmarks). The only part of the game where benchmark performance may come close to being a reliable metric of true in-game performance is in instances, where there are consistent caps on the number of players and monsters visible on screen at once.
Point being, I wouldn't worry about the benchmark scores so long as the actual game runs up to your standards.
--Also, glad you figured out the cause of the score differences and that it ended up being something a lot simpler than all the suggestions I had made. (Sorry for sending you on a wild goose chase.)