It's a topic that got discussed fairly heavily back around launch time. There are identifiable hot spots on the maps that will cause a significant dip in rendering (turn your line of sight towards Hawthorne Hut when on that map and you should see it happen, even when standing in Little Solace). There also seems to be a noticeable hit for a lot of players when crowds are around--the textures and such may be getting pulled straight from video memory, but it's still a lot of wireframes the GPU has to plaster them onto.
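For what it's worth, that "lots of wireframes" problem is exactly what culling is supposed to address: if the engine decides on the CPU that an object is out of view or too far away, those triangles never get submitted in the first place. A rough sketch of the general idea (my own toy illustration with made-up numbers, not anything from SE's engine):

```cpp
#include <cstdio>
#include <vector>

// Hypothetical scene object: a bounding sphere standing in for a mesh.
struct Object {
    float x, y, z;   // world-space centre
    float radius;    // bounding-sphere radius
    int   triangles; // how many triangles a draw call would submit
};

// Very crude visibility test: cull anything behind the camera plane or
// beyond the draw distance. Real engines add full frustum planes and
// hardware occlusion queries on top of this.
bool isVisible(const Object& o, float camZ, float drawDistance) {
    float dz = o.z - camZ;
    if (dz + o.radius < 0.0f)         return false; // entirely behind camera
    if (dz - o.radius > drawDistance) return false; // beyond draw distance
    return true;
}

int main() {
    std::vector<Object> scene = {
        {0, 0,  10, 2, 12000},  // nearby hut
        {0, 0, 300, 5, 80000},  // distant settlement (the "hot spot")
        {0, 0, -20, 2,  9000},  // behind the camera
    };

    int submitted = 0, culled = 0;
    for (const Object& o : scene) {
        if (isVisible(o, /*camZ=*/0.0f, /*drawDistance=*/200.0f))
            submitted += o.triangles; // would issue the draw call
        else
            culled += o.triangles;    // GPU never sees these wireframes
    }
    std::printf("triangles submitted: %d, culled: %d\n", submitted, culled);
}
```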

Leaning towards SE needing to somehow better optimize the culling and occlusion--maybe even some form of mipmap/LOD bias could help, along the lines of the sketch below. I noticed toggling the LOD streaming option helped my old ATI 4870 512MB a little; not sure how much mileage we can get out of that on the 1GB+ cards, though. Might tinker with the LOD cache on my laptop next time I'm in Del Sol or something, just to see. It's an i7 quad with 8GB and a 3GB GTX 670MX in that beast, so I'm not expecting much to improve at this point--I've already thinned the settings out pretty well and mostly maintain 30+ FPS regardless of what's going on with that system now.
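For anyone curious what an LOD bias actually does: it's basically nudging which mip/detail level gets picked at a given distance, trading sharpness for cheaper texturing. A toy version of the selection math (again my own illustration, not the game's implementation; the names and numbers are made up):

```cpp
#include <algorithm>
#include <cmath>
#include <cstdio>

// Pick a mip/LOD level from camera distance. A positive bias switches to
// blurrier, cheaper levels sooner; a negative bias keeps detail longer at
// the cost of more texture bandwidth.
int selectLod(float distance, float baseDistance, int maxLevel, float bias) {
    float level = std::log2(std::max(distance / baseDistance, 1.0f)) + bias;
    return std::clamp(static_cast<int>(std::round(level)), 0, maxLevel);
}

int main() {
    const float distances[] = {5.0f, 20.0f, 80.0f, 320.0f};
    for (float d : distances) {
        std::printf("distance %6.1f -> LOD %d (no bias), LOD %d (bias +1)\n",
                    d, selectLod(d, 10.0f, 5, 0.0f), selectLod(d, 10.0f, 5, 1.0f));
    }
}
```

A global +1 bias like that is roughly what the driver-level "texture quality: performance" sliders do, which is why it can buy a few frames on older 512MB cards without touching the triangle count at all.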