Another important disadvantage is that, because the lighting stage is separated from the geometry stage, hardware anti-aliasing no longer produces correct results: although the first pass, which renders the basic surface properties (diffuse, normal, etc.), can be anti-aliased, anti-aliasing is actually needed on the final image, after full lighting has been applied. One of the usual techniques to overcome this limitation is edge detection on the final image followed by a blur applied over the detected edges;[6] however, more advanced post-process edge-smoothing techniques have since been developed, such as MLAA[7] (used in Killzone 3 and Dragon Age 2, among others), FXAA[8] (used in Crysis 2, FEAR 3 and Duke Nukem Forever), SRAA,[9] DLAA[10] (used in Star Wars: The Force Unleashed II) and post-MSAA (used in Crysis 2 as its default anti-aliasing solution). Although it is not an edge-smoothing technique, temporal anti-aliasing (used in Halo: Reach) can also help give edges a smoother appearance.[11]
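The edge-detect-and-blur approach mentioned above can be sketched on a plain CPU image buffer. The following C++ sketch is illustrative only: the `Image` type, the gradient threshold and the 3x3 box blur are assumptions chosen for brevity, not the exact filter used by any of the cited renderers, which typically run this as a full-screen shader pass.

```cpp
// Minimal sketch: detect edges in the fully lit image, then blur only those pixels.
#include <cmath>
#include <vector>

struct Image {
    int width, height;
    std::vector<float> pixels;  // grayscale for brevity; one float per pixel
    float  at(int x, int y) const { return pixels[y * width + x]; }
    float& at(int x, int y)       { return pixels[y * width + x]; }
};

// Flag a pixel as an edge when the local gradient exceeds a threshold.
bool IsEdge(const Image& img, int x, int y, float threshold) {
    if (x <= 0 || y <= 0 || x >= img.width - 1 || y >= img.height - 1) return false;
    float gx = img.at(x + 1, y) - img.at(x - 1, y);
    float gy = img.at(x, y + 1) - img.at(x, y - 1);
    return std::sqrt(gx * gx + gy * gy) > threshold;
}

// Blur only the pixels flagged as edges; interior pixels are left untouched,
// which keeps the pass cheaper than blurring the whole frame.
Image SmoothEdges(const Image& lit, float threshold) {
    Image out = lit;
    for (int y = 1; y < lit.height - 1; ++y) {
        for (int x = 1; x < lit.width - 1; ++x) {
            if (!IsEdge(lit, x, y, threshold)) continue;
            float sum = 0.0f;
            for (int dy = -1; dy <= 1; ++dy)
                for (int dx = -1; dx <= 1; ++dx)
                    sum += lit.at(x + dx, y + dy);
            out.at(x, y) = sum / 9.0f;  // 3x3 box blur across the edge
        }
    }
    return out;
}
```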
DirectX 10 introduced features that allow shaders to access individual samples in multisampled render targets (and depth buffers, in version 10.1), making hardware anti-aliasing possible with deferred shading. These features also make it possible to correctly apply HDR luminance mapping to anti-aliased edges, whereas on earlier hardware any benefit of anti-aliasing could be lost, making this form of anti-aliasing desirable in any case.
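The consequence of per-sample access is that lighting can be evaluated for every sample of a multisampled G-buffer texel and the results averaged afterwards, rather than averaging the G-buffer attributes before lighting, which would blend normals and albedo across an edge and give incorrect results. The following CPU-side C++ sketch illustrates that ordering under stated assumptions: the `GBufferSample` layout, the Lambertian light model and the function names are hypothetical and not part of the DirectX API.

```cpp
// Sketch: per-sample lighting of a 4x multisampled G-buffer texel, then resolve.
#include <array>
#include <cmath>

struct Vec3 { float x, y, z; };
static Vec3  operator*(const Vec3& v, float s)        { return {v.x * s, v.y * s, v.z * s}; }
static Vec3  operator+(const Vec3& a, const Vec3& b)  { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static float dot(const Vec3& a, const Vec3& b)        { return a.x * b.x + a.y * b.y + a.z * b.z; }

// One G-buffer sample, i.e. what a multisampled render target stores per sample.
struct GBufferSample {
    Vec3 albedo;
    Vec3 normal;  // assumed normalized
};

constexpr int kSamples = 4;  // 4x MSAA

// Simple Lambertian shading of a single sample (illustrative light model).
Vec3 ShadeSample(const GBufferSample& s, const Vec3& lightDir, const Vec3& lightColor) {
    float ndotl = std::fmax(0.0f, dot(s.normal, lightDir));
    return { s.albedo.x * lightColor.x * ndotl,
             s.albedo.y * lightColor.y * ndotl,
             s.albedo.z * lightColor.z * ndotl };
}

// Correct order: light every sample individually, then average (the resolve).
Vec3 ResolvePixel(const std::array<GBufferSample, kSamples>& samples,
                  const Vec3& lightDir, const Vec3& lightColor) {
    Vec3 sum{0.0f, 0.0f, 0.0f};
    for (const auto& s : samples)
        sum = sum + ShadeSample(s, lightDir, lightColor);
    return sum * (1.0f / kSamples);
}
```

The same ordering is what allows HDR luminance mapping to be applied before the resolve, so that edge samples are averaged in the tone-mapped range rather than being washed out afterwards.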