I don't think the camera stutter has anything at all to do with your video card. Rather, it's a programming logic issue - something purely on the coder's end, and it won't be fixed until they patch it.

To illustrate, think of it this way - whenever you move your mouse or your character, the camera responds by changing its rotation or the point it's 'looking at,' respectively. Each frame, the camera has to update the point it's looking at and its own position (both in x, y, and z Cartesian coordinates), as well as its own angular orientation (yaw, pitch, and roll).

The stutter is noticeable because they used the simplest form of per-frame update there is: "for each frame, new position = old position + this frame's delta/change." So whenever you're walking downhill, the camera instantly displaces itself downward by however many units you 'fell' that frame.
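Here's a minimal sketch of that kind of snap update, just to make it concrete - the names (`snap_update`, `camera`, the 2-unit drop) are placeholders I've made up, not anything from the game's actual code:

```python
# A minimal sketch of the "snap" update described above; all names and
# numbers are illustrative placeholders, not the game's actual code.

def snap_update(camera_pos, frame_delta):
    """new position = old position + this frame's full delta, applied at once."""
    return tuple(c + d for c, d in zip(camera_pos, frame_delta))

# Example: stepping off a small ledge drops the player 2 units in one frame,
# so the camera teleports down 2 units with it -- that jump reads as stutter.
camera = (0.0, 5.0, 0.0)
camera = snap_update(camera, (0.0, -2.0, 0.0))
print(camera)  # (0.0, 3.0, 0.0)
```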

They could, instead, implement what's called linear interpolation, which gives a much smoother, more natural transition between "old position" and "new position." Each frame, instead of instantly jumping to its desired point, the camera only travels a fraction of the way there - usually half, but it's whatever coefficient between 0.0 and 1.0 you choose. So if the camera's at (0, 0, 0) and needs to move to (10, 20, 30), linear interpolation with a coefficient of 0.5 would move it to (5, 10, 15) the next frame, (7.5, 15, 22.5) the frame after that, and it would keep covering half the remaining distance each frame.

The basic math for it is: for each frame, camera's new position = camera's current position + coefficient * (desired position, such as the player's location, minus camera's current position), where the coefficient is somewhere between 0.0 and 1.0.
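And a quick sketch of what that looks like in code - again, the function name and the 0.5 coefficient are just illustrative choices, not the game's implementation:

```python
# A minimal sketch of lerp-style camera smoothing; names and the 0.5
# coefficient are illustrative, not taken from the game's code.

def smooth_follow(camera_pos, target_pos, coefficient=0.5):
    """new position = current + coefficient * (target - current)."""
    return tuple(c + coefficient * (t - c) for c, t in zip(camera_pos, target_pos))

# Example from the text: camera at (0, 0, 0) chasing a target at (10, 20, 30).
camera = (0.0, 0.0, 0.0)
target = (10.0, 20.0, 30.0)
for _ in range(3):
    camera = smooth_follow(camera, target)
    print(camera)
# (5.0, 10.0, 15.0)
# (7.5, 15.0, 22.5)
# (8.75, 17.5, 26.25) -- it keeps closing half the remaining gap each frame
```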

It looks much better in motion. With a name like "linear interpolation," it sounds fancy and all, but it's still fairly easy to implement, as illustrated above.

I hope this sheds some light on why the camera stutter, at least, isn't a problem that originates on your end.