Now any idea why "screen fit" gives me a little bit more of the picture?
Yes, because most DVD players, game consoles, and similar consumer electronics devices (especially older ones, and broadcast TV too) assume that there will be some level of "overscan" (parts of the picture cut off by the bezel), and they compensate by shrinking the image. To prevent the video from being surrounded on all sides by black bars (because the "guess" made by the CE device won't match the real overscan of the TV), most TVs "zoom" the image a bit by default. This can cut off the outer edge of the video slightly if the TV doesn't guess quite right, but the manufacturers figure that's better than showing black bars and wasting screen real estate.

My TV has three different levels of zoom that are supposed to be used with different types of set-top devices. All of these features do degrade the picture quality, because they rescale the image (and different TVs include scalers of different quality).
Computers, on the other hand, are "pixel perfect" displays. They map pixels 1:1 and assume that the monitor will handle any overscan and that there really is a full 1920x1080 pixels available, edge to edge. When you're hooked up to an HTPC (and many newer devices), the "normal, default" zoom setting on the TV is compensating for a problem that isn't there, so the edges of the image get cut off.
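To put rough numbers on it, here's a minimal sketch of how much of the frame a zoom setting eats. The 5% figure is purely an illustrative assumption; actual TVs vary:

```python
# Sketch: pixels left visible after a TV "zoom" compensates for an
# assumed overscan margin. The 5% value is illustrative, not a spec.

def visible_region(width, height, overscan_pct):
    """Return the (width, height) still visible after cropping
    overscan_pct percent off each dimension."""
    crop_w = int(width * overscan_pct / 100)
    crop_h = int(height * overscan_pct / 100)
    return width - crop_w, height - crop_h

w, h = visible_region(1920, 1080, 5)
print(f"Visible area: {w}x{h}")  # 1824x1026 of the original 1920x1080
```

So even a modest zoom discards nearly 100 pixels of width, which is exactly why taskbars and window edges vanish on an HTPC until you switch to a 1:1 mode.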
There are two ways to compensate for this. If your TV has a 1:1 setting (which it appears you've found; every manufacturer calls it something different), that is the best option. If not, the Nvidia and AMD drivers include overscan correction settings that can shrink or expand the output image to match the display. Both Nvidia and AMD seem to make these settings difficult to find for some reason, but they're in there.
But, like you said, you found it. Just use the 1:1 option on the TV, and leave the video card alone.