Hi
I'm pretty sure that the TV is 1080p, so maybe that's the problem.
I know about the picture modes, but I can't see how they would affect the resolution that my video card outputs to my monitor.
It seems like they would only affect the image size on the TV; I don't think the computer can sense when the mode is changed.
It's not the TV that won't show full screen. When I duplicate the displays, the TV is fine, but stretched because of the pixel aspect ratio (they use rectangular pixels 1.333 times as wide as a square pixel to get a wider screen format without increasing the resolution). Maybe the mode switch would correct that by cutting off the top and bottom a bit.
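Just to sanity-check that pixel aspect ratio arithmetic, here's a tiny Python sketch (the 1440 by 1080 frame is just the usual HDV anamorphic case I'm assuming, not necessarily what the TV is doing):

    # Anamorphic video stores a narrower frame and tags it with a pixel
    # aspect ratio (PAR); the display stretches each pixel back out.
    stored_width, stored_height = 1440, 1080  # assumed HDV-style anamorphic frame
    par = 4 / 3                               # 1.333: pixels are 4/3 as wide as tall

    display_width = stored_width * par        # 1440 * 1.333 = 1920
    display_aspect = display_width / stored_height

    print(f"Displayed as {display_width:.0f} x {stored_height}")  # 1920 x 1080
    print(f"Display aspect ratio: {display_aspect:.3f}")          # 1.778, i.e. 16:9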
But when I duplicate the displays, my monitor is forced into a very low resolution mode. That's the part I don't get: when I did this on my old computer, both displays stayed full screen at the full 1920 by 1080 resolution.
I have an NVIDIA GeForce 680 video card. I'm running the TV off an HDMI connection, but my monitor runs off a DVI-D connection (the only way to get the 144 Hz refresh rate), so maybe that has something to do with it; on my old computer both displays ran off HDMI.
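Since duplicating forces Windows to pick one mode that both outputs agree on, it might help to see what each display is actually being driven at. Here's a little sketch that lists the current mode of every attached display (assuming Windows with the pywin32 package installed; nothing NVIDIA-specific):

    import win32api
    import win32con

    # Walk every display device Windows knows about and print the mode
    # it is currently being driven at.
    i = 0
    while True:
        try:
            device = win32api.EnumDisplayDevices(None, i)
        except win32api.error:
            break  # ran out of display devices
        if device.StateFlags & win32con.DISPLAY_DEVICE_ATTACHED_TO_DESKTOP:
            mode = win32api.EnumDisplaySettings(
                device.DeviceName, win32con.ENUM_CURRENT_SETTINGS)
            print(f"{device.DeviceName} ({device.DeviceString}): "
                  f"{mode.PelsWidth} x {mode.PelsHeight} "
                  f"@ {mode.DisplayFrequency} Hz")
        i += 1

If the monitor shows up at some very low resolution only while duplicating, that would suggest the driver is falling back to a lowest common mode between the HDMI and DVI-D outputs.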
I know the card will output interlaced video because I create interlaced files when I'm doing film editing; I wonder if there is a way to tell it to send a progressive scan signal to the TV.
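On that note, Windows records whether the current mode is interlaced in the mode's DisplayFlags, so the same pywin32 approach can check it. A minimal sketch, assuming the TV is \\.\DISPLAY2 (just a guess; swap in whatever the listing above shows for the TV):

    import win32api
    import win32con

    # The DM_INTERLACED bit in DisplayFlags marks an interlaced signal.
    tv_device = r"\\.\DISPLAY2"  # hypothetical device name for the TV
    mode = win32api.EnumDisplaySettings(tv_device, win32con.ENUM_CURRENT_SETTINGS)
    if mode.DisplayFlags & win32con.DM_INTERLACED:
        print(f"{tv_device}: interlaced at {mode.PelsWidth} x {mode.PelsHeight}")
    else:
        print(f"{tv_device}: progressive signal")

If it does come back interlaced, forcing a progressive mode is probably easier from the NVIDIA control panel's resolution settings than from code.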
And why does it work when I extend the displays? Then I get full screen 1920 by 1080 on both.
Weird. Anyway, I'll turn on the TV and experiment with the modes right now and see what happens.
Mike