Windows 8 Multiple Displays Question?

MikeHawthorne

Essential Member
Microsoft Community Contributor
Hi

I was wondering whether anyone could tell me why this happens?

I have a 1080i HD TV set, with a resolution of 1920 by 1080, and a 27" monitor with the same resolution.
I do see that, in spite of the fact that both have the same pixel dimensions, they are not the same aspect ratio.
I'm aware that they have different pixel aspect ratios, and I assume that my TV has a 1.333 pixel aspect ratio.
I know that my monitor has square pixels.
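Just to show what I mean by that (the numbers here are only an example, assuming a 1.333 pixel aspect ratio; I don't actually know what mode the TV is using):

```python
# Illustrative only: how a pixel aspect ratio (PAR) changes the displayed shape.
# The 1.333 PAR and the 1440x1080 frame below are assumptions, not the TV's real mode.

stored_width, stored_height = 1440, 1080   # hypothetical anamorphic frame
par = 4 / 3                                # 1.333: each pixel is drawn 1.333x wider than tall

display_width = stored_width * par         # 1440 * 1.333... = 1920
display_aspect = display_width / stored_height

print(f"Displayed picture: {display_width:.0f} x {stored_height}")   # 1920 x 1080
print(f"Display aspect ratio: {display_aspect:.3f}")                 # ~1.778, i.e. 16:9

# A monitor with square pixels (PAR = 1.0) shows 1920 x 1080 as 16:9 directly,
# so the same pixel count can end up a different shape if the PARs don't match.
```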

When I extend the displays across both, everything looks normal and works fine, but the TV image is stretched out; the resolution stays at 1920 by 1080.

But if I duplicate them, the resolution is reduced to 1024 by 768, and the display on my monitor is a very squarish format while the TV is still full screen.

If I select show only on the TV, then the TV is full screen but the resolution can only be set to 1280 by 728.

Why can't I get the full 1920 by 1080 on the TV any time except when the displays are extended?
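My best guess at why: duplicate mode seems to have to pick a single resolution that both displays report they support, so it can end up falling back to a lowest common denominator. A rough sketch of that idea (the mode lists below are made up, not what my hardware actually reports):

```python
# Conceptual sketch only: why "duplicate" can fall back to a low resolution.
# These mode lists are made up; the real ones come from each display's EDID.

monitor_modes = {(1920, 1080), (1680, 1050), (1280, 1024), (1024, 768)}
tv_modes_reported = {(1280, 720), (1024, 768), (720, 480)}  # e.g. if the TV's 1920x1080
                                                            # mode isn't seen over this input

# Duplicate (clone) mode needs one resolution that both displays accept.
common = monitor_modes & tv_modes_reported
best_common = max(common, key=lambda wh: wh[0] * wh[1])
print(best_common)   # (1024, 768) with these made-up lists

# Extended mode has no such constraint: each display runs its own best mode,
# which would explain why 1920 by 1080 only works when the displays are extended.
```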

Mike
 
HDTVs, unlike monitors, also have "Picture Mode" options (e.g. Standard, Theater, Sports, PC, etc.) which also affect the display size on the screen. You may think PC mode is the one you should use, but that one only applies if you are using a VGA cable and not an HDMI cable.
On my Win 8 HTPC, which is connected to my 47" living room HDTV via an HDMI cable, I have to set the picture mode to "Game Mode" to get the resolution to fit the TV screen perfectly. You may want to explore that feature on your HDTV.
The other possibility I see is that maybe your GPU is not capable of outputting "interlaced" video (1080i), which is basically like half the "progressive" video resolution (1080p). So you should actually be looking at the letters "i" and "p" and not the number "1080". For sure, they are not the same.
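Roughly, the difference in the amount of picture data works out like this (simplified, just to show the scale of it; it ignores blanking and assumes 60 fields/frames per second):

```python
# Simplified comparison of 1080i vs 1080p (illustrative; ignores blanking intervals).

width, height = 1920, 1080
rate = 60                                   # assumed: fields/s for 1080i, frames/s for 1080p

lines_per_field = height // 2               # 1080i sends two interleaved 540-line fields
pixels_per_sec_1080i = width * lines_per_field * rate    # ~62.2 million pixels/s
pixels_per_sec_1080p = width * height * rate              # ~124.4 million pixels/s

print(pixels_per_sec_1080i, pixels_per_sec_1080p)
# 1080i carries roughly half the pixel data per second of 1080p at the same rate,
# so a GPU or a TV input can treat them as completely different modes.
```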
 
Hi

I'm pretty sure that the TV is 1080p, so maybe that's the problem.

I know about the picture modes, but I can't see how they would affect the resolution that my video card sends to my monitor.
It seems like they would only affect the image size on the TV; I don't think the computer can sense when the picture mode is changed.

It's not the TV that won't show full screen; when I duplicate the displays, the TV is fine, but stretched because of the pixel aspect ratio (they use rectangular pixels 1.333 times as wide as a normal pixel to get a wider screen format without increasing the resolution). Maybe the mode switch would correct that by cutting off the top and bottom a bit.

But when I duplicate the displays, my monitor is forced into a very low resolution mode, and that's the part I don't get; when I did this on my old computer, both displays stayed full screen at the full 1920 by 1080 resolution.

I have an Nvidia GeForce 680 video card, and I'm running the TV off an HDMI connection, but my monitor runs off a DVI-D connection (the only way to get the 144 Hz refresh rate), so maybe that has something to do with it. On my old computer both displays ran off HDMI.

I know it will output interlaced video, because I create interlaced files when I'm doing film editing. I wonder if there is a way to tell it to send a progressive scan signal to the TV?

And why does it work when I extend the display, then I get full screen 1920 by 1080 on both displays?

Weird, anyway I'll turn on the TV and experiment with modes right now and see what happens.

Mike
 
Well, what I've found is that when I run my TV off my cable box I get 1920 by 1080p.
When I run it off my computer, I get an image on the TV that is 1023 by 767.

[Screenshot: TV-only screen capture]


This is a screen capture of the TV, but it's not what I see when I look at the TV.
What I see is this image stretched out to fit the TV screen.


This is the same image opened in Photoshop, you can see the image size.

[Screenshot: the same capture opened in Photoshop, showing the image size]



And this is the same image forced to 1920 by 1080; this is what I see on the TV.
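Just to put numbers on the stretching: if a roughly 1024 by 768 (4:3) picture gets scaled to fill a 1920 by 1080 (16:9) screen, everything ends up about a third too wide (again, just illustrative arithmetic based on what Photoshop reported):

```python
# Illustrative arithmetic: scaling a ~4:3 capture to fill a 16:9 panel.

capture_w, capture_h = 1024, 768     # roughly what Photoshop reports (1023 x 767)
panel_w, panel_h = 1920, 1080

scale_x = panel_w / capture_w        # ~1.875
scale_y = panel_h / capture_h        # ~1.406
distortion = scale_x / scale_y       # ~1.333, i.e. everything looks a third too wide

print(f"horizontal scale {scale_x:.3f}, vertical scale {scale_y:.3f}, stretch {distortion:.3f}")
```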



Anyway this has me totally confused and I'm going to stop messing with it.

I just don't know what's going on. LOL

Mike
 
Just came back to say that after doing a little research, I find that the Nvidia GeForce cards don't support 2 monitors when it comes to gaming and things like that; they do support 3 monitors in the Surround setup, but not 2.

From what I've read, it sounds like some people are having issues in Windows 8 that they didn't have in 7; I didn't have any problem running 2 monitors in Windows 7 with my ATI video card.

Anyway I'm interested in getting a second monitor that I can move the tools from my animation software to, so that I can use the main screen for my work space.

I'll have to figure out what I need to do. I don't want to buy another monitor like the one I have just for that; too expensive.
But I suppose I should get a monitor with the same screen resolution.

I don't need a 27" monitor for that, a 22" would do if it's 1920 by 1080.

More research to do.

Mike
 
Hi Mike,
I just saw your post and was thinking the same thing, that your video card may not fully support both. The reason I say that is that I've always had ATI video cards and never had a problem with the resolution on the TV in extended mode or duplicate mode. I guess I am lucky with that. The last monitor I had was a Samsung 24" 1920x1280, and when in extended mode with my Toshiba 52" HDTV it would default my monitor back to 1920x1080. Now I have a Samsung 24" 1920x1080, so nothing changes. I use an HDMI cable from the computer to the TV as well.
Hope you get it worked out for your viewing!

cheers
 
 
Hi chal4oye

You are running the same video card that I am, an Nvidia GeForce GTX 680.
And I'm using DVI-D for my monitor as well; I'm guessing we probably both have Asus monitors too.

I'm familiar with Pixel Aspect Ratios because I do a lot of animation and ran into problems with the different video formats when I started editing them until I settled on resolutions that were all consistent.

I've always had my TV hooked up via HDMI cable to my computer and my satellite box, but the HDMI circuit died on my TV and I had to switch to the RGB connectors.

So until I get a new TV I'm stuck; I guess it's time to look into that.

Mike
 