Windows 7 Multiple Monitors

seekermeister

Honorable Member
Joined May 29, 2009
Ever since I upgraded my video card (EVGA GeForce GTX 650 Ti Boost), I haven't been able to get both of my desktop monitors to work the way I want. Nothing is wrong with either monitor, but my secondary monitor is an old Dell VGA LCD, which I've only gotten to work on this card when it's connected to the upper DVI-I connector. When it's connected that way, though, the BIOS displays only on it, rather than on my main monitor.

Since the Dell is VGA, it requires an adapter to connect to the graphics card, but the one that works in the DVI-I doesn't work in the lower DVI-D connector, because the latter has no holes for the 4 pins around the single blade. So I got one like this:

http://www.ebay.com/itm/300325133278?ssPageName=STRK:MEWNX:IT&_trksid=p3984.m1439.l2649

It plugs in fine and works to a degree, in that the monitor seems to think it is connected, but the computer doesn't; all I get on it is a black screen. Neither Device Manager, the Nvidia Control Panel, nor UltraMon sees it either.
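(For what it's worth, one way to double-check whether Windows enumerates the Dell at all, independent of those three tools, is to ask the EnumDisplayDevices API directly. This is just a rough Python sketch, assuming Python is installed; the API itself is standard Windows, and everything those tools show sits on top of it:)

    import ctypes
    from ctypes import wintypes

    class DISPLAY_DEVICE(ctypes.Structure):
        _fields_ = [("cb", wintypes.DWORD),
                    ("DeviceName", wintypes.WCHAR * 32),
                    ("DeviceString", wintypes.WCHAR * 128),
                    ("StateFlags", wintypes.DWORD),
                    ("DeviceID", wintypes.WCHAR * 128),
                    ("DeviceKey", wintypes.WCHAR * 128)]

    ATTACHED_TO_DESKTOP = 0x1  # DISPLAY_DEVICE_ATTACHED_TO_DESKTOP

    i = 0
    dev = DISPLAY_DEVICE()
    dev.cb = ctypes.sizeof(dev)
    # Walk the outputs Windows knows about; a monitor missing from this
    # list is invisible to every tool that sits on top of this API.
    while ctypes.windll.user32.EnumDisplayDevicesW(None, i, ctypes.byref(dev), 0):
        state = "attached" if dev.StateFlags & ATTACHED_TO_DESKTOP else "not attached"
        print(dev.DeviceName, "-", dev.DeviceString, "-", state)
        i += 1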

So I tried to find a DVI-I 24-1 adapter, but in the picture of every one I've found, it still shows the 4 pins around the single blade that I'm trying to get rid of:

http://www.ebay.com/itm/DVI-I-24-1-...=US_Video_Cables_Adapters&hash=item20c8869ecf

Can someone help clear up my confusion?
 
You did follow the steps for extending the desktop display, didn't you?

[Attachment: multimonitor.jpg]
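(If you want to re-apply the extend step without clicking through the dialog, the same switch can be flipped programmatically on Windows 7. A minimal Python sketch around the SetDisplayConfig API; the flag values are the documented SDK constants:)

    import ctypes

    SDC_TOPOLOGY_EXTEND = 0x00000004
    SDC_APPLY = 0x00000080

    # Empty path/mode arrays plus a topology flag tell Windows to reuse
    # its last known "extend" layout for the currently connected displays.
    rc = ctypes.windll.user32.SetDisplayConfig(0, None, 0, None,
                                               SDC_TOPOLOGY_EXTEND | SDC_APPLY)
    print("SetDisplayConfig returned", rc)  # 0 means success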
 
You can't extend the desktop to a display that doesn't appear in the options. As I said, I believe the answer lies in using the proper adapter to connect the monitor; I just haven't found one.
 
Okay, just asking.
You cannot use the DVI-D port to connect your VGA monitor, because DVI-D sends only a digital signal. VGA is analog, so it will work only in the DVI-I port, which sends both analog and digital signals. Needless to say, even if you were able to pull out those 4 pins, the adapter wouldn't work in the DVI-D port, because of the signal incompatibility with the VGA monitor.
 
I was afraid you would say something like that. Since it will only work on the DVI-I, is there some way to get the BIOS to send its screens to the DVI-D connector instead of the DVI-I?
 
Sorry, I haven't done what you are trying to accomplish, and not all BIOS configurations are the same. But judging by the way the BIOS uses the DVI-I as the primary port, it simply defaults to the connector that sends both analog and digital signals, to make sure something displays no matter what type of monitor the user has.
 
But you don't know how to tell the BIOS that, right? :)
Anyway, I did a quick check, and you can't send the BIOS screens to any particular display. At that point the system is in a limited state (like Safe Mode): the GPU is not fully initialized and is not doing its job of extending the desktop across all your monitors. In the BIOS, all monitors are in mirror mode (every screen shows the same thing). Therefore, you can't do what you want to do.

Black Friday is just around the corner; I hope you can scoop up a good deal on a monitor that will work for you. :)
 
No, I'm not going to buy another monitor. That leaves me only two options: either put my EVGA GTX 460 SE back in the desktop, since both of its connectors are DVI-I and will work as wanted, or get used to cranking my head back and forth during boot.

Outside of this one factor, I like the GTX 650 Ti Boost, but I didn't even think about its connector configuration when making the decision to buy it.
 
Either shell out some more dollars for a monitor and put that video card to good use, or recover some of your money and sell it if you have no plans to get another monitor. The longer you hold on to it, the lower its price will drop.

cheers!
 
I just swapped the connections on the monitors, putting the VGA Dell back on the DVI-I using the same adapter I had been using before, and the primary back on the DVI-D, also as before. Predictably, the BIOS screens displayed only on the Dell, and the primary remained black until reaching the login screen. From that point on, however, things were not so predictable. The Dell remains black, even though the Display settings in the Control Panel are set to extend the desktop to it, and the Dell appears normally in Device Manager, the Nvidia Control Panel, and UltraMon. When I cycle the Dell's power off and back on, its desktop appears normally for only a moment before returning to a black screen.

Obviously, the monitor works and is connected properly, but something is amiss somewhere... what am I forgetting?

EDIT: Just rebooted and found another difference I hadn't noted: the Dell works after the BIOS screens until the Starting Windows screen appears for a moment, and then it goes black before the flying windows ever appear.
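(A screen that lights up and then drops back to black can also mean Windows is driving it at a resolution or refresh rate the old panel can't sync, so it may be worth reading back the mode Windows has actually applied to the Dell. A hedged Python sketch around EnumDisplaySettings; the "\\.\DISPLAY2" device name is only a guess and should be taken from an EnumDisplayDevices listing:)

    import ctypes
    from ctypes import wintypes

    class DEVMODE(ctypes.Structure):
        _fields_ = [("dmDeviceName", wintypes.WCHAR * 32),
                    ("dmSpecVersion", wintypes.WORD),
                    ("dmDriverVersion", wintypes.WORD),
                    ("dmSize", wintypes.WORD),
                    ("dmDriverExtra", wintypes.WORD),
                    ("dmFields", wintypes.DWORD),
                    ("dmPositionX", ctypes.c_long),
                    ("dmPositionY", ctypes.c_long),
                    ("dmDisplayOrientation", wintypes.DWORD),
                    ("dmDisplayFixedOutput", wintypes.DWORD),
                    ("dmColor", ctypes.c_short),
                    ("dmDuplex", ctypes.c_short),
                    ("dmYResolution", ctypes.c_short),
                    ("dmTTOption", ctypes.c_short),
                    ("dmCollate", ctypes.c_short),
                    ("dmFormName", wintypes.WCHAR * 32),
                    ("dmLogPixels", wintypes.WORD),
                    ("dmBitsPerPel", wintypes.DWORD),
                    ("dmPelsWidth", wintypes.DWORD),
                    ("dmPelsHeight", wintypes.DWORD),
                    ("dmDisplayFlags", wintypes.DWORD),
                    ("dmDisplayFrequency", wintypes.DWORD)]

    ENUM_CURRENT_SETTINGS = -1
    mode = DEVMODE()
    mode.dmSize = ctypes.sizeof(mode)
    # "\\.\DISPLAY2" is a placeholder for the Dell; use the DeviceName
    # reported by EnumDisplayDevices for the secondary monitor.
    if ctypes.windll.user32.EnumDisplaySettingsW("\\\\.\\DISPLAY2",
                                                 ENUM_CURRENT_SETTINGS,
                                                 ctypes.byref(mode)):
        print(mode.dmPelsWidth, "x", mode.dmPelsHeight,
              "@", mode.dmDisplayFrequency, "Hz")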
 
Maybe it wasn't the exact same adapter that I had used before, because I just replaced it with another, identical adapter, and now both monitors are displaying properly.
 
Apparently, the problem is not entirely due to the type of video card installed, because when I boot into Kubuntu, everything displays on both screens all the way through... after the BIOS screens, that is.
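(On the Kubuntu side, the same information can be read with the stock xrandr tool, which lists each output on the card and whether it is connected and driven; comparing that against what Windows reports can help isolate where the problem lies. A trivial sketch, kept in Python only for consistency with the snippets above:)

    import subprocess

    # Prints which outputs the Linux driver sees and their current modes,
    # e.g. "DVI-I-1 connected 1280x1024+1920+0 ...".
    print(subprocess.check_output(["xrandr", "--query"]).decode())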
 
I've been reconsidering your suggestion to replace the monitor, but shopping around brings back to mind the hassle of trying to find exactly the right monitor, mainly due to the lack of full specifications on many of them, especially the low-end models. Many don't even bother to list exactly what type of connector(s) they have. My video card also has 1 HDMI and 1 DisplayPort connector available, but is there any way to KNOW that using either of these would solve this problem?
 
Using DVI or VGA in combination with either HDMI or DisplayPort will solve your dual-monitor problem. Getting a monitor with an HDMI connector will easily solve it, as there are plenty to choose from; any HDMI monitor will work. There are not a lot of monitors on the market with DisplayPort (DP), and I think they are more expensive. Most of the time, the DP port is used to run a third monitor through an active DisplayPort-to-HDMI/DVI adapter; that is exactly how I am running my third monitor. The adapter has to be active: a passive adapter will not fire up a third monitor.
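(As for knowing what connection type is actually in use: once a monitor is hooked up, Windows exposes this through the WmiMonitorConnectionParams WMI class. A rough Python sketch that shells out to the stock wmic tool; the numeric codes come from the D3DKMDT_VIDEO_OUTPUT_TECHNOLOGY enumeration:)

    import subprocess

    # Asks WMI which output technology each attached monitor is on.
    out = subprocess.check_output(
        ["wmic", "/namespace:\\\\root\\wmi", "path",
         "WmiMonitorConnectionParams", "get",
         "InstanceName,VideoOutputTechnology"])
    print(out.decode(errors="replace"))
    # 0 = VGA (HD15), 4 = DVI, 5 = HDMI, 10 = DisplayPort (external)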
 
One more twist in the picture (in my mind) that I'm wondering about: currently, I have the primary monitor connected via a DVI-to-HDMI cable. The monitor has D-Sub, DVI, and HDMI inputs. If I were to use a straight HDMI cable to connect the primary, would that be a solution, or do I simply need to get rid of the VGA secondary monitor?
 
If it would work, that would be the cheapest solution, but while looking at cables, I also found an HDMI-to-VGA cable available. Do you think it would make any difference which monitor was connected to the video card's HDMI port?
 