seekermeister

Ever since I upgraded my video card (EVGA GeForce GTX 650 Ti Boost), I haven't been able to get both of my desktop monitors to work the way I want. Nothing is wrong with either monitor, but my secondary monitor is an old Dell VGA LCD, which I've only gotten to work on this card when connected to the upper DVI-I connector. When it's connected that way, though, the BIOS displays only on it rather than on my main monitor.

Since the Dell is VGA, it requires an adapter to connect to the graphics card, but the one that works in the DVI-I port doesn't work in the lower DVI-D connector, because the latter doesn't permit insertion of the 4 pins around the single blade. I got one like this:

Link Removed

It plugs in fine, and to a degree it works, because the monitor seems to think it is connected, but the computer doesn't agree; all I get on it is a black screen. Device Manager, the Nvidia Control Panel, and UltraMon don't see it either.
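
One way to confirm whether Windows itself detects a monitor, as opposed to the monitor merely sensing a live cable, is to enumerate the display outputs the OS reports. Here is a minimal Python sketch (a hypothetical diagnostic, not part of the original setup) using the Win32 EnumDisplayDevices API via ctypes, standard library only, run on the affected machine:

```python
# Hypothetical check: list the display outputs Windows reports, to see
# whether the adapter-connected monitor is detected at all.
# Uses the Win32 EnumDisplayDevices API via ctypes (standard library only).
import ctypes
from ctypes import wintypes

class DISPLAY_DEVICE(ctypes.Structure):
    _fields_ = [("cb", wintypes.DWORD),
                ("DeviceName", wintypes.WCHAR * 32),
                ("DeviceString", wintypes.WCHAR * 128),
                ("StateFlags", wintypes.DWORD),
                ("DeviceID", wintypes.WCHAR * 128),
                ("DeviceKey", wintypes.WCHAR * 128)]

ATTACHED = 0x1   # DISPLAY_DEVICE_ATTACHED_TO_DESKTOP
PRIMARY = 0x4    # DISPLAY_DEVICE_PRIMARY_DEVICE

i = 0
while True:
    dev = DISPLAY_DEVICE(cb=ctypes.sizeof(DISPLAY_DEVICE))
    if not ctypes.windll.user32.EnumDisplayDevicesW(None, i,
                                                    ctypes.byref(dev), 0):
        break
    print(dev.DeviceName, "-", dev.DeviceString,
          "attached" if dev.StateFlags & ATTACHED else "not attached",
          "(primary)" if dev.StateFlags & PRIMARY else "")
    i += 1
```

If the adapter-connected output never shows up as attached here, the card is not driving that port at all, which would match the black screen described above.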

So I tried finding a DVI-D 24+1 adapter, but in the ones I've found, the picture shows the 4 pins around the single blade that I'm trying to get rid of:

Link Removed

Can someone help clear up my confusion?
 


Solution
Using DVI or VGA in combination with either HDMI or DisplayPort will solve your dual monitor problem. Getting a monitor with an HDMI connector will easily solve your problem, as there are plenty to choose from. Any HDMI monitor will work. There are not a lot of monitors on the market with DisplayPort (DP), and I think they are more expensive. Most of the time, the DP port is used to run a 3rd monitor using an active DisplayPort to HDMI/DVI adapter. That is exactly how I am running my third monitor. The adapter has to be active; a non-active adapter will not fire up a 3rd monitor.
Plug your HDMI monitor directly into the HDMI port and then plug the VGA monitor into the DVI-I port. The DVI-I port is the only port where the VGA monitor will work; DVI-I supports both digital and analog signals.
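
If you want to double-check which physical connector each monitor actually ends up on once everything is plugged in, Windows exposes that through WMI. A small Python sketch (an illustration, assuming the stock wmic command-line tool that ships with Windows; the numeric codes come from the D3DKMDT video output technology enumeration):

```python
# Hypothetical sketch: ask Windows which physical connector each active
# monitor is on (VGA, DVI, HDMI, DisplayPort) via the WMI class
# WmiMonitorConnectionParams, queried through the stock wmic tool.
import subprocess

# Subset of the D3DKMDT video output technology codes
TECH = {0: "VGA (HD15)", 4: "DVI", 5: "HDMI", 10: "DisplayPort"}

out = subprocess.check_output(
    ["wmic", "/namespace:\\\\root\\wmi", "path",
     "WmiMonitorConnectionParams", "get",
     "InstanceName,VideoOutputTechnology", "/format:csv"],
    text=True)

for line in out.splitlines():
    parts = line.strip().split(",")
    # Skip the header row and blank lines; keep Node,InstanceName,code rows
    if len(parts) == 3 and parts[2].lstrip("-").isdigit():
        _, instance, code = parts
        print(instance, "->", TECH.get(int(code), "code " + code))
```

With the setup suggested above, you would expect one entry reporting DVI and one reporting HDMI once both screens are up.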

Sent from my Nexus 7 using WindowsForum mobile app
 


Do you think it would make any difference which monitor was connected to the video card's HDMI?

Yes, it does. VGA will not work on HDMI unless you have a signal converter in between. An HDMI to VGA cable only converts the interface, not the signal.


Sent from my Nexus 7 using WindowsForum mobile app
 


Okay, but if VGA can't work connected to HDMI, I wonder why they offer a cable for that purpose. It doesn't really matter, because I will go the way you suggested. Thanks.

EDIT: I didn't see your last post until I posted. I have no idea what a signal converter is, but I guess I really don't need to know, since that would only complicate things.
 


A signal converter converts from digital to analog. There are converters available on the market, and with one of those, the HDMI to VGA cable will probably work. But these converters and cables are normally used in office environments, for things like connecting projectors to a large screen, where the extra cost is justifiable.

Sent from my HTC One X using WindowsForum mobile app
 


Thanks for the insight, but I already ordered an HDMI-to-HDMI cable and am keeping my fingers crossed.
 


Thanks for the insight, but I already ordered an HDMI-to-HDMI cable and am keeping my fingers crossed.

As long as the VGA monitor is working on the DVI-I port, the HDMI will give you a second screen for sure. Maybe the problem with your dual DVI ports is that only one can be active at any given time, and the other has to be connected through an active DVI adapter for it to work. Just a theory.
 


You may have lost sight of the objective. I already have both monitors working...at least after reaching the desktop. I just want to get the BIOS screens and boot menus to appear on the primary monitor, either with or without them appearing on the secondary monitor, which is where they appear now.
 


The BIOS pops up on both my monitors as well...at the boot-up BIOS screen and when I actually go into the BIOS. I don't think there is any way around that.
 


The BIOS pops up on both my monitors as well...at the boot-up BIOS screen and when I actually go into the BIOS. I don't think there is any way around that.

Yeah. In BIOS mode the PC is running in a limited state and only generic drivers are loaded. There is no way to control any hardware connected to the mobo except to enable/disable it.
 


Now, unless you're a programmer and can write code to do just that, and your system is using the new UEFI BIOS, you can't make the BIOS do whatever you want. But it's possible...not that I would want to do all that just for what I call a minor annoyance.
 


Now, unless you're a programmer and can write code to do just that, and your system is using the new UEFI BIOS, you can't make the BIOS do whatever you want. But it's possible...not that I would want to do all that just for what I call a minor annoyance.

Yeah, and I think some motherboards have a more advanced utility for the BIOS/UEFI that runs from inside Windows. Mine is just the regular blue & white screen. :)
 


Just installed the HDMI cable, and it didn't solve the problem. It took some juggling just to get things working as they did before. At first, all of my desktop gadgets and the Start Menu were jammed onto my secondary monitor. Even after fixing that, I can't enter the primary monitor's control settings. Therefore, I guess I'm going back to plan B and will try to find a good replacement monitor for my secondary.
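
When gadgets and the Start Menu pile onto the wrong screen like that, it usually comes down to which monitor Windows currently flags as primary and where each desktop rectangle sits. A quick Python sketch to print both (a hypothetical diagnostic using the Win32 EnumDisplayMonitors API via ctypes, standard library only):

```python
# Hypothetical sketch: print each monitor's desktop rectangle and whether
# Windows flags it as primary -- handy when gadgets and the taskbar land
# on the wrong screen. Win32 EnumDisplayMonitors via ctypes (stdlib only).
import ctypes
from ctypes import wintypes

user32 = ctypes.windll.user32

class MONITORINFO(ctypes.Structure):
    _fields_ = [("cbSize", wintypes.DWORD),
                ("rcMonitor", wintypes.RECT),
                ("rcWork", wintypes.RECT),
                ("dwFlags", wintypes.DWORD)]

MONITORINFOF_PRIMARY = 0x1
MonitorEnumProc = ctypes.WINFUNCTYPE(
    wintypes.BOOL, wintypes.HMONITOR, wintypes.HDC,
    ctypes.POINTER(wintypes.RECT), wintypes.LPARAM)

def show(hmon, hdc, rect, lparam):
    info = MONITORINFO(cbSize=ctypes.sizeof(MONITORINFO))
    user32.GetMonitorInfoW(hmon, ctypes.byref(info))
    r = info.rcMonitor
    tag = " (primary)" if info.dwFlags & MONITORINFOF_PRIMARY else ""
    print(f"({r.left},{r.top})-({r.right},{r.bottom}){tag}")
    return True  # keep enumerating

user32.EnumDisplayMonitors(None, None, MonitorEnumProc(show), 0)
```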
 


Somehow, I still have the feeling that this problem could be overcome by using the proper type of cables and connections, which makes it more difficult to decide to bite the bullet and pay for a new monitor. One thing I noticed when I disconnected the DVI cable from the primary monitor is that it has a different pin pattern than the other connectors I have tried. It has two rectangular arrays of nine pins, along with a lone single-blade-style pin. Could this be a factor in making things work as I want?
 


In my own personal experience with multi-monitor setups, cables, and cable adapters, the adapters usually end up being the problem. So I always opt for buying and using the correct cables...just easier for me.
 


Just installed the HDMI cable, and it didn't solve the problem. It took some juggling just to get things working as they did before. At first, all of my desktop gadgets and the Start Menu were jammed onto my secondary monitor. Even after fixing that, I can't enter the primary monitor's control settings. Therefore, I guess I'm going back to plan B and will try to find a good replacement monitor for my secondary.

I really don't understand why this is such a big problem for you. Which of the 2 monitors do you want to be the primary one, the HDMI or the VGA? Whichever monitor you want to use as the primary, position it accordingly on your desk. Then, in the settings menu, select that monitor and tick "Make this my Main Display".


[Attached image: maindisplay.webp]




And again, not all motherboards have the ability to select a monitor for the BIOS screen. Normally, the BIOS screen appears on both/all monitors.

Edit:

You have a single-link DVI-D adapter. The only difference from dual-link is the resolution: the maximum for single-link DVI-D is 1920x1200, and for dual-link it is 2560x1600. It has nothing to do with setting up primary monitors.
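
Those two limits fall out of the link's pixel clock: a single DVI link is capped at 165 MHz, and dual link doubles that. A rough back-of-the-envelope check in Python (the blanking overheads below are approximate reduced-blanking figures I'm assuming, not exact CVT timings):

```python
# Back-of-the-envelope check of the single- vs dual-link DVI limits.
# A single DVI link is capped at a 165 MHz pixel clock; dual link doubles
# that. Blanking overheads are rough reduced-blanking estimates.
SINGLE_LINK_MHZ = 165
DUAL_LINK_MHZ = 2 * SINGLE_LINK_MHZ

def pixel_clock_mhz(w, h, hz, h_blank=160, v_blank=35):
    """Approximate pixel clock for a mode with reduced blanking."""
    return (w + h_blank) * (h + v_blank) * hz / 1e6

for w, h in [(1920, 1200), (2560, 1600)]:
    clk = pixel_clock_mhz(w, h, 60)
    need = ("single link" if clk <= SINGLE_LINK_MHZ else
            "dual link" if clk <= DUAL_LINK_MHZ else "beyond DVI")
    print(f"{w}x{h}@60Hz needs ~{clk:.0f} MHz -> {need}")
```

1920x1200@60Hz lands around 154 MHz, just under the single-link cap, while 2560x1600@60Hz needs roughly 267 MHz, which only dual link can carry.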

Edit (again):

My 3 monitor setup (in my sig below) is more complicated. My graphics card only has HDMI, DisplayPort and VGA ports (yeah, it's a cheap GPU with no DVI... I don't use it for gaming). But to make full use of all the HD monitors I have, I did not use the VGA on the PCIe graphics card; instead I connected that monitor to the DVI port coming from the onboard video. Fortunately, it works for me to have both graphics adapters running at the same time on my PC. If you get the monitor to work, then it's good. It doesn't matter if you are using a $5 HDMI cable or a VGA cable from a dollar store. It's only a matter of setting up which one is primary, secondary, tertiary and so on. Mine is done with the help of UltraMon.

DVI is the same as HDMI in terms of resolution, just without audio. The variation in ports can easily be solved with adapters.


 


The problem has nothing to do with how the OS determines which monitor is primary, only with whether the BIOS screens appear on my primary monitor, which they don't, regardless of any settings or connection configurations that I have tried. Perhaps the BIOS screens will only be sent over a DVI-I connection. If that is the case, then messing with the connections is a waste of time. It is just that I find it hard to believe that is the only way to connect for the BIOS screens to appear. What if there were only a single monitor connected via HDMI? Would the BIOS screens not be visible at all?
 


The only difference between the DVI and HDMI cables is that the HDMI cable carries sound; the rest of the signal is the same across both. So the BIOS being sent differently over one cable or the other is not the issue, and neither is the OS; it's your setup. I'll have to re-read this entire thread again.

It might just be that the old monitor needs to be replaced.
 


The problem has nothing to do with how the OS determines which monitor is primary, only with whether the BIOS screens appear on my primary monitor, which they don't, regardless of any settings or connection configurations that I have tried. Perhaps the BIOS screens will only be sent over a DVI-I connection. If that is the case, then messing with the connections is a waste of time. It is just that I find it hard to believe that is the only way to connect for the BIOS screens to appear. What if there were only a single monitor connected via HDMI? Would the BIOS screens not be visible at all?

My guess is that your motherboard or GPU is hard-coded in terms of display priority. VGA, being the oldest tech, is first, then DVI, and HDMI is last. So if you have a VGA-DVI combination, the BIOS screen will appear on VGA. If you have DVI and HDMI, the BIOS screen will pop up on DVI. But if you have only one monitor connected, it will still work whichever port it is connected to. It's hard to tell on my setup because I am using both the onboard and the PCIe graphics card. So, getting a new monitor with DVI or HDMI will probably solve your problem. Connect the primary monitor to DVI and the secondary to HDMI.


 

