Windows 7 HDMI to TV problems... PLEASE HELP!!!

Bazerk83

New Member
OK, so after trying EVERYTHING I can think of to get this resolved... I'm slowly losing my mind.

Basic history: for a couple of years now I've been using my Sony Bravia HD TV as the display for my PC. Since then I've hardly ever had a problem with it, save for one. It's something I've experienced only a couple of times in those years, and I have it again now with ALL the same symptoms, except this time I haven't been able to solve it!

I came home one day (with my PC left running) and there was nothing on the display, which I thought was weird because I'd done nothing before I left except leave it downloading.

I've now, after spending hours tinkering, been able to get my display back by hooking the PC up to my monitor again, but I WANT it back on my HD Sony Bravia.

Well, I've updated the graphics driver (which really shouldn't have been necessary, as the driver was fairly up to date and worked fine before anyway).

I've tried going into display settings, but there's nothing to directly TELL Windows to look for a different display device.

I've tried different HDMI ports and got nothing, and different cables, and the cables work fine.

The interesting thing is this: one of the symptoms I had before, when I last had this problem, was that when I plugged a lead from the PC into the default Dell monitor AND also plugged in an HDMI lead, HDMI to HDMI, from PC to TV, what I got was a blank desktop on the Dell monitor. Everything suddenly disappears (no icons, no taskbar, NOTHING, just the background) and nothing on my TV.
This has happened before and I got it working; I'm not sure how, but I think a reboot with just the HDMI lead in did it. That doesn't work now.
Then when I unplug the HDMI lead from the graphics card and leave the monitor lead to the Dell monitor IN, the desktop comes back.

What I've done before is shut down or restart with just the HDMI lead to the TV in, but this isn't working now. I've changed NOTHING settings-wise on my TV since it was working, and the same with my PC, other than uninstalling and reinstalling some C++ and .NET Framework 4 packages, which have no relation to this problem.

Can somebody PLEASE tell me if there is something I'm missing? I've had this problem before, and the display has always come back. I really don't want to revert to this monitor.

PLEASE HELP!!!!
 
There is a chance your system was given a Graphics Driver update while you were out, and some of your settings may have been reset.

Could the Sony be looking at the wrong input?

Is the Dell Monitor using DVI?

When you plug the HDMI into the Sony, do your sound playback devices show the TV?

Do you have, or have you ever had, it set up for multiple monitors? Does the desktop have two GPUs on board?
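
If you want to rule a silent driver update in or out, a rough sketch like this (assuming you have Python on the box; wmic ships with Windows 7) will dump the GPU driver version and date so you can compare against the day the display went blank:

```python
# Rough sketch: query the video controller's driver version/date via WMIC
# (a stock Windows 7 tool). Compare DriverDate with when the display died.
import subprocess

info = subprocess.check_output(
    ["wmic", "path", "win32_VideoController",
     "get", "Name,DriverVersion,DriverDate"],
    universal_newlines=True,
)
print(info)
```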
 

I have a feeling it IS looking for the wrong output, yes... perhaps... I'm not entirely sure.

The Dell monitor is hooked up via VGA.

***UPDATE***

I've now plugged the VGA lead directly from the PC to the SONY TV. The display settings now list my device as "SONY TV", although in Device Manager it says "Generic PnP monitor".
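
(As I understand it, "Generic PnP monitor" just means Windows is using its generic monitor driver; the "SONY TV" name is read from the EDID block the TV sends down the cable. Here's a rough Python sketch, standard library only, that pulls the cached EDID names out of the registry, so you can check whether Windows ever actually received the Sony's EDID over a given connection. Treat it as a diagnostic sketch, not gospel:)

```python
# Sketch: list monitor names from the EDID blocks Windows caches under
# HKLM\SYSTEM\CurrentControlSet\Enum\DISPLAY. If the Sony's EDID never
# arrived over HDMI, no name will show up here for that connection.
import winreg

def edid_name(edid):
    # EDID descriptor blocks sit at offsets 54/72/90/108; a display-name
    # descriptor starts with 00 00 00 FC, text terminated by 0x0A.
    for off in (54, 72, 90, 108):
        block = edid[off:off + 18]
        if block[0:4] == b"\x00\x00\x00\xfc":
            return block[5:18].split(b"\x0a")[0].decode("ascii", "ignore").strip()
    return None

base = r"SYSTEM\CurrentControlSet\Enum\DISPLAY"
with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, base) as display:
    for i in range(winreg.QueryInfoKey(display)[0]):
        pnp = winreg.EnumKey(display, i)
        with winreg.OpenKey(display, pnp) as dev:
            for j in range(winreg.QueryInfoKey(dev)[0]):
                inst = winreg.EnumKey(dev, j)
                try:
                    sub = winreg.OpenKey(dev, inst + r"\Device Parameters")
                    edid, _ = winreg.QueryValueEx(sub, "EDID")
                    print(pnp, "->", edid_name(edid))
                except OSError:
                    pass  # no cached EDID for this instance
```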

HOWEVER... NOW I have the HDMI lead plugged into the PC and TV, along with the VGA lead from PC to TV, and I still have a proper display via the VGA lead (taskbar, icons etc. all showing). The channel I have selected on the TV is listed as "PC".

But when I go to AV4 (for the HDMI)... still nothing... am I getting closer?

My sound playback devices do not show the TV, but I've never had audio working through the TV when I link HDMI directly from the graphics card to the TV. I've always been happy to use my sound system plugged into the PC (a 5.1 Logitech 5-piece set + subwoofer).
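
(For what it's worth, a quick way to see which sound devices Windows registers; an HDMI audio device from the card would appear here if one exists, though as far as I know the GTX 260 has no built-in audio and needs an SPDIF pass-through for HDMI sound anyway. A rough sketch using the stock wmic tool:)

```python
# Sketch: list the sound devices Windows knows about via WMIC.
# An HDMI audio device would show up here if Windows registered one.
import subprocess

print(subprocess.check_output(
    ["wmic", "sounddev", "get", "Name,Status"],
    universal_newlines=True,
))
```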
 
What model of Graphics card do you have?

If you think the HDMI cable is good, perhaps the HDMI output is going out. Or, for some reason, the resolution is set incorrectly.
 
argggh!!!!!!!!!!!

OK, so NOW, after rebooting with both leads plugged in at both ends, the system boots to a blank screen. I take the HDMI lead out and it's fine.

When I have the VGA lead in alone, it's fine with a full display, but when I plug in the HDMI lead at both ends it reverts back to the earlier problem where the icons, taskbar and everything else but the background disappear. My TV also automatically changes its resolution from 1280x768 / 60Hz to 1024x768 / 60Hz.

It seems like it's some kind of conflict, and now, with this reaction to plugging in the HDMI lead, I'm convinced that it is NOT the HDMI port on the graphics card that is damaged. Besides which, I've seen this before and managed to get it working... or more like Windows stopped being a stubborn MF and decided to play nice...

This is really frustrating.
 
The graphics card is a (Gainward) Nvidia GTX 260.

I'm sure the HDMI port on it is fine, because I've seen these symptoms before and managed to resolve them somehow, as I said above.
 
Sounds like you have the HDMI as the primary output, and when it is connected, it is the primary display.

Have you checked your multi-monitor settings to see if you are showing a primary and secondary display?

My GTX 260 does not have an HDMI output, just 2 DVIs and an S-video.
 
But shouldn't it BE set that way?

I've gone into Control Panel\All Control Panel Items\Display\Screen Resolution (you know, via right click on desktop and click screen resolution) and I now only have one display listed in the drop down menu titled "SONY TV". For the short period that I had a fully working display with both leads plugged in, it did say I had 2 "SONY TV" displays listed in the drop down menu. It showed 2 tv icons underneath to select from and interestingly one screen size was smalled and one was bigger and looked like a widescreen. I tried seleting both and then changing my tv channel from "PC" to "AV4" and still got nothing.
 
OK THIS IS GETTING ****ING RIDICULOUS NOW!!!!!

So this is the latest turn of events... I dug out another VGA lead, a longer, expensive, black, gold-plated VGA lead. As one end is female and the other is male, I attached it to the previous VGA cable, which I then attached back to the PC and TV, basically creating an extension for my VGA cable.

To my surprise, I suddenly had a full display (icons, taskbar etc.) with BOTH the HDMI lead and the VGA lead attached to the PC and the SONY Bravia.

I then went back into the display settings and it showed I had two displays!

So I'm thinking, oooh, we're making progress. I double-clicked on the other display to use that one, and the screen went on then off, and I had a MASSIVE list of resolutions to choose from that went all the way to 1900x200 or something crazy like that, which it hadn't given me before; before, I was reduced to either 1208x768 or 1280x768. So I'm thinking, ooh, it must've picked up the HDMI connection, else why would the resolution options increase so much?

So I switched my TV over to AV4 and still got nothing from the HDMI channel. I went back and clicked Detect to find out which display it was using for the VGA connection and BAM, back to blank nothing but wallpaper. No icons, no taskbar, NOTHING. And now, regardless of which VGA lead I use, whether extended or the original, it won't let me have the VGA lead connected AND the HDMI lead connected together without being a ****ing whore of a PC and taking my icons, taskbar etc. away.

I feel at this point like taking a sledgehammer to my ****ing computer. I'm so hacked off.
 
And now...

The latest thing I just tried: whilst keeping the HDMI lead connected TV-to-PC, I connected a DVI lead from monitor to PC.

Sure enough, again, I saw two displays listed and showing in my display settings. I then extended the displays, so I now have my desktop showing on the monitor and nothing on AV4 (the HDMI channel).

i hit "identify" and it showed my monitor display as being "2" and still a big old blank black screen on AV4... which logically should be display "1"

When I switched displays via display settings, all I got was a blank screen on AV4 (HDMI) and the same old bloody blank screen on the monitor, except the wallpaper (no icons, taskbar etc. again).

The one thing I am DEFINITELY not going to do again is hit the ****ing DETECT button, because every time I do that with two leads hooked up, whatever the combination (DVI+HDMI, VGA+HDMI, VGA+DVI), I always get the same result, which is that I lose the signal on BOTH connections.
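
(For reference, there is a way to ask the driver which outputs it thinks are live without ever touching Detect: the Win32 EnumDisplayDevices call. A rough Python ctypes sketch that lists every display device and whether it is attached to the desktop:)

```python
# Sketch: list display devices and state flags via EnumDisplayDevicesW,
# without forcing a re-detect.
# DISPLAY_DEVICE_ATTACHED_TO_DESKTOP = 0x1, DISPLAY_DEVICE_PRIMARY_DEVICE = 0x4.
import ctypes
from ctypes import wintypes

class DISPLAY_DEVICE(ctypes.Structure):
    _fields_ = [
        ("cb", wintypes.DWORD),
        ("DeviceName", wintypes.WCHAR * 32),
        ("DeviceString", wintypes.WCHAR * 128),
        ("StateFlags", wintypes.DWORD),
        ("DeviceID", wintypes.WCHAR * 128),
        ("DeviceKey", wintypes.WCHAR * 128),
    ]

user32 = ctypes.windll.user32
dev = DISPLAY_DEVICE()
dev.cb = ctypes.sizeof(dev)
i = 0
while user32.EnumDisplayDevicesW(None, i, ctypes.byref(dev), 0):
    attached = bool(dev.StateFlags & 0x1)
    primary = bool(dev.StateFlags & 0x4)
    print(dev.DeviceName, "|", dev.DeviceString, "|",
          "attached" if attached else "detached",
          "(primary)" if primary else "")
    i += 1
```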

I know it's not the HDMI port on the TV, as I've tested it. Unfortunately the other HDMI ports do NOT work, so I can't try those. Those went down earlier this year after being completely fine for the previous two (yeah... I'm cursed, I know).

One other thing I should probably mention: on originally discovering this problem when I got home last night, I naturally checked my HDMI connection at both ends first, with no success. After trying a few other things, I then thought it might be an issue with something else, and basically stripped out the hardware in the PC (while powered off) and left it with just its motherboard, CPU and fans attached.

I then reset the BIOS by taking out the slim silver disc (the CMOS battery). I waited a couple of minutes, then reinstalled it and connected everything.

When I tried this before, it seemed to work after a few attempts. Well, I've tried those few attempts again, and nothing.

I feel completely and utterly at a loss at this point.
 
OK!!!

NOW I know the HDMI connection is fine, because I've tried something new. When I posted that last message, my display setup was as follows:

DVI - PC to monitor (fully working)
VGA - PC to SONY TV via the "PC" channel (fully working)

And NOW I've just connected a different HDMI lead, and I'm getting the same thing that came up with the first HDMI lead. BOTH my TV and monitor displays are working (in tandem) via DVI and VGA as stated above. I still have nothing from the HDMI channel on my TV (AV4).

BUT... in display settings I now have this showing -> http://i303.photobucket.com/albums/nn151/Bazerk83/Tech Stuff/displaysettings-1.png

the "1" is the SONY TV. When I cllick on it's icon, pretty much most things become ghosted out. No advanced settings available, "make this my main display" becomes unavailable, and under "multiple displays" my choices are "duplicate desktop on 1 and 2" and "dupilicate desktop on 1 and 3". I tried this once for 1 and 2 and it didn't work, I believe what happened was blank screen on TV and full display on monitor. If I click "detect" it will just screw up the current set up of my dvi and vga displays without bringing back the HD display.

WHAT SHOULD I DO!?!?!?!?!

I'm really sorry to keep going on and on and double- and triple-posting.
 
Well... after spending my entire day trying to fix this ****ing stupid problem... I give up. I'm thoroughly pissed off.

I'll try again tomorrow, perhaps; that is, if I haven't utterly smashed my PC to pieces before then.
 
In your previous post you stated that you work through the display options via the Control Panel (I presume through Windows). Out of curiosity, have you tried working through the video options via Nvidia's control panel?
 
Yes, I've gone into the Nvidia control panel and there's barely anything there in the way of options that helps me. When clicking on the 'my display is not shown' link, which is found in the 'set up multiple displays' section, I did find the 'force TV' option, but it was ghosted out. Above it was the 'rigorous display detection' mode, which I clicked, and it promptly found nothing. After doing this it asked, 'Would you like the television connection enabled?', which I of course clicked yes to.

THEN the little checkbox next to the ghosted-out 'force television detection on startup' became checked, but the whole 'force television detection' option (including the checked box) stayed ghosted out, as did the 'restart now' that comes with it. So, with it all ghosted out but a little checkbox supposedly forcing the connection, I clicked OK to finalise it. I then went back INTO the 'my display is not shown' link and hey presto... the friggin checkbox is unchecked again.

I tried this whole thing again, except instead of double-checking it, I just rebooted manually.

THE OTHER PROBLEM is: I can't have the HDMI lead connected on its own (for obvious reasons), and I can't have it plugged in AND the VGA/DVI lead plugged into a separate display (my Dell monitor), because with either the VGA or DVI plugged into my monitor, as soon as I plug the HDMI lead into my graphics card, I lose my whole display on the monitor, with just a background to stare at and no ability to do anything in Windows other than a blind shutdown/restart via keyboard.

I've tried all kinds of combinations of connections, and I even tried booting with NO leads in: I waited for the Windows chimes to indicate it had finished loading, then shut down, plugged the HDMI cable into both PC and TV, and powered up.

All the while keeping my TV on the correct input: AV4, which is really the first AV input, as my TV's inputs start at AV4.

NOTHING.

I am completely and utterly out of ideas, and fear that I will never see my desktop on my Bravia via HDMI again.
 
I guess I'm going it alone with this one huh....

275 views. 2 helpers. I guess people just like to read and run.
 
This is something you are going to have to fix yourself; no one else can change your settings and connections. The fact that your system seems to show monitors 2 and 3 means something is not set correctly.

I would start by using just the Dell monitor and a DVI connection. Get it set to a single monitor and a resolution that works there. Then use the same DVI port and a DVI-to-HDMI converter to plug into the TV. Because you do not have a full HDMI connection, some resolutions may not be available.

If your TV is 1080p, the HDMI might not do that, but I do not know for sure. I know my system works fine on a one-year-old Sony Bravia and an Intel onboard GPU with an HDMI output.

If you are using the VGA input on the TV, are you looking at a PC input, or an HDMI 4 input?

If you are using the VGA TV input, are you converting DVI to VGA to make that connection?

The GTX 480s have gotten much cheaper lately... :) Or another card with an HDMI output available.

Edit: And make sure the settings in the Windows and Nvidia control panels agree, or at least do not conflict.
 
Well, currently my setup is the DVI port to the monitor at a resolution of 1280x1024, which is as high as the monitor will go.

BTW, my TV is 1080i / 720p.

OK, so... I took the DVI cable out of the monitor, attached the DVI-to-HDMI adapter, and plugged it into AV4 (the HDMI slot on my TV),
and it worked. But obviously this is still running a DVI display, not an HDMI display.

It now gives me the option in the Nvidia control panel to change from the PC-native 1360x768 to tons of different 720p or 1080i resolutions, all of which look worse than the native resolution when I pick through them, which I assume is obviously going to be the case.
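
(If you want to see exactly which modes the card exposes for a connection, rather than scrolling the Nvidia panel, here's a rough Python ctypes sketch using the Win32 EnumDisplaySettingsW call. It assumes the TV is \\.\DISPLAY1; swap in the device name reported by the EnumDisplayDevices listing earlier in the thread:)

```python
# Sketch: enumerate every display mode the driver exposes for one device.
import ctypes
from ctypes import wintypes

class DEVMODE(ctypes.Structure):
    # Standard DEVMODEW layout (display-device variant of the union).
    _fields_ = [
        ("dmDeviceName", wintypes.WCHAR * 32),
        ("dmSpecVersion", wintypes.WORD),
        ("dmDriverVersion", wintypes.WORD),
        ("dmSize", wintypes.WORD),
        ("dmDriverExtra", wintypes.WORD),
        ("dmFields", wintypes.DWORD),
        ("dmPositionX", wintypes.LONG),
        ("dmPositionY", wintypes.LONG),
        ("dmDisplayOrientation", wintypes.DWORD),
        ("dmDisplayFixedOutput", wintypes.DWORD),
        ("dmColor", ctypes.c_short),
        ("dmDuplex", ctypes.c_short),
        ("dmYResolution", ctypes.c_short),
        ("dmTTOption", ctypes.c_short),
        ("dmCollate", ctypes.c_short),
        ("dmFormName", wintypes.WCHAR * 32),
        ("dmLogPixels", wintypes.WORD),
        ("dmBitsPerPel", wintypes.DWORD),
        ("dmPelsWidth", wintypes.DWORD),
        ("dmPelsHeight", wintypes.DWORD),
        ("dmDisplayFlags", wintypes.DWORD),
        ("dmDisplayFrequency", wintypes.DWORD),
        ("dmICMMethod", wintypes.DWORD),
        ("dmICMIntent", wintypes.DWORD),
        ("dmMediaType", wintypes.DWORD),
        ("dmDitherType", wintypes.DWORD),
        ("dmReserved1", wintypes.DWORD),
        ("dmReserved2", wintypes.DWORD),
        ("dmPanningWidth", wintypes.DWORD),
        ("dmPanningHeight", wintypes.DWORD),
    ]

user32 = ctypes.windll.user32
mode = DEVMODE()
mode.dmSize = ctypes.sizeof(DEVMODE)
i = 0
# "\\.\DISPLAY1" is an assumption; use the name EnumDisplayDevices printed.
while user32.EnumDisplaySettingsW("\\\\.\\DISPLAY1", i, ctypes.byref(mode)):
    print("%dx%d @ %d Hz, %d bpp" % (mode.dmPelsWidth, mode.dmPelsHeight,
                                     mode.dmDisplayFrequency,
                                     mode.dmBitsPerPel))
    i += 1
```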

Should I change to a 1080i resolution and then try the HDMI cable again?

THANK YOU for getting me this far and sticking with me on this. I DO really appreciate it :)

Oh, and to answer your other questions:

When using VGA on the TV = the PC input, with a straight VGA connection, not a VGA-to-DVI adapter.
 
The DVI and HDMI are, theoretically speaking, the same output. HDMI just adds a couple of control channels, which allow for content protection and the playing of full-resolution DVDs, plus a sound channel. You may have to adjust the screen size when using an HDMI input, but only you know. On my system, the DVI provides a sharper computer output, but videos look pretty good either way.

If you need to run different resolutions for a dual-monitor setup, it may cause problems, and might depend on the capability of the video source. If you were using a DVI-to-VGA adapter, the TV may have a PC input which shows separately from an HDMI input, since a VGA input is not digital.

Your card has a DVI-I connector, which allows for both digital and analog signals. Is your TV analog? (I think I remember what that is :)) If it is, you would probably want to use a DVI-I-to-VGA converter and then a VGA cable to the TV. But again, only you know; I do not see a VGA output on the card.
 
There IS a VGA port on the card, but why would I want to use it instead of DVI? Nah, I don't want to use two screens; I just want to get my HDMI cable working directly from the slot in the graphics card to the slot in the TV.

I don't know if my TV is analog or digital; in the menu it has options for both.

So you're saying that there's no difference in video or gaming quality between using DVI-to-HDMI and straight HDMI? Because if that's the case, I could just stick with the DVI-to-HDMI setup I currently have (the one with the DVI-to-HDMI adapter); I'd just need a longer DVI cable.

Interesting update... I plugged the HDMI lead from the PC into AV5 (even though that input hasn't worked for weeks) and it showed up in the display list in both display settings and the Nvidia control panel. Still got a blank screen on the AV5 channel, though.

But when I took that HDMI cable out and put it into AV4 (after removing the DVI-to-HDMI cable), I still got my good old blank screen, but as usual I still heard my speakers make that detection sound that occurs when you plug in something that's plug-and-play.

Am I destined never to get my TV hooked back up to a straight HDMI lead?
 
What model TV do you have?

What type of DVI cable are you using (DVI-I or DVI-D)?

You should be able to plug a DVI-to-HDMI converter cable into any of your HDMI ports and see the signal, unless the TV has some sort of restriction. When I get the owner's manual for the TV, I will look through it to see if I notice anything about the allowed connections.

Edit: There is also DVI-A for an analog-only signal, which would be for a DVI-to-VGA cable.
 