I want to hook my new computer up to my big screen TV

thefed

Joined Oct 29, 2005 · Messages: 2,203
Tell me if this sounds like a reasonable plan...


Buy an inexpensive video card with some sort of TV-friendly video output (S-video, RCA, DVI), hook the computer up to the TV, and mount the computer on a shelf next to it.

Buy a wireless mouse and keyboard... and hook 'em up.


Now I can use my 65" HDTV as a monitor? Am I missing something?
 
Make sure your TV can handle the resolution. I blew up a big screen by accidentally overdriving it from the s-video output on my computer.
 
wab said:
Make sure your TV can handle the resolution. I blew up a big screen by accidentally overdriving it from the s-video output on my computer.

okayyyy... not sounding like a great idea.


How do I go about checking these specs?

I also found a connection on my TV labeled MonitorLink... and it's the same ol' plug as most monitors...

What's the chance of this connection giving decent quality on a 65-inch TV?
 
If your TV is "HD", I think that means it should support either 1280x720 or 1920x1080, both of which should look "pretty good," especially the latter.

The problem is that your computer may be set to a higher resolution unless you've had a chance to explicitly set it up for your TV resolution (or lower). Both s-video and the RGB connector on your set are analog, so you won't get as nice a signal as you would from a digital monitor connection (e.g., DVI).

DVI > RGB > s-video.
 
thefed said:
Now I can use my 65" HDTV as a monitor? Am I missing something?

RTFM
 
If your 'monitor link' is DVI then you need a video card that supports DVI.

If your 'monitor link' is a standard VGA (D shaped computer video connector), use that. Your computer should be able to figure out the appropriate resolution and scan rate.

If your TV only has S-video in, then it probably doesn't make sense for computer use (lousy picture) but will work OK for watching videos.

I have a computer connected to a plasma using a DVI->HDMI cable. The computer runs Ubuntu Linux, which detected the TV properly and set the resolution to 1280x720. It works great for video, but since the picture is scaled (native plasma resolution is 1366x768) it looks a little fuzzy for computer graphics. The video is also overscanned, so I can't see the edges of the computer screen. None of this matters to me since I rarely use the TV as a computer monitor.
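On a Linux box like that you can check what modes the TV advertised over EDID and force a TV-safe one from the command line. A rough sketch (the output name `HDMI-1` and the mode are just examples; run plain `xrandr` first to see what your own card calls the connection):

```shell
# List connected outputs and the modes the TV reported via EDID
xrandr

# Force a 720p mode on the TV output (replace HDMI-1 with your output's name)
xrandr --output HDMI-1 --mode 1280x720
```

For the overscan problem, it's often easier to fix on the TV side: many sets have a picture-size option (sometimes called "Just Scan" or "Screen Fit") that disables overscan entirely.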
 
I have a Samsung 32" LCD Widescreen HD TV. I hook my laptop to it with an HDMI cable, which carries both video and sound in one cable. Works fantastic for me. I watch all my movies and TV shows that I download like this. Vid card in my laptop is a GeForce Go 7600. My DVD player is just a paperweight now. :LOL:
 
Wow, that's cool. I've never seen a laptop with HDMI. What model laptop is that?
My audio runs separately to the receiver using an S/PDIF (TosLink) optical cable.
 
JB said:
Wow, that's cool. I've never seen a laptop with HDMI. What model laptop is that?
My audio runs separately to the receiver using an S/PDIF (TosLink) optical cable.

Mine is an HP dv9000t. It rocks! :D
 
Well, I have 2 MonitorLink hookups, and one is the regular old blue 15-pin connector like you find on a desktop PC for the monitor...

This, I assume, is VGA... and all I need now is a cord. I'm gonna go hunting tomorrow.


Will the text be clear?
 
JB said:
If your 'monitor link' is a standard VGA (D shaped computer video connector), use that. Your computer should be able to figure out the appropriate resolution and scan rate.

AFAIK, a VGA adapter doesn't have any inputs. How could the computer possibly figure out the appropriate resolution and scan rate? It's up to the TV to (try to) adapt to the signal.

Don't let my blow-up scare you. There's an excellent chance it will work fine. My problem was that I used a computer that was already configured for a high-res monitor, so it tried to send that same signal to the TV, and the poor CRT drivers simply couldn't keep up.
 
wab said:
AFAIK, a VGA adapter doesn't have any inputs. How could the computer possibly figure out the appropriate resolution and scan rate? It's up to the TV to (try to) adapt to the signal.

There are a few VGA pins used for a connection back to the graphics card. It's called DDC (Display Data Channel) and is used to send EDID (Extended Display Identification Data) back to the PC.

To be safe download the manual for your TV and set your display card to match the supported resolution and display rate. For the sharpest computer graphics choose the resolution that most closely matches the native resolution of your display.
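For the curious, the EDID block the display sends back is just a small fixed-layout blob of bytes, and the native mode lives in the first 18-byte detailed timing descriptor. A quick sketch of pulling the active resolution out of one such descriptor (the sample bytes below are hand-built for a 1366x768 panel, not dumped from a real set):

```python
def parse_dtd_resolution(dtd: bytes) -> tuple:
    """Extract horizontal/vertical active pixels from an 18-byte
    EDID detailed timing descriptor."""
    # Lower 8 bits sit in bytes 2/5; the upper 4 bits are packed
    # into the high nibbles of bytes 4/7.
    h_active = ((dtd[4] >> 4) << 8) | dtd[2]
    v_active = ((dtd[7] >> 4) << 8) | dtd[5]
    return h_active, v_active

# Hand-built descriptor fragment for 1366x768:
#   byte 2 = 0x56, byte 4 high nibble = 5  -> 0x556 = 1366
#   byte 5 = 0x00, byte 7 high nibble = 3  -> 0x300 = 768
sample = bytes([0x01, 0x1D, 0x56, 0xA0, 0x50, 0x00, 0x16, 0x30,
                0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
                0x00, 0x00])
print(parse_dtd_resolution(sample))  # (1366, 768)
```

In practice the OS does this parsing for you, which is exactly how a card figures out the "appropriate resolution" over a plain VGA cable.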
 
JB said:
There are a few VGA pins used for a connection back to the graphics card. It's called DDC (Display Data Channel) and is used to send EDID (Extended Display Identification Data) back to the PC.

D'oh, you're right. I forgot all about EDID. So I can only blow up TVs via S-video, I guess. :(
 
