
Why Component cable on my video card?


Playaa


I was looking through the box my video card (Radeon X850) came with and noticed this little cable that hooks to the S-Video out on my card and changes to the 3-pronged Component Cable hookup.

Now...couple questions.

1) Isn't component a better hookup than S-Video? If so...why would they include a converter, since it can't convert to a better output...?

2) Can computers output HD video these days? If so...how? Can mine do it?


My video card (Nvidia 7950) came with the same thing...never really bothered to ask about it, just stuck it in the drawer with all the other hundreds of cables and adapters I've collected over the years, heh. It would be nice to know the answer though...of course I probably wouldn't be able to find the cable now :).

Shaftiel

As an aside, I will be upgrading to an 8800 in the next few months. Anyone interested in my 7950 1gig?


The biggest reason I ask is that I just REALLY want to view SOMETHING in HD on my tv. It's going to be a while before I can get the HD satellite hooked up; I live too far out in the country to get a good signal over rabbit ears (and I don't think my HD-ready tv would do that anyway)...so I just want to download some Apple HD Trailers and watch them on my tv just so I can see HD on it ONCE!

 

found this:

http://ati.amd.com/products/radeonx850/index.html

but I don't know what any of that really means.


HD starts at a vertical resolution of 540p. For example, 1024 x 768 is considered HD. So you can get HD with your analog cable from the back of your existing vid card. There are other types of hookups such as HDMI, Component, and DVI that will also support HD resolutions.

 

Finally, yes, component is better than S-Video. As for the converter, they probably just figure component is more common than S-Video.

 

Hope this helps... :D


There's a possibility that the S-Video output on your video card is not just an S-Video output but can also send a component signal. I can't tell you for sure because I've never seen this adapter or tried it, but I really don't see why they would give you the cable if it didn't do anything. I'm not saying S-Video can handle HD, because it can't, but they may have made the connections possible through the port with the adapter.

 

I personally wouldn't consider anything below 1280x720 to be HD (HD is promoted as 16:9...the term "high definition" is pretty blurred at the moment). With 1024x768 you are technically losing the sides of the image (although the picture is still technically high-def...see, blurry...funnily enough, unlike HD...ok, I'm going off track).

 

Err...have fun trying it out!

 

Oh, and if you don't want to have to buy component cables just to see if this works, just use any RCA cables...they aren't ideal but will tell you whether it works or not.


It's both a component output and an S-Video output. If you check your vid card specs you should see an "HDTV output". It just happens to be a universal plug that carries both signals.

I thought so...you think this functionality is on my 9800Pro? Not that I need it with DVI...ok...don't bother finding out as it doesn't matter in the slightest! lol


So clearly this thing (my video card) will output in HD then...now where in the world can I get some video to test this? I REALLY want to view something in HD on my HD-ready tv...very, very, very, very badly.


Quote (playaa): "It's going to be a while before I can get the HD satellite hooked up; I live too far out in the country to get a good signal over rabbit ears..."

 

Look into getting an external HD antenna...they work surprisingly well. The bunny ears are back!


Almost forgot to post this.

I got the tv hooked up to the pc last night through component cables. I had to use the ATI Catalyst Control Panel to get the tv to display in HD mode. The Display Properties in Windows would only let me choose 800x600 as my max resolution...but I did get it to work.

So I played one of the trailers I downloaded (at 720p), and it was weird: when I viewed it on the tv or on my computer monitor (at 1440xsomething resolution), it looked worse than when I watched it on my pc at work (which is at 1280x1024 resolution right now). The colors looked washed out and the screen was sorta pixelated. What am I doing wrong?


Quote (playaa): "...when I viewed it on the tv or on my computer monitor (at 1440xsomething resolution) it looked worse than when I watched it on my pc at work...The colors looked washed out and the screen was sorta pixelated. What am I doing wrong?"

 

Ok man...I'll try to explain this one more time.

 

720p == somewhere around 1024x768 resolution

1080p == 1920x1080 resolution

 

1080i doesn't mean you have a 1920x1080 resolution. It means interlaced: every other line gets updated when the screen refreshes, and a full refresh of all the lines adds up to 1080 lines. If you have 1080p, it is progressive, and all 1080 visible lines are updated on one screen refresh.

 

So 1080p has a much better effective resolution.

 

Every TV that I have seen that can handle 720p can handle 1080i, for the simple fact that it is basically the same resolution, except with 720p all lines are updated on one refresh, while 1080i takes two passes to completely update all the lines.
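If it helps to see the numbers, here's the back-of-the-envelope math in Python (just plain arithmetic; the 60Hz refresh rate is an assumption, nothing specific to your card or TV):

    # Pixels delivered per refresh for each mode (1080i only sends half the lines per pass)
    modes = {
        "720p":  (1280, 720, 1.0),   # full frame every refresh
        "1080i": (1920, 1080, 0.5),  # one field (every other line) per refresh
        "1080p": (1920, 1080, 1.0),  # full frame every refresh
    }
    for name, (w, h, fraction) in modes.items():
        per_refresh = int(w * h * fraction)
        print(name, per_refresh, "pixels per refresh,", per_refresh * 60, "per second at 60Hz")

So 720p and 1080i push roughly the same amount of picture per refresh (around a million pixels), while 1080p is about double, which is why sets that handle one of the first two usually handle the other.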

 

If you're running at 800x600, it should look worse than 1280x1024, as expected.

I seriously doubt that is an S-Video out on your vid card. It is more than likely a jack where you can hook up your "dongle" (lol)...which has component and S-Video out on it...

 

You should be able to set it to 1024x768 when using the TV. More than likely, Windows doesn't know what type of monitor is being used, so it defaults to something any monitor can handle (800x600).


720p = 1280x720 progressive scan (1280 only because 16:9 has been officially adopted as standard)

1080i = 1920x1080 interlaced

 

TVs that say they support 1080i do so because their inputs accept the signal; however, they certainly don't give you 1920x1080 pixels (interlaced vs. progressive has absolutely nothing to do with the pixel count).

 

These TVs can display a 1080i FIELD but not the FRAME because of the way interlacing works...the set can only show you one field at a time (1920x1080 / 2) because it doesn't have the pixels available* (*not technically how it works, but it seems to make people understand better; not that you ever actually see the entire frame with 1080i even at 1920x1080 resolution, mostly because you wouldn't use 1080i in that situation). Which is why on LCD TVs with 1366x768, a 1080i signal displays almost identically to 720p (because technically they have the same pixel count), apart from the noticeable screen tearing and flicker with 1080i (depending on your tv's ability to interpolate).

 

1080i in its actual form is never ever used because it looks like crap compared to 1080p, and if you have 1920x1080 pixels it's easy to use 1080p, so why use 1080i (it hasn't always been the case, which explains why there is even a difference).

 

1080i on a 1366x768 TV will look exactly like 720p because the same number of pixels is being used, but 720p is the full frame whereas 1080i alternates between fields and uses interpolation so as to look normal (which is where the flicker and tearing come from).
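The pixel counts back that up if you just multiply it out (ordinary arithmetic; the 1366x768 panel is an assumption about the set in question):

    panel = 1366 * 768         # 1,049,088 native pixels on a typical "HD-ready" LCD
    field_1080i = 1920 * 540   # 1,036,800 pixels in one interlaced field
    frame_720p = 1280 * 720    # 921,600 pixels in a full progressive frame
    print(panel, field_1080i, frame_720p)

All three land in the same ballpark (within roughly 15% of each other), which is why a 1080i field and a 720p frame end up looking so similar on that kind of panel.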

 

I can only assume that your TV has 1366x768 pixels, so you should set it to that. Windows hardly ever gives you the option, and to tell you the truth I'm unclear as to how it determines whether your monitor can display a given resolution; when I first tried, it offered me only 1280x720, then after a restart it allowed 1360x768, and it finally let me get the extra 6 columns for 1366x768...don't ask what I did...

 

You should now be thoroughly confused, playaa.


  • 4 weeks later...

Anon and NOFX...I appreciate the explanations, and everything you said makes sense...but it doesn't explain why viewing an HD trailer (that is 720p) on my computer at work at 1280x1024 resolution looks MUCH better than viewing the same trailer on my computer at home at 1440xsomething resolution. When I view the trailer at home it looks very washed out and pixelated (though nothing else on the computer looks poor...especially games...they look amazing).


Quote (playaa): "...viewing an HD trailer (that is 720p) on my computer at work at 1280x1024 resolution looks MUCH better than viewing the same trailer on my computer at home at 1440xsomething resolution..."

I...err...can't explain that at all. You're using LCD monitors (not TVs) for both, right? The only thing I can think of that might have anything to do with it is the aspect ratio (1280x1024 is 5:4 and 1440x900 is 16:10), but...that just doesn't make sense...are you using the same player?

 

Basically, when you watch a video in 720p, as long as your resolution is at least 1280x720 you should see the full image at its native resolution. There may be factors in the difference between your work and home environments that are affecting the way you perceive the videos...that's a long shot. Also, don't you have a really slow connection at home? Are you sure you downloaded the 720p version?
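One other thing worth checking (a total guess, and it assumes you're watching fullscreen, that the home desktop really is 1440x900, and that the trailer is 1280x720): at 1280x1024 the trailer plays back pixel-for-pixel, while at 1440x900 a fullscreen player has to stretch it by a non-integer factor, which can soften the image a bit. It wouldn't explain washed-out colors, though; that sounds more like a renderer or codec setting.

    # Aspect-preserving fullscreen scale factor for a 1280x720 source (assumed resolutions)
    src_w, src_h = 1280, 720
    displays = {"work": (1280, 1024), "home": (1440, 900)}
    for name, (dw, dh) in displays.items():
        scale = min(dw / src_w, dh / src_h)
        print(name, "scaled to", int(src_w * scale), "x", int(src_h * scale), "factor", round(scale, 3))

Work comes out at 1280x720 exactly (factor 1.0), home at 1440x810 (factor 1.125), so only the home setup is resampling the picture.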


At the office it's an onboard video card in a Dell machine; the machine is a 2.0GHz dual core with 2 gigs of RAM. At home the video card is a Radeon X850 256MB (with the latest drivers) on a 2.0GHz dual core with 2 gigs of RAM. Both are running XP Pro with the latest build, and both use the Combined Community Codec Project for video codecs.

I downloaded a 720p trailer from Apple.com at work, watched it, took it home on my iPod (which I use as a portable hard drive), put it on my home PC (where I have no internet connection at all) and played it there, where it appeared washed out and looked worse than some full-length movies I've downloaded at 500MB (and the trailer was something like 74MB).

Like I said...when I viewed other videos they looked just fine on my home PC...but the one time I tried an HD one it looked horrible (when viewed on the LCD and on the tv).

 

*shrugs*

