Hardware Support: Discussions related to using various hardware setups with SageTV products. Anything relating to capture cards, remotes, infrared receivers/transmitters, system compatibility, or other hardware-related problems or suggestions should be posted here.
#1
Deinterlace or not?
I need someone to confirm my thinking. I've never paid much attention to the idea of deinterlacing.
Two scenarios; in both, my TV is connected to the computer via the component video out on my NVIDIA 6600GT.

At 720p, the computer is using a non-interlaced resolution. It seems to me that under those circumstances I would want to turn deinterlacing ON. At 1080i, the computer is using an interlaced resolution. It seems to me that under those circumstances I would want to turn deinterlacing OFF.

HOWEVER... if I switch between the two resolutions without making any configuration changes, nothing appears to change except the resolution. The video doesn't look all crazy. Why?
#2
You don't have to deinterlace when you use a dedicated video output, that is, one that only plays video and is used on a TV. Why? Because the PC graphics card output is supposed to be progressive. You can connect your graphics card output to an interlaced display such as a TV, but the TV-out chip on your graphics card will then interlace the output to suit the display's needs. In theory you could skip deinterlacing if the fields of an interlaced video you're playing happened to line up exactly with the fields the TV-out chip generates, but I very much doubt that works in practice. To my knowledge, only Matrox graphics cards can send interlaced video out to the TV "as is"; all the others just output the "PC output" and don't care at all about the nature of what that output is used for (that is, playing an interlaced video).

Regards,
Stéphane.
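(Purely as an illustration, not anything from an actual driver: here's a minimal Python/NumPy sketch of what interlacing means at the pixel level. One progressive frame is split into two half-height fields, which is essentially what a TV-out chip does to the card's progressive output. The function name is made up for the example.)

Code:
import numpy as np

def split_into_fields(frame):
    """Split a progressive frame (rows x cols) into two fields.
    The top field takes the even scanlines, the bottom field the
    odd ones; an interlaced display shows them alternately."""
    top_field = frame[0::2, :]      # lines 0, 2, 4, ...
    bottom_field = frame[1::2, :]   # lines 1, 3, 5, ...
    return top_field, bottom_field

# Example: a fake 8x8 grayscale frame
frame = np.arange(64, dtype=np.uint8).reshape(8, 8)
top, bottom = split_into_fields(frame)
print(top.shape, bottom.shape)  # (4, 8) (4, 8)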
#3
So basically I must have deinterlacing turned on currently. It sounds like you're saying that my recorded video (which is interlaced) is being deinterlaced by the decoder, and then the chip on the video card responsible for output to the TV is actually re-interlacing it if the resolution is 1080i, or just leaving it alone if the resolution is 720p.
If that's all correct, why is there even an option? It sounds like there is no situation in which you could avoid deinterlacing.
#4
For instance, look at the NVIDIA Decoder: there is no On/Off setting but rather a "What video are you currently playing?" setting, with options along these lines:

* Let me choose what you are currently playing and be creative (Smart)
* I will figure out what you are playing (Auto)
* You are surely a guy who lurks at a hidden closet cam (Video)
* You are way too smart to watch anything other than pure art (Film)

Then you can choose a deinterlacing method depending on the capabilities of your hardware: BOB or ADAPTIVE (and sometimes WEAVE, though you usually won't see it in the choice list, because WEAVE means no deinterlacing is applied). Some other decoders will always deinterlace automatically when using Overlay, choosing the best method themselves (and you cannot choose any option). For DivX and WMV there is no deinterlacing involved; everything is progressive (well, you can have interlaced DivX/WMV, but it's exotic at best).

But as I said, when you use the TV output as a dedicated video output (theater mode / DVD Max / whatever the fancy name is), then it depends on the graphics card. For ATI cards, even when using theater mode, the video will need to be deinterlaced.

You can always trust your eyes: a video with a news ticker is the best test. Play it with deinterlacing on and off on the TV; if one of the methods produces blurry text, you'll know what to choose. It is not sufficient at this point to only look for "lines" in the text; the crucial point is sharpness. Letters must be sharp (even if you see lines in them: after all, on a TV you'll see lines in ticker letters, but you'll never see "double edge" (blurry) letters).

Regards,
Stéphane.
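(For reference, what BOB and WEAVE do to the pixels is simple enough to sketch. This is an illustrative NumPy toy, not any decoder's actual code; the function names are mine. WEAVE interleaves the two fields back into one frame, while BOB line-doubles a single field, which is why BOB costs half the vertical resolution but avoids combing on motion.)

Code:
import numpy as np

def weave(top, bottom):
    """WEAVE: interleave the two fields into one full frame.
    Great for static images; moving objects show 'combing'
    because the fields were captured ~1/60 s apart."""
    frame = np.empty((top.shape[0] * 2, top.shape[1]), dtype=top.dtype)
    frame[0::2, :] = top
    frame[1::2, :] = bottom
    return frame

def bob(field):
    """BOB: line-double one field to full height.
    No combing, but vertical detail is halved."""
    return np.repeat(field, 2, axis=0)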
#5
Unless your output exactly matches your input/source resolution (basically, unless you're using a dedicated hardware decoder/output), you have to deinterlace. There's no way to resize with acceptable results without deinterlacing.
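(That claim is easy to demonstrate. In interlaced video, adjacent scanlines come from fields captured at different instants, so any vertical resampling blends two moments in time. A toy NumPy example with made-up data, just to show the ghosting:)

Code:
import numpy as np

# Two fields 1/60 s apart: a bright object moved between them.
top = np.zeros((4, 8))      # even lines, time t
bottom = np.zeros((4, 8))   # odd lines, time t + 1/60 s
top[:, 2] = 255             # object at column 2 in field one
bottom[:, 5] = 255          # object at column 5 in field two

# Weave into a frame, then naively halve the height by averaging
# adjacent lines: every output line now mixes the two instants.
frame = np.empty((8, 8))
frame[0::2, :] = top
frame[1::2, :] = bottom
scaled = (frame[0::2, :] + frame[1::2, :]) / 2

print(scaled[0])  # ghost at BOTH column 2 and column 5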