SageTV Community  

  #1  
Old 11-28-2005, 06:44 PM
austin
Sage User
 
Join Date: Nov 2005
Posts: 15
HDTV Stutter/Tear at high resolutions

I just bought a new TV to complement my Sage 4 rig and I am experiencing problems. The new TV, a Sceptre 37" LCD, runs at a native 1920x1080 via DVI, and it seems as if my 256 MB GeForce 6600 cannot keep up. HDTV looked wonderful on my old set running 1280x768 via overlay and a VGA cable, but now I am seeing something that looks like a mix between the old VMR9 tear and an HD stutter. The problem only happens when watching HD content (especially during fast motion like pans); SD material looks fine, even recordings made over the DTV tuner. I have tried all possible combinations of decoders from the usual suspects, as well as the various rendering options from FSE VMR9 to vanilla overlay, and have gone back to my old standby of NVIDIA via overlay for the video and InterVideo (from WinDVD) for the audio. I am contemplating buying a 7800 GTX, as I have heard some good things on the AVS forum, but I am a little hesitant to drop $450 on a new card with no guarantee of success. Has anyone experienced these problems, and if so, was the solution a faster card?

Thanks in advance for the help…

-Austin

My system specs are as follows:
Software:
Sage 4.0
XP SP2
Forceware 81.95
Purevideo 1.02-185 (Video Decoder)
WinDVD 7 (Audio Decoder)

Hardware:
Two 400 GB drives formatted with a 64 KB block size
P4 3.07 (I think) with HT
Jetway mobo with an Intel chipset
1.5 GB of RAM
Hauppauge PVR-350 (not using the video out)
Hauppauge PVR-150
AVerMedia A180
NVIDIA GeForce 6600 with 256 MB of RAM
Sceptre X37SV-Naga @ 1920x1080 via a DVI-to-HDMI cable

  #2  
Old 11-28-2005, 06:58 PM
mlbdude
Moderator
 
Join Date: May 2003
Location: Melbourne, Florida
Posts: 4,174
The 6600 has no problem running HDTV at 1080i. You will probably have to use FSE, though, since it is an nVidia card. I would guess SageTV is not configured correctly or working right for you.

  #3  
Old 11-28-2005, 07:35 PM
austin
Sage User
 
Join Date: Nov 2005
Posts: 15
The problem is the resolution

Thanks for the response, but I guess I wasn't clear enough in my first post. The problem has to do with running 1920x1080, not with decoding the 1080i signal. If I switch the resolution down to 1280x720 the picture looks fine, no stuttering, etc. Of course, this defeats the purpose of having an LCD with a native resolution of 1920x1080. What I wanted to know is: if I upgrade my video card, can I run the native resolution cleanly?

  #4  
Old 11-28-2005, 09:08 PM
blade
SageTVaholic
 
Join Date: Jan 2005
Posts: 2,500
1920x1080 is 1080i and 1280x720 is 720p. I think you're confusing the incoming signal with what you're outputting. For example, your captures may be 1080i, but when you run at 1280x720 you're outputting 720p; when you up the resolution to 1920x1080, you're outputting 1080i.

As mlbdude said, if you have everything configured right and are using FSE, you shouldn't have any problems. I can run 1080i on my 6600GT without any problems.

Last edited by blade; 11-28-2005 at 09:20 PM.

  #5  
Old 11-28-2005, 11:33 PM
austin
Sage User
 
Join Date: Nov 2005
Posts: 15
Thanks for the help. I'm now 90% sure that the problem is with my new set and not the video out. I assumed that the issue was with the computer and not the display, as that has always been the case in the past. Too bad, as I really don't feel like shipping it back.

Thanks again...

-Austin

  #6  
Old 11-29-2005, 02:29 PM
austin
Sage User
 
Join Date: Nov 2005
Posts: 15
After speaking at length with the tech from Sceptre, I'm back to thinking it is the GFX card. Has anyone successfully run a GeForce 6600 outputting 1080p? Note that this is an LCD, so I am outputting a 1080p signal to the monitor, which is twice the information of the 1080i signal that a normal TV would accept.
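
A quick back-of-the-envelope check bears that "twice the information" figure out. The sketch below (illustrative only; it counts active pixels and ignores blanking) compares the raw pixel rates of the three formats:
Code:
# Rough pixel-rate comparison: active pixels only, blanking ignored.
def pixel_rate(width, height, rate_hz, interlaced=False):
    # An interlaced signal delivers only half the lines in each field.
    lines = height // 2 if interlaced else height
    return width * lines * rate_hz

modes = {
    "720p60": pixel_rate(1280, 720, 60),
    "1080i60": pixel_rate(1920, 1080, 60, interlaced=True),
    "1080p60": pixel_rate(1920, 1080, 60),
}
for name, rate in modes.items():
    print(f"{name}: {rate / 1e6:.1f} Mpixels/s")
# 720p60: 55.3, 1080i60: 62.2, 1080p60: 124.4
# 1080p60 carries exactly twice the pixel rate of 1080i60.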

  #7  
Old 11-29-2005, 02:37 PM
mlbdude
Moderator
 
Join Date: May 2003
Location: Melbourne, Florida
Posts: 4,174
It's possible, if your resolution is 1080p. I have only done 1080i on it. Does it work when you send your TV 1080i?

  #8  
Old 11-29-2005, 03:22 PM
RedR
Sage Advanced User
 
Join Date: Nov 2004
Location: Dallas TX
Posts: 205
Heya,

Well, this chat has gone from 1080i to 1080p, I believe? Either way, the 6600 GT can do 1080i, but everything has to be right, and that includes the settings in SageTV. As for your question about the 7800s: yes, it's far better with HD content, but their site only talks about 1080i, not 1080p, for all the 7800s. I have an eVGA 7800 GT in my game box and it handles 1080i without a problem in SageTV v4.
First, make sure that the video drivers are current. nVidia recently released new drivers that are leaps and bounds ahead when it comes to performance and fixes (version 81.95). Another item when it comes to the video driver: use Driver Cleaner after you've uninstalled the nVidia video driver. Boot into safe mode, run Driver Cleaner, select nVidia, then Start. Once it's done, reboot normally and install the new drivers. I can't help but think this is your problem: old driver residue. Also make sure DirectX is current (9.0c, I believe). Make sure your system is solid before moving forward. Once you know this, you can focus on the 1080i/p problem you are having.
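
One way to double-check the DirectX version without clicking through the dxdiag GUI is its /t switch, which dumps a plain-text report. A small illustrative sketch, assuming Python is on the box to parse the report:
Code:
# Illustrative sketch: save dxdiag's report and pull out the DirectX
# version line. "dxdiag /t <file>" is a standard dxdiag switch.
import pathlib
import subprocess
import time

report = pathlib.Path("dxdiag_report.txt")
subprocess.run(["dxdiag", "/t", str(report)])
for _ in range(60):                 # the report is written asynchronously
    if report.exists():
        break
    time.sleep(1)
for line in report.read_text(errors="ignore").splitlines():
    if "DirectX Version" in line:
        print(line.strip())         # e.g. "DirectX Version: DirectX 9.0c"
        break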

Hope this helps,
RedR

  #9  
Old 11-29-2005, 05:12 PM
austin
Sage User
 
Join Date: Nov 2005
Posts: 15
My drivers are all current and clean; the system was only built a month or so ago (when Sage 3.11 came out), and I've only upgraded the NVIDIA drivers once, from 81.85 to 81.95. It was rock solid running 1080i into my old Monivision 34" CRT that died two weeks ago, and as I mentioned in a previous post, it works great at lower resolutions on this panel, just with the standard fuzziness of an LCD running at a non-native resolution.

The new panel will not accept interlaced signals; it flickers and generally looks horrible with anything that isn't progressive. Anyway, I've managed to get a lot of the problems to go away by messing around with custom timings on the GFX card, but it is now dropping frames, and for the amount of money spent on the screen I want it perfect, and it still isn't quite there.
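
For anyone else fiddling with custom timings, it's worth checking the pixel clock a given mode works out to, since single-link DVI tops out at 165 MHz. A rough illustrative check using the standard CEA-861 totals for 1080p60:
Code:
# Pixel clock = total pixels per frame (active + blanking) x refresh rate.
def pixel_clock_mhz(h_total, v_total, refresh_hz):
    return h_total * v_total * refresh_hz / 1e6

# CEA-861 1080p60 timing: 2200 total columns x 1125 total lines.
clock = pixel_clock_mhz(2200, 1125, 60)
print(f"1080p60 pixel clock: {clock:.1f} MHz")                # 148.5 MHz
print(f"Fits single-link DVI (165 MHz max): {clock <= 165}")  # True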

So, long story short, I just bought a 7800 GT and will see how it goes in a week or so when it gets here. Fingers crossed....

  #10  
Old 11-29-2005, 05:24 PM
stanger89
SageTVaholic
 
Join Date: May 2003
Location: Marion, IA
Posts: 15,188
If it makes you feel better, I wouldn't be too surprised if the 6600 just can't cut it. The 6600GT is the "lowest" card in the nVidia arsenal that supports all the HD stuff.

  #11  
Old 11-29-2005, 05:54 PM
mlbdude
Moderator
 
Join Date: May 2003
Location: Melbourne, Florida
Posts: 4,174
If your new TV can't do 1080i, how do you watch it? I would have thought the TV would convert it to either 1080p or 720p internally.

  #12  
Old 11-29-2005, 07:15 PM
austin
Sage User
 
Join Date: Nov 2005
Posts: 15
1080i is a broadcast format; 1080p is the format between the PC and the monitor. There are only a handful of flat panels out there that support 1080p. They are all LCDs (the DLPs that claim 1080p lie), they only hit the market this summer, and until the Sceptre and Westinghouse arrived they were all absurdly expensive.

As for how it works, the decoders deinterlace the content (if you use the NVIDIA decoders, check out the properties; there are a number of deinterlacing options). Most modern computer monitors are progressive, and as far as I know all LCDs are progressive.

On a totally different subject, but since he seems to be reading this thread: I never got to thank mlbdude for his amazing V1 STV, which, to my eye, is still the best-looking Sage STV ever. Thank you.

  #13  
Old 11-29-2005, 07:19 PM
mlbdude
Moderator
 
Join Date: May 2003
Location: Melbourne, Florida
Posts: 4,174
I know this is OT, but I have been looking at some new 1080p DLPs (even if they are not exactly 1080p yet). If we ignore the PC, and my cable box outputs 1080i (the highest resolution it outputs), will a 1080p set be able to display this, or do I need to have my cable box output 720p? I would figure that I could just use a PC with a progressive display, but right now the highest-resolution content I play is 1080i at native resolution. I don't think I would want my PC to scale it up.

Thanks!

  #14  
Old 11-29-2005, 08:09 PM
austin
Sage User
 
Join Date: Nov 2005
Posts: 15
Your set will automatically deinterlace the signal, and may down- or up-convert it depending on the set. There is no 1080p content (yet), so it wouldn't make much sense for the manufacturers to support only a nonexistent standard. But to answer your question: any HDTV will display any HD signal; you just might not see every pixel. If you are getting your HD content off cable, this is a moot point anyway, because they compress the hell out of the signal in order to squeeze more channels in.

  #15  
Old 11-29-2005, 08:23 PM
RedR
Sage Advanced User
 
Join Date: Nov 2004
Location: Dallas TX
Posts: 205
Not to bump anyone, but I thought the 6200 was the lowest, and the 6600 GT is the highest card in the 6000 series that supports HD and games, no? I tend to agree from what I've read, though: the 6200 or plain 6600 isn't quite enough for 1080-anything.
Do let us know how the 7800 works out; I myself am a big fan and am very interested in hearing the results =)

  #16  
Old 11-29-2005, 08:49 PM
stanger89
SageTVaholic
 
Join Date: May 2003
Location: Marion, IA
Posts: 15,188
Quote:
Originally Posted by austin
1080i is a broadcast format; 1080p is the format between the PC and the monitor. There are only a handful of flat panels out there that support 1080p. They are all LCDs (the DLPs that claim 1080p lie),
That's debatable. There was a great post on AVS about that which showed that in almost all circumstances (except a diagonally-alternating checkerboard or something) the "wobulated" 1080p DLPs could fully resolve 1080p, and that they even had some advantages over "true" 1080p displays.

Quote:
Originally Posted by mlbdude
I know this is OT, but I have been looking at some new 1080p DLP's (even if they are not exactly 1080p yet).
Two things you have to look out for:
1) That the display can actually accept 1080p, hopefully over DVI/HDMI (many early 1080p displays didn't accept it, or only did over VGA).
2) How it deals with 1080i. Most displays, especially the less expensive ones, will simply bob the 1080i to 540p and then scale it to 1080p from there (see the sketch below for the difference). The hardware to do real 1080i deinterlacing is still very expensive (Realta, Gennum, maybe another), with one notable exception: GeForce 6600GTs or better are about the cheapest thing that can do real 1080i deinterlacing.
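
To make that bob-versus-real-deinterlacing distinction concrete, here is a minimal illustrative sketch (mine, not anything a card actually runs): bob line-doubles a single 540-line field and throws away half the vertical resolution, while weave recombines both fields and only looks right on static content. A real spatial-temporal deinterlacer chooses between such strategies per pixel.
Code:
import numpy as np

def bob(field):
    # Line-double one field: cheap, no combing, but only 540 lines of detail.
    return np.repeat(field, 2, axis=0)

def weave(top, bottom):
    # Interleave both fields: full 1080-line detail on static scenes, but
    # moving objects "comb" because the two fields are 1/60 s apart.
    frame = np.empty((top.shape[0] * 2, top.shape[1]), dtype=top.dtype)
    frame[0::2] = top
    frame[1::2] = bottom
    return frame

# A 1080i frame arrives as two 540-line fields.
rng = np.random.default_rng(0)
top = rng.integers(0, 256, (540, 1920), dtype=np.uint8)
bottom = rng.integers(0, 256, (540, 1920), dtype=np.uint8)
print(bob(top).shape)            # (1080, 1920) -- built from one field only
print(weave(top, bottom).shape)  # (1080, 1920) -- built from both fields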

Quote:
If we ignore the PC, and my cable box outputs 1080i (the highest resolution it outputs), will a 1080p set be able to display this, or do I need to have my cable box output 720p?
It will take the 1080i and deinterlace it; ideally it would do true deinterlacing, but it will probably just bob.

Quote:
I would figure that I could just use a PC with a progressive display, but right now the highest-resolution content I play is 1080i at native resolution. I don't think I would want my PC to scale it up.
Why not? Let your 6600GT (or better) do it for you.

Quote:
Originally Posted by RedR
Not to bump anyone, but I thought the 6200 was the lowest, and the 6600 GT is the highest card in the 6000 series that supports HD and games, no?
Not sure what you mean... The whole 6/7 series supports HDTV output, with the notable exception of the NV40 (6800 family), but what I was referring to is the "spatial-temporal deinterlacing" of 1080i content, and the 6600GT is the "lowest" card that supports that.

  #17  
Old 11-30-2005, 04:54 AM
blade
SageTVaholic
 
Join Date: Jan 2005
Posts: 2,500
Quote:
Originally Posted by austin
But to answer your question: any HDTV will display any HD signal; you just might not see every pixel.
Not entirely true. Some TVs will not accept 720p input. With those sets the signal must be scaled by something else, such as a cable box or HTPC.

  #18  
Old 11-30-2005, 06:59 AM
mlbdude
Moderator
 
Join Date: May 2003
Location: Melbourne, Florida
Posts: 4,174
Quote:
Originally Posted by stanger89
Why not? Let your 6600GT (or better) do it for you.
Deinterlacing 1080i with the video card looks much worse to me than not deinterlacing it and outputting at native resolution. The hope would be that the scaler/deinterlacer in a 1080p set would not degrade the quality at all.

  #19  
Old 11-30-2005, 09:00 AM
stanger89
SageTVaholic
 
Join Date: May 2003
Location: Marion, IA
Posts: 15,188
Quote:
Originally Posted by mlbdude
Deinterlacing 1080i with the video card looks much worse to me than not deinterlacing it and outputting at native resolution.
Well, there you've got the same sort of issue as SD users: an interlaced source and an interlaced display. It's always best to leave it untouched.

Quote:
The hope would be that the scaler/deinterlacer in a 1080p set would not degrade the quality at all.
I think you're hoping for too much, at least from the lower-end 1080p sets.

  #20  
Old 11-30-2005, 11:53 AM
mlbdude
Moderator
 
Join Date: May 2003
Location: Melbourne, Florida
Posts: 4,174
Quote:
Originally Posted by stanger89
Well, there you've got the same sort of issue as SD users: an interlaced source and an interlaced display. It's always best to leave it untouched.

I think you're hoping for too much, at least from the lower-end 1080p sets.
The one difference is that now that I use component out to the HDTV for 1080i, it actually works with deinterlacing off. It is kind of a pain to have to turn it on and off, but better than nothing.

What I am trying to avoid is "upgrading" my TV and having the wife complain that the picture does not look as good as our native 1080i set.