SageTV Community  

  #21  
Old 12-17-2004, 01:05 PM
teedublu
Sage Advanced User
 
Join Date: Mar 2004
Posts: 198
Quote:
With the FX5200, how are you setting the resolution to 720x480? I am only seeing 640x480 & 800x600. I've got my FX5200 connected to an SDTV with S-Video. Thanks, BryanJ
Nothing special! It was simply one of the resolutions available (and one of the reasons I like this board). Bear in mind, I do not have anything attached to the VGA or DVI port (just the S-Vid). Also, I used the video card manufacturer's (MSI) driver, because when I tried the latest generic driver from NVidia, it messed things up: greyscale picture, etc.

For Windows, 720x480 isn't much better than 640x480; you still have some menus being clipped, etc. But mine is a dedicated HTPC, so the priority is TV quality, not the Windows desktop.
  #22
Old 12-17-2004, 01:18 PM
nielm
SageTVaholic
 
Join Date: Oct 2003
Location: Belgium
Posts: 4,496
Quote:
Originally Posted by willemse
1- Is the deinterlacing done in the Hauppauge before encoding and writing video to disk???
No, it records interlaced (i.e., 2 fields per frame)... The video is deinterlaced on playback by the MPEG2 decoder when being displayed on the Windows desktop.

If using the PVR-350's TV-out, I believe the video is NOT deinterlaced; the interlaced video is output directly to the interlaced display (the TV!) with no scaling/deinterlacing/smoothing, which is why it looks much better.
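
To make "deinterlaced on playback" concrete, here is a minimal sketch (Python/NumPy, purely illustrative, not code from any actual decoder) of the two simplest approaches a software decoder can use: weave, which stitches the two fields back into a frame, and bob, which line-doubles a single field.

Code:
import numpy as np

def weave(top_field, bottom_field):
    """Weave: interleave the two fields back into one full frame.
    Sharp on static scenes, but moving objects show 'combing'
    because the two fields were captured 1/50 or 1/60 s apart."""
    h, w = top_field.shape
    frame = np.empty((2 * h, w), dtype=top_field.dtype)
    frame[0::2] = top_field     # lines 0, 2, 4, ...
    frame[1::2] = bottom_field  # lines 1, 3, 5, ...
    return frame

def bob(field):
    """Bob: line-double a single field into a full frame. No combing
    on motion, but half the vertical resolution and some flicker."""
    return np.repeat(field, 2, axis=0)

The combing that weave produces on motion is exactly the kind of horizontal-line artifact people complain about later in this thread; the trade-off between these two extremes is what the fancier "smart" deinterlacers try to manage.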
  #23
Old 12-17-2004, 01:20 PM
nielm
SageTVaholic
 
Join Date: Oct 2003
Location: Belgium
Posts: 4,496
A quick question for the FX5200 people: How did you get decent contrast with VMR9 on the FX5200?

On my display, it always looks washed out, whereas Overlay looks much more vivid... Shows like 24 where everything is dark are pretty much unwatchable using VMR9...
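
One possible culprit, though nobody in the thread confirms it, is video-level handling: MPEG video stores black at luma 16 and white at 235, and if a renderer passes those values to the display unchanged instead of expanding them to the full 0-255 PC range, black turns dark grey and everything looks washed out. A sketch of the expansion a renderer is supposed to apply:

Code:
def expand_video_levels(y):
    """Map limited-range ('studio') luma 16..235 onto full-range 0..255.
    If the renderer skips this step, black (16) is shown as dark grey
    and the whole picture looks washed out."""
    return max(0, min(255, round((y - 16) * 255 / 219)))

print(expand_video_levels(16))   # 0   -> true black
print(expand_video_levels(235))  # 255 -> true white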
  #24
Old 12-17-2004, 02:10 PM
mc2wheels
Sage Advanced User
 
Join Date: Nov 2004
Location: Medway, MA
Posts: 101
Maybe I just don't know what I am missing...

I don't watch 24 and haven't tried to get overlay picture quality to its best, but I played with the settings under VMR9 until the Sage output approximated the brightness and color depth of regular TV. I do lose sharpness and contrast, but it compares pretty well to straight TV.

My codec settings:
Brightness -- just past half way
Contrast -- just under half way
Digital vibrance -- just under half way
all other settings in the middle.

Then, my Sony has various picture settings (movie, sports, vivid, etc). I put it on sports, and that adds a little sharpness to the picture.
  #25
Old 12-17-2004, 02:32 PM
tundrwd
Sage User
 
Join Date: Dec 2003
Posts: 22
One item I don't see listed is the quality and length of the cable being used to connect everything.

Is it good quality cable, and is it shielded? Despite being shielded, does it cross over an AC power cord, or is it near AC power?

Are you using a 12' or 25' cable? It isn't going to be good quality unless it is VERY low impedance and heavily shielded.

ALL the components in the "stream" are important, and sometimes cabling is the most important.

Try moving the cables around, or use a better quality cable (borrow one from a friend on a trial basis).
  #26
Old 12-17-2004, 02:40 PM
willemse
Sage Advanced User
 
Join Date: Oct 2003
Location: Netherlands
Posts: 91
Quote:
Originally Posted by tundrwd
One item I don't see listed is the quality and length of the cable being used to connect everything.

Is it good quality cable, and is it shielded? Despite being shielded, does it cross over an AC power cord, or is it near AC power?

Are you using a 12' or 25' cable? It isn't going to be good quality unless it is VERY low impedance and heavily shielded.

ALL the components in the "stream" are important, and sometimes cabling is the most important.

Try moving the cables around, or use a better quality cable (borrow one from a friend on a trial basis).
tundrwd,

Good point, but in my case I have made connections with different cable lengths, 1 meter and 30 meters, without any difference in quality. FYI, the 30 m cable is shielded on each signal and again as a whole cable.

nielm, thanks for the quick response. My FX 5200 just blew up on TV-out!!! The heat sink came loose, but a replacement is on its way, and I will post settings once it's up and running!!
  #27
Old 12-17-2004, 03:44 PM
mayamaniac
Sage Icon
 
Join Date: May 2004
Posts: 2,177
I always find software decoders' picture quality to be poor when watching fast-motion video such as a basketball game. And this is not just on the TV using the video card's TV-out; even on the computer CRT monitor, it looks bad. I've tried different software decoders, and the NVidia decoder seems best, but I can still see horizontal lines during fast-motion content. Whether it's VMR9 or overlay, it still cannot compare to the smooth motion of what you see on an SDTV. Maybe if I spent countless hours tweaking and messing with all the utilities out there, I could get it looking good. But I think I'll just stick with my PVR-350's TV-out, which looks awesome (although I have to put up with the crashes once in a while; switching back to the original STV seems to lessen the crashes).

I think PVR software makers should really invest in better TV-out solutions for their PVR customers. Most people don't watch recorded shows on the computer; they watch them on their TVs. As powerful as SageTV gets, it's worthless if what you see on the TV is crap. Picture quality on the TV must improve. The probable answer for the few who own HDTV LCDs/plasmas is to use non-interlaced digital outputs. But for the rest of us who own tube TVs, there has to be something better than a PVR-350 TV-out.
__________________
Mayamaniac

- SageTV 7.1.9 Server. Win7 32bit in VMWare Fusion. HDHR (FiOS Coax). HDHR Prime 3 Tuners (FiOS Cable Card). Gemstone theme.
- SageTV HD300 - HDMI 1080p Samsung 75" LED.
  #28
Old 12-17-2004, 07:32 PM
cummings66
Sage Aficionado
 
Join Date: Dec 2004
Location: Moberly, MO
Posts: 281
My PVR-350 TV-out is awesome. It's as good as my DVD player, and I'm well pleased with it.

I had some initial problems with crashes and delayed audio, but since I added a slot fan and another fan in the front of the case to pull air in, I believe the problems are fixed. I believe the 350's video output is as good as you will ever get with MPEG.

I think when HD gets supported, we'll find better outputs like DVI being used in a hardware decoding solution. Consider that you currently see around 20-30% CPU usage with software decoding and 0-2% with hardware decoding. That's SD; HD is even more intensive, so a total hardware solution would be best in order to not require an awesome computer to handle it.

For example, right now I'm watching JAG, which I recorded off a wildfeed, replying here, and reading my email. I have 97% free CPU and no slowdowns in anything. With HD, I don't think I could capture or play back while doing all those things using software decode.

Software decoding has advantages over hardware: upgrades are easily done, and you can sometimes improve the image quality quite a bit with just a newer version.
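
To put rough numbers on cummings66's point that HD is "even more intensive": decode cost doesn't scale exactly linearly with pixel rate, but pixel rate is a fair first approximation, and HD carries several times the pixels of SD.

Code:
# Rough pixel rates of common broadcast formats (pixels per second).
sd_480i  = 720 * 480 * 30    # ~10.4 million
hd_720p  = 1280 * 720 * 60   # ~55.3 million
hd_1080i = 1920 * 1080 * 30  # ~62.2 million

print(hd_720p / sd_480i)   # ~5.3x the pixel rate of SD
print(hd_1080i / sd_480i)  # 6.0x

So a machine showing 20-30% CPU on SD software decode could plausibly need several times that for HD, which is why hardware assistance looked so attractive at the time.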
  #29
Old 12-17-2004, 10:33 PM
stanger89
SageTVaholic
 
Join Date: May 2003
Location: Marion, IA
Posts: 15,188
Quote:
Originally Posted by cummings66
I think when HD gets supported, we'll find better outputs like DVI being used in a hardware decoding solution. Consider that you currently see around 20-30% CPU usage with software decoding and 0-2% with hardware decoding. That's SD; HD is even more intensive, so a total hardware solution would be best in order to not require an awesome computer to handle it.
Actually, I seriously doubt it. HD cards, when first released, all had hardware decoders (WinTV-HD, HiPix DTV-200, MyHD MDP-100, AccessDTV, MDP-120) because they needed them. The MDP-120 is the most recent one, and it is over a year old, IIRC. If you look at the current trend in HD cards, it's toward software decoding (Fusion I, II, III, QAM, Sasem USB HDTV, HDTV Wonder, V-Box DTA-150); furthermore, all the development on the software side is moving toward software decoding (MCE 2005 most notably).

If you look at the dedicated HW decoder market, there are currently no HDTV decoders (other than those on HDTV tuners), and there's really been no development on the SD side either; the Xcard is really the most recent card (there are a couple of others, but those are more business-targeted).

Also, it doesn't take a monster PC to play HD, at least not if you have a video card with hardware acceleration. For example, I've played files on my 1.5 GHz Athlon XP 1800 with a lowly GeForce4 MX; I think I may have even played them on my Athlon 900 (not sure, that was a long time ago). I know Dvico recommends a 1.6 GHz-class machine to play HDTV with DXVA, and that's getting to be a rather low-end machine these days. Heck, I just tried, and I can play HD with full software decoding (DScaler 5) on my AXP 1800+.

We're already at the point where HW decoding is not necessary for playback of even HD, power-wise. I think it's almost certain that we are moving totally away from hardware decoding, at least in the traditional sense. Where I see us going is actually probably somewhere in between software and hardware decoding (as we think of them today).

The GeForce 6 series is the first hint of what's to come. If you look, they have essentially programmable hardware video decoders built into the chip; they don't do all the decoding, but they are doing more and more. The biggest hurdles right now are in configuring a solid playback setup. And there are a lot of companies spending a lot of money to get the PC into the living room (where there's a lot of money to be had). MCE is an example: MS has put a lot of money into MCE to make it act like an STB and not a "PC". With continued software development, the glitches in the playback chain will be worked out, and with hardware development and the move to HDTV, the interconnection of PC and TV will be solved. It's nearly there already; just look at how easy it is to connect the component-equipped cards to HDTVs. There are issues with DVI, but as TV manufacturers realize they need to provide EDIDs that work with PCs, and PC manufacturers get more experience, those will be worked out too.

What I see is that we'll be able to plug PCs into TVs just like we now plug them into computer monitors, and the whole PC/HT/STB/living room line will blur.

Wow, you know, I'm sounding pretty optimistic today. I've got to thank Amir and Chris from MS for that, I think. They've made a couple of reassuring posts on AVS, specifically that HD-DVD and Blu-ray are being designed with PC compatibility in mind, not PC incompatibility. Only time will tell...
  #30
Old 12-18-2004, 02:23 AM
jan smit
Sage Advanced User
 
Join Date: Jul 2003
Location: Haarlem Netherlands
Posts: 159
Thanks a lot for all these reactions. I'm trying to digest all this data; the links from nielm and glbrown are especially instructive. I am now at least certain that my problems are caused by (de)interlacing; the question is how to solve them.
I have therefore tried to explain to myself how the system works, and I would very much appreciate your reactions to this theory:
Step 1: The analog TV signal gets digitized by the Hauppauge hardware.
Step 2: The Hauppauge h/w encodes the digital data into MPEG format (720x576).
Step 3: The MPEG gets (buffered and) written to the hard disk.
Step 4: The MPEG (from the buffer) gets software-decoded (NVDVD) and sent to the graphics card (FX5200).
Step 5: Graphics hardware/software renders and scales the image to fit the monitor (17" CRT) screen; at the same time it produces the picture for TV-out.

Questions:
1. Am I forgetting (important) steps, or am I maybe totally wrong in my assumptions?
2. Where in the process does de-interlacing take place:
a. as part of step 1
b. as part of step 2
c. between steps 1 and 2
d. somewhere else?

3. If 2a, 2b, or 2c, then de-interlacing happens on the Hauppauge card (hardware/firmware?); how can I improve the quality of this de-interlacing step?

Conclusion: Since even the monitor picture (as well as the TV picture) shows these interlace problems, the problem must arise before step 3 (on the Hauppauge card), unless my theory is wrong.

BTW, I am using exactly the same settings as mc2wheels.

Thanks for your reactions.

Last edited by jan smit; 12-18-2004 at 02:26 AM.
  #31
Old 12-18-2004, 03:14 AM
nielm
SageTVaholic
 
Join Date: Oct 2003
Location: Belgium
Posts: 4,496
De-interlacing is done halfway through step 4!

4a: The MPEG2 stream is decoded to generate the 2 fields that make up each video frame (odd lines and even lines).
4b: These 2 fields are combined using whatever deinterlacing algorithm is configured.
4c: The resulting frame is sent to the gfx card (actually, with hardware MPEG2 acceleration, the video card also does some of 4a and 4b).

And then step 5:
5a: Graphics hardware/software renders and scales the image to fit the video window resolution.
5b: Gfx hardware generates the analogue VGA output at desktop resolution.
5c: Gfx hardware RE-scales the desktop to TV resolution (often with some smoothing and anti-flicker algorithms) and splits the frame into 2 interlaced fields to generate the interlaced analogue TV signal on the composite or S-video output.
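
Steps 5a-5c are worth a small illustration, since the re-scaling and re-interlacing on the TV-out path is exactly where softness can creep in. A minimal sketch (Python/NumPy, nearest-neighbour scaling; real gfx hardware uses filtered scaling plus flicker reduction):

Code:
import numpy as np

def rescale(frame, out_h, out_w):
    """5a/5c: nearest-neighbour rescale. Real hardware filters while
    scaling, which smooths but also softens the picture."""
    ys = np.arange(out_h) * frame.shape[0] // out_h
    xs = np.arange(out_w) * frame.shape[1] // out_w
    return frame[ys][:, xs]

def split_into_fields(frame):
    """5c: split a progressive frame back into two fields for the
    interlaced composite/S-video output."""
    return frame[0::2], frame[1::2]

# One 720x576 PAL frame through steps 5a-5c (decoded frame stubbed):
frame = np.random.randint(0, 256, (576, 720), dtype=np.uint8)
desktop = rescale(frame, 600, 800)               # 5a/5b: 800x600 desktop
tv = rescale(desktop, 576, 720)                  # 5c: back to TV resolution...
top_field, bottom_field = split_into_fields(tv)  # ...and back into fields

Note the double rescale (576 lines up to 600 and back down to 576): that round trip is one reason a gfx card's TV-out tends to look softer than the PVR-350's TV-out, which, as noted earlier in the thread, sends the fields to the TV untouched.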
  #32
Old 12-18-2004, 04:55 AM
mayamaniac
Sage Icon
 
Join Date: May 2004
Posts: 2,177
If we all switched to HD hardware and all the TV networks switched their shows to HD, then we wouldn't be complaining about TV picture quality in this thread. The point is that as of now (near the end of 2004), most of us are not experiencing the HD goodness, and most of the shows on TV are not in HD. Last I heard, TV networks in the US are no longer required to switch to HD by 2006, because it costs too much and is just impossible; it's like switching to electric cars. So if you are waiting for HDTV to take over, don't hold your breath.

But my point is that most of us will be stuck with standard definition (SDTV) for quite a while, even after HDTV gets popular. It's not like we will all throw away our current TV sets. So in the meantime, we will suffer with crappy interlaced picture quality while watching fast action scenes. Or maybe we should all learn to like soap operas or the Lifetime channel, where the content is nice and slow, so the picture quality is great.

Or maybe the Frey folks who created this great TV software will find a better TV-out solution for us and save us from the soaps.

Or just get SageTV to work better with the PVR-350. Even better, partner with Hauppauge to make a dedicated TV-out decoder card that works with both SDTV and HDTV.
__________________
Mayamaniac

- SageTV 7.1.9 Server. Win7 32bit in VMWare Fusion. HDHR (FiOS Coax). HDHR Prime 3 Tuners (FiOS Cable Card). Gemstone theme.
- SageTV HD300 - HDMI 1080p Samsung 75" LED.

Last edited by mayamaniac; 12-18-2004 at 04:58 AM.
  #33
Old 12-18-2004, 06:07 AM
jan smit
Sage Advanced User
 
Join Date: Jul 2003
Location: Haarlem Netherlands
Posts: 159
Thanks, nielm. So if I understand correctly, the steps that matter for my problem are 4a and 4b; after that, the quality of the graphics card is what counts, and the problem does not lie in the Hauppauge card/driver.
Which settings in Sage are decisive for steps 4a and 4b?
I have tried all kinds of combinations for DXVA MPEG mode and DXVA deinterlacing, but I cannot see any difference in picture quality.
  #34
Old 12-18-2004, 09:30 AM
stanger89
SageTVaholic
 
Join Date: May 2003
Location: Marion, IA
Posts: 15,188
mayamaniac,

I'm fully aware that the transition from SDTV (content) to HDTV (content) will take a very long time, and that SDTV will never be completely gone. What I was commenting on was twofold: first, the idea that HDTV requires a hardware decoder, and second, that with the transition to HDTV hardware (which will happen much more quickly for people like us than for the public at large), the deficiencies of current PC video output technology disappear.
  #35
Old 12-18-2004, 09:36 AM
stanger89
SageTVaholic
 
Join Date: May 2003
Location: Marion, IA
Posts: 15,188
Quote:
Originally Posted by jan smit
Thanks, nielm. So if I understand correctly, the steps that matter for my problem are 4a and 4b; after that, the quality of the graphics card is what counts, and the problem does not lie in the Hauppauge card/driver.
Which settings in Sage are decisive for steps 4a and 4b?
Maybe just to clarify: the line between steps 4 and 5 can be a very blurry one, especially when you're using DXVA, since the software basically unpacks the MPEG file, does a little prep, and then passes the video to the video card for a lot of the decoding and probably all of the deinterlacing. Of course, you can draw a hard line and do all the decoding and deinterlacing in software.

So what things contribute to overall quality?
1) The decoders. Despite the fact that MPEG is a standard, all decoders are different; the latest nVidia decoders and the new DScaler 5 decoders seem to be among the most highly regarded.
2) The video card. The best examples are the GeForce 5 and 6 series, which offer smart deinterlace control (cadence-based inverse telecine) and pixel-adaptive deinterlacing (if you can't tell, I like my 6800); a sketch of the idea follows at the end of this post.

Unfortunately, beyond the decoder and renderer selection, the other settings in Sage don't really do anything.

I may have to get ambitious and take/post some nice hi-res pictures of different decoders, settings, etc. We'll see...
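
The pixel-adaptive deinterlacing mentioned in point 2 can be sketched in a few lines. The real GeForce implementation is proprietary and far more sophisticated; this only shows the core idea of choosing per pixel between weaving (keep full resolution where the image is static) and interpolating (avoid combing where it moves):

Code:
import numpy as np

def pixel_adaptive(top, bottom, threshold=24):
    """Toy pixel-adaptive deinterlacer. Where a bottom-field pixel
    agrees with its top-field neighbours, weave it in unchanged;
    where it disagrees (combing, i.e. motion), substitute an
    interpolation of the neighbouring top-field lines. The threshold
    is arbitrary here; real implementations adapt it."""
    h, w = top.shape
    frame = np.empty((2 * h, w), dtype=np.uint8)
    frame[0::2] = top
    t = top.astype(np.int16)
    interp = ((t + np.roll(t, -1, axis=0)) // 2).astype(np.uint8)
    moving = np.abs(bottom.astype(np.int16) - interp.astype(np.int16)) > threshold
    frame[1::2] = np.where(moving, interp, bottom)
    return frame

This is also why decoder/card combinations differ so much in this thread: the quality of exactly this weave-or-interpolate decision (and of the interpolation itself) is where the good deinterlacers earn their keep.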
  #36
Old 12-19-2004, 08:41 AM
willemse
Sage Advanced User
 
Join Date: Oct 2003
Location: Netherlands
Posts: 91
Quote:
Originally Posted by nielm
A quick question for the FX5200 people: How did you get decent contrast with VMR9 on the FX5200?

On my display, it always looks washed out, whereas Overlay looks much more vivid... Shows like 24 where everything is dark are pretty much unwatchable using VMR9...
nielm,

The attached JPGs are my FX 5200 and NVIDIA settings.
The NVIDIA video decoder is set as the MPEG2 video decoder in Sage/Settings/Detailed Settings/Video.
FYI, the contrast does not change significantly when switching from the "prefer overlay" to the "prefer VMR9" setting.

Hope this is of help, en met vriendelijke groet (and with kind regards).

Last edited by willemse; 12-19-2004 at 08:43 AM.
  #37
Old 12-19-2004, 09:38 AM
nielm
SageTVaholic
 
Join Date: Oct 2003
Location: Belgium
Posts: 4,496
Quote:
Originally Posted by willemse
FYI, the contrast does not change significantly when switching from the "prefer overlay" to the "prefer VMR9" setting.
How about when you change the Sage renderer from VMR9 to overlay? (I am not sure what the 'prefer' settings do, but I imagine they would be overridden by Sage choosing the renderer.) (Maybe I should get NVDVD...)

Last edited by nielm; 12-19-2004 at 09:41 AM.
  #38
Old 12-19-2004, 10:03 AM
willemse
Sage Advanced User
 
Join Date: Oct 2003
Location: Netherlands
Posts: 91
Quote:
Originally Posted by nielm
How about when you change the Sage renderer from VMR9 to overlay? (I am not sure what the 'prefer' settings do, but I imagine they would be overridden by Sage choosing the renderer.) (Maybe I should get NVDVD...)
Changing the Sage renderer to overlay causes the output to show a picture for a split second and then turn black, so I do not use that setting.

Given all the setting possibilities, I am thinking of producing a matrix indicating what works and what doesn't, with a quality indication for those that work.
I will share it when I get around to it.
  #39
Old 12-20-2004, 12:18 PM
mc2wheels
Sage Advanced User
 
Join Date: Nov 2004
Location: Medway, MA
Posts: 101
Upgraded from FX 5200 to 6600 GT: findings

Well, I was moderately happy with the FX5200 and the nVidia codec. Sports had some artifacting with fast motion, but with the deinterlacing in the codec options set to "median filtering," it wasn't too bad.

So then I upgraded to the Chaintech 6600GT AGP (the wife got it for me for X-mas, but gave it to me early). Anyway, I was originally horribly disappointed with the "upgrade": using S-video out, VMR9, hardware acceleration, screen at 800x600, etc., basically the same settings as with the FX 5200. Unfortunately, for some reason, when I changed cards, I lost the "median filtering" deinterlacing option, so for sports the 6600GT looked worse than the 5200. I tried overlay, 640x480, and every deinterlacing combo, and I could not get a better picture than I had with the 5200. At this point I was really bummed out.

Then I decided to try the component out that comes on the Chaintech converter cable. My Sony SDTV has component inputs on the back, and I've never had a card that outputs that. OMFG, what a huge difference: now I get no artifacting to speak of with sports. I am using VMR9, with "pixel adaptive" deinterlacing and smart deinterlacing on the nVidia codec, and the screen at 480i. The only downside is that the resolutions I can choose are limited to native HD resolutions, and my TV will only handle 480i. There is also slight overscan that cuts the desktop/Start menu off a bit around the edges, but I am willing to live with that for the better picture. Strangely, with this setting it will not let me change the screen size, though it does with the S-video resolutions. Perhaps it can be done with custom timings (they do have a screen for changing that), but I do not understand how that stuff works.

I am still slightly disappointed with the 6600GT; it is supposed to handle hardware acceleration for the nVidia VPP, but I see no evidence of that. Also, the definition/contrast of the picture is still about the same as my 5200, but I think it is pretty good.

Anyway, bottom line: if you only have S-video inputs on your TV, upgrading to the 6600GT might not give you any better picture (mine got worse). If you have component inputs, it will give you a much better picture (as far as artifacting goes), but at the cost of being able to adjust the screen size/overscan. You can always adjust that in Sage, but it doesn't help with the desktop.
  #40
Old 12-20-2004, 01:49 PM
stanger89
SageTVaholic
 
Join Date: May 2003
Location: Marion, IA
Posts: 15,188
Quote:
Originally Posted by mc2wheels
I am still slightly disappointed with the 6600GT; it is supposed to handle hardware acceleration for the nVidia VPP, but I see no evidence of that. Also, the definition/contrast of the picture is still about the same as my 5200, but I think it is pretty good.
By selecting "Smart" for deinterlace control, you are enabling the hardware-accelerated VPP. So you're already using it.