SageTV Community  


Hardware Support Discussions related to using various hardware setups with SageTV products. Anything relating to capture cards, remotes, infrared receivers/transmitters, system compatibility or other hardware related problems or suggestions should be posted here.

#21
Old 09-25-2006, 04:02 PM
GTwannabe
Sage Aficionado
 
Join Date: Dec 2004
Posts: 434
Quote:
Originally Posted by lobosrul
Ehh... a 7800GTX smokes a 7600GT.

I remember when those babies came out last summer ('05) they were a giant leap forward in video card speed.

If you don't believe me: http://www23.tomshardware.com/graphi...=529&chart=231

Look for the two lines in blue.

Edit: my bad, sort of (the GTX selected above is a 512MB OC'd model). See this chart: http://www23.tomshardware.com/graphi...=529&chart=231 Still much higher, though. And yes, the clock speed of the GTX is lower, but it has way more pipelines, I'm sure.
In 3D games, yes. However, PureVideo uses dedicated silicon on the GPU. That's why its performance scales directly with clockspeed and not the number of pixel pipes or memory bandwidth.
__________________
Intel NUC SageTV 7 server - HDHomeRun PRIME - 2TB iSCSI ReadyNAS storage
Intel i3 HTPC SageTV 7 Client - Win 7 x64 - Onkyo TX-674
#22
Old 09-25-2006, 08:33 PM
autoboy
Sage Aficionado
 
Join Date: Aug 2006
Posts: 477
We are not debating the 3D performance of these cards. 3D performance does not always scale with video performance. The 7300GT is an OK low-range card for 3D, but it gets smoked by a crappy 7300LE in 2D.

Also, I found that my NVIDIA card has two different clocks that can be adjusted separately. In nTune, my 7300LE has a 2D clock at 450MHz and a 3D clock also at 450MHz. When I do a manual overclock, I only adjust the 3D clock. When I do a custom automatic overclock, nTune adjusts both clocks separately. With a 20-minute test, nTune got my card to 521MHz in 2D and 524MHz on the 3D test. Memory went to 850MHz.

This is interesting and could help us experiment with PureVideo speed. We can try increasing the 3D clock and see if anything improves with PureVideo.

Quote:
That's why its performance scales directly with clockspeed and not the number of pixel pipes or memory bandwidth.
We can't say for sure whether or not the shaders are used for some advanced deinterlacing features. We also don't know whether memory performance is a factor. We know that H.264 acceleration is done in the 2D silicon and scales with clockspeed, but we don't know how memory bandwidth factors in.

In fact, we could be focusing on the wrong thing for the 6150. Perhaps the memory bandwidth is the limiting factor preventing smooth 1080p playback, not the limited pixel shaders. If the shaders are being used, shouldn't the power requirements rise? We should check the power usage under video load compared to 3D games.
#23
Old 09-25-2006, 09:04 PM
stanger89
SageTVaholic
 
Join Date: May 2003
Location: Marion, IA
Posts: 15,188
Quote:
Originally Posted by autoboy
In fact, we could be focusing on the wrong thing for the 6150. Perhaps the memory bandwidth is the limiting factor preventing smooth 1080p playback and not the limited pixel shaders.
One user (don't remember if it was here or AVS; here, I think) reported that going from a single-channel to a dual-channel memory configuration on their 6150 board made a significant difference in high-resolution VMR9 performance. So memory bandwidth apparently is an issue.

Quote:
If the shaders are being used, shouldn't the power requirements rise? We should check the power useage under video load compared to 3D games.
Probably depends on how they're used, among other things.
#24
Old 09-26-2006, 08:49 AM
lobosrul
Sage Expert
 
Join Date: Aug 2005
Location: Albuquerque, NM
Posts: 573
You guys are telling me the fill rate of a card doesn't affect VMR9 playback? It only relies on the clock speed?

I would've thought 2D clock speed would only be relevant for overlay playback.

Anyway, I had smooth playback with a 6600GT (when using PureVideo). I only bought a 7800GT for gaming.
#25
Old 09-26-2006, 09:13 AM
Kirby
Sage Icon
 
Join Date: Jan 2006
Posts: 1,253
I'm not saying that. AFAIK, VMR9 uses the 3D pipeline in the GPU.
__________________
Sage Server: HP ProLiant N40L MicroServer, AMD Turion II Neo N40L 1.5GHz Dual Core, 8GB Ram, WHS2011 64bit, Sage 7.1.9 WHS, HDHR (1 QAM, 1 OTA), HDHR Prime 3CC, HD-PVR for copy-once movie channels
HTPC Client:Intel DH61AG, Intel G620 cpu, 8GB ram, Intel 80GB SSD, 4GB RamDisk holding Sage/Java/TMT5
Sage Client:Sage HD-200 Extender
#26
Old 09-26-2006, 09:19 AM
malore
Sage Fanatic
 
Join Date: Aug 2003
Location: Iowa
Posts: 877
Quote:
Originally Posted by Kirby
So who has tried the 7600GT passive cooling? How's it doing for you?
I'm using the GIGABYTE GV-NX76T256D-RH GeForce 7600GT in my main/game computer and it's been doing fine, with no overheating problems. However, it is very large, taking up three slots. I can't comment on its HD ability, because I haven't tried it.
#27
Old 09-26-2006, 11:00 AM
stanger89
SageTVaholic
 
Join Date: May 2003
Location: Marion, IA
Posts: 15,188
Quote:
Originally Posted by lobosrul
You guys are telling me the fill rate of a card doesnt affect VMR9 playback? It only relies on the clockspeed?
I think what we can conclude is that hardware decode acceleration is not affected by fill rate, only by clock speed (and probably the 2D clock speed at that).

Fill rate does, or probably does, come into play for VMR9, including advanced deinterlacing/film detection, etc.
#28
Old 09-26-2006, 02:12 PM
blade
SageTVaholic
 
Join Date: Jan 2005
Posts: 2,500
According to walford over on the AVS forums you need 10GB/sec of memory bandwidth to handle full-resolution 1080i. I have no idea how he came up with that number.

I did some testing with my 6600GT last night running at 1080i underscanned to 1688x1004. I used Fraps, and what I considered smooth was when I got the full fps. Playback would appear smooth at lower settings but lacked full frame rates. I had AF set to 16x and everything at the highest quality settings; AA was off. I needed the following memory speeds to get smooth playback with VMR9:

390MHz (128-bit bus) - 20Mbps, 1080p, film (24 fps)
780-980MHz (128-bit bus) - 18Mbps, 1080i, video (60 fps)

For video content it varied a lot from clip to clip, so I included a range. Notice that video runs at 2.5x the fps, and I needed about 2.5x the memory speed for most of my clips. I'm trying to download a clip that claims to be 40Mbps and another at around 12Mbps to see if the bitrate affects memory speed requirements.

Maybe someone else can confirm this, because I'm not all that great at testing this junk. Also, does anyone have any thoughts to explain exactly what is going on?
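The 2.5x observation above can be sanity-checked with simple arithmetic (all numbers come from this post; the linear-scaling assumption is mine, not an established rule):

```python
# If the required memory clock scaled linearly with output frame rate,
# the 24 fps film measurement would predict the 60 fps video requirement.
film_fps, film_mem_mhz = 24, 390      # measured: film content
video_fps = 60                        # video content output rate

predicted_mem_mhz = film_mem_mhz * (video_fps / film_fps)
print(predicted_mem_mhz)  # 975.0, inside the measured 780-980 MHz range
```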

Last edited by blade; 09-26-2006 at 02:55 PM.
#29
Old 09-27-2006, 11:06 AM
peternm22
Sage Expert
 
Join Date: Jan 2005
Posts: 709
I e-mailed NVIDIA a few days ago asking why the AGP variants of the 7600GS and 7600GT weren't listed on their chart. I haven't received a reply from them yet, but the chart has been updated: http://www.nvidia.com/page/purevideo_support.html

It now has the 7600GS AGP listed (though still no 7600GT).

The supported features of the 7600GS AGP are identical to the PCIe version's, which is good news for us AGP folks.

-Peter
#30
Old 09-27-2006, 11:16 AM
lobosrul
Sage Expert
 
Join Date: Aug 2005
Location: Albuquerque, NM
Posts: 573
Quote:
Originally Posted by blade
According to walford over on the AVS forums you need 10GB/sec of memory bandwidth to handle full-resolution 1080i. I have no idea how he came up with that number.

Hmm, that sounds high. Even after decoding, the uncompressed video isn't nearly that big.
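A rough calculation backs up that intuition. Assuming decoded frames are stored in a 4:2:0 planar format such as YV12 (1.5 bytes per pixel, which is my assumption about the decoder), one pass over the raw pixel stream is well under 10GB/sec, so a requirement that high would have to come from deinterlacing and post-processing making multiple read/write passes over every frame:

```python
# One pass over decoded 1080-line video at the deinterlaced 60 fps rate.
width, height = 1920, 1080
bytes_per_pixel = 1.5        # 4:2:0 planar (e.g. YV12)
fps = 60

frame_bytes = width * height * bytes_per_pixel   # ~3.1 MB per frame
rate_gbs = frame_bytes * fps / 1e9
print(rate_gbs)  # ~0.19 GB/s, a small fraction of the quoted 10 GB/s
```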
#31
Old 09-27-2006, 11:18 AM
blade
SageTVaholic
 
Join Date: Jan 2005
Posts: 2,500
Quote:
Originally Posted by peternm22
The supported features of the 7600GS AGP are identical to the PCIe version. Which is good news for us AGP folks.
Hopefully it is good news, but it may not be. They re-added WMV9 acceleration to the chart for the AGP versions of the 6600GT, and it still doesn't work. Also, the H.264 acceleration still doesn't work unless you have an SSE2-capable CPU.
#32
Old 09-27-2006, 11:27 AM
peternm22
Sage Expert
 
Join Date: Jan 2005
Posts: 709
Quote:
Originally Posted by blade
Hopefully it is good news, but it may not be. They re-added WMV9 acceleration to the chart for the AGP versions of the 6600GT, and it still doesn't work. Also, the H.264 acceleration still doesn't work unless you have an SSE2-capable CPU.
My enthusiasm has now been tempered.

-Peter
#33
Old 09-27-2006, 11:01 PM
blade
SageTVaholic
 
Join Date: Jan 2005
Posts: 2,500
Well, it seems everyone has lost interest in this, but just in case anyone still cares, here's a quote from this thread:

Quote:
To perform advanced motion adaptive deinterlacing with PureVideo at 1080p resolutions requires - last time I checked, which was a while ago - 10GB/s of graphics memory bandwidth minimum, which made the 6600GT entry level for true HD deinterlacing.
If you compare the cards that do Spatial-Temporal De-Interlacing for HD Video with their memory bandwidth according to this chart, it does appear that the only cards capable of it have over 10GB/sec of memory bandwidth. I have no clue whether the guy knows what he's talking about or it's just a coincidence.
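Peak memory bandwidth for these cards can be estimated from the effective (DDR-doubled) memory clock and the bus width. The sketch below uses the commonly quoted stock 6600GT memory spec (500MHz DDR, i.e. 1000MHz effective, on a 128-bit bus) purely as an illustration:

```python
def peak_bandwidth_gbs(effective_clock_mhz, bus_width_bits):
    """Peak memory bandwidth in GB/s, given the DDR-effective clock."""
    return effective_clock_mhz * 1e6 * (bus_width_bits / 8) / 1e9

# Stock 6600GT: 1000 MHz effective memory clock, 128-bit bus.
print(peak_bandwidth_gbs(1000, 128))  # 16.0 -- comfortably over 10 GB/s
```

This matches the observation that the 6600GT sits above the supposed 10GB/sec cutoff for HD spatial-temporal deinterlacing.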
#34
Old 09-27-2006, 11:12 PM
blade
SageTVaholic
 
Join Date: Jan 2005
Posts: 2,500
Quote:
FiringSquad: How much of PureVideo is done with the dedicated VPU components of the chip versus the 3D pipeline?

Scott Vouri: I’m really glad you asked this because it is one of the coolest things about our programmable video processor. Our PureVideo technology actually does processor load-balancing across all the video cores and the 3D rendering engine. That way we can process multiple tasks at once or process different stages of the video pipeline at the same time.

[Alan's comments: That wasn't as much detail as I was interested in hearing, but in a follow-up question, NVIDIA suggested the example of performing hardware accelerated decode and some post-processing effects on the internal VPUs and then doing additional color-enhancements or post-processing on the 3D engine.]
From this old interview.
#35
Old 09-28-2006, 09:31 AM
malore
Sage Fanatic
 
Join Date: Aug 2003
Location: Iowa
Posts: 877
As is usually the case for me, I find out what I really need after buying something and using it for a while. I purchased a cheap GeForce 6200 and was very happy with it until I tried to watch my first 1080i HDTV show, and the stuttering was headache-inducing. It's outputting underscanned 720p.

After much trial and error with different decoders and overlay versus VMR, I've found I can get reasonably smooth playback by overclocking the video card. I highly recommend Fraps and wish I had used it sooner, because it easily verifies what you think you are seeing. I have my card overclocked to 415/700, and Fraps shows a fairly steady 60 with only an occasional blip, which might be from noise in the recording. When I returned it to the stock 350/650, the frame rate was in the 50s. I'm using the NVIDIA PureVideo decoders (223) with VMR9 and FSE. I have the 92.91 beta drivers installed.

1080i playback definitely stresses the GPU, which runs hotter than during 720p playback. If you choose to overclock, test playback while nothing critical is recording, because I've had the video lock up after watching a show for a while when I had it overclocked even higher.
#36
Old 09-28-2006, 10:16 AM
blade
SageTVaholic
 
Join Date: Jan 2005
Posts: 2,500
Quote:
Originally Posted by malore
I highly recommend Fraps and wish I had used it sooner, because it easily verifies what you think you are seeing. I have my card overclocked to 415/700 and Fraps shows a fairly steady 60 with only an occasional blip which might be from noise in the recording. When I returned it to the stock 350/650 the frame rate was in the 50s.
I agree about Fraps. I've been using it for months, and it's very useful, especially when over- or underclocking things to see how they affect playback.

My 6200 has no problems with 1080p film (24 fps); I only run into problems with video (60 fps). You should see the same thing. This may be why some shows play back smoothly and others stutter for some people.
#37
Old 09-28-2006, 01:14 PM
blade
SageTVaholic
 
Join Date: Jan 2005
Posts: 2,500
Out of boredom I keep tinkering. I only tested this on 1080i video content, because the requirement for film is so low there isn't any point. Again, this was on my 6600GT. I underclocked to the lowest speed that still gave me the full 60 fps for video content, according to Fraps, with vertical stretch deinterlacing. Then I switched back to per-pixel adaptive to see what sort of frame rates it would get at the same speed.

550MHz GPU / 500MHz (128-bit) memory
Vertical stretch deinterlacing - 60 fps
Per-pixel adaptive - 40-50 fps

335MHz GPU / 1GHz (128-bit) memory
Vertical stretch deinterlacing - 60 fps
Per-pixel adaptive - 40 fps

To get smooth playback with per-pixel adaptive I needed 500-550MHz, depending on the video clip; some were more demanding than others. Since many other cards operate at slower clock speeds and can still do per-pixel adaptive, it seems the deinterlacing is mostly dependent on fill rate and memory bandwidth. Since H.264 and WMV9 are broken for my card I can't test acceleration, but others have already confirmed it is dependent on GPU speed. I've never noticed any difference in CPU usage for HD MPEGs whether I'm at 350 or 550MHz, so I guess GPU clock only really matters for decoding WMV and H.264.

I've played around with my 2D clock speed and have yet to see any difference from it. Not sure if it plays any role or not.

Last edited by blade; 09-28-2006 at 01:26 PM.
#38
Old 09-29-2006, 09:53 AM
AtariJeff
Sage Aficionado
 
Join Date: Nov 2005
Location: Ontario, Canada
Posts: 276
Quote:
Originally Posted by malore
As is usually the case for me, I find out what I really need after buying something and using it for a while. I purchased a cheap GeForce 6200 and was very happy with it until I tried to watch my first 1080i HDTV show, and the stuttering was headache-inducing. It's outputting underscanned 720p. After much trial and error with different decoders and overlay versus VMR, I've found I can get reasonably smooth playback by overclocking the video card. I highly recommend Fraps and wish I had used it sooner, because it easily verifies what you think you are seeing. I have my card overclocked to 415/700, and Fraps shows a fairly steady 60 with only an occasional blip, which might be from noise in the recording. When I returned it to the stock 350/650, the frame rate was in the 50s. I'm using the NVIDIA PureVideo decoders (223) with VMR9 and FSE. I have the 92.91 beta drivers installed.

1080i playback definitely stresses the GPU, which runs hotter than during 720p playback. If you choose to overclock, test playback while nothing critical is recording, because I've had the video lock up after watching a show for a while when I had it overclocked even higher.
Are you happy with the PQ of the 6200 with the newer drivers? I had stair-stepping so bad that I had to, yet again, back out to version 84.21. Now it's great.
#39
Old 09-30-2006, 11:28 AM
autoboy
Sage Aficionado
 
Join Date: Aug 2006
Posts: 477
Quote:
According to walford over on the AVS forums you need 10GB/sec of memory bandwidth to handle full-resolution 1080i. I have no idea how he came up with that number.
This was from the Windows Media Player website. It was a somewhat arbitrary number that was supposed to screen out the slower video cards across all GPUs. This number is why I chose to use a 9600XT, since its memory bandwidth was just about 10GB/sec. It turned out to be less than ideal. There are so many more factors than just memory bandwidth in 1080i video.

This is good testing. Once you guys get a finalized Fraps test with all the info, I'll update the guide. It has been wrong too many times for me to keep putting my own views up there; I need facts supported by evidence for your bandwidth issues. BTW, I've still never had any problems with my 7300LE in 1080i video, except when I ran out of main memory. I'm only using 512MB, and Sage can use 160MB sometimes.
#40
Old 10-02-2006, 08:53 PM
Tighr
Sage User
 
Join Date: Jul 2004
Location: Bakersfield, CA
Posts: 15
So what are the best options for us AGP folks who want good HD playback?

I've currently got an ATI Radeon 9600 and was looking for a good AGP card to upgrade to, since my HD playback isn't very smooth. The cheapest 7xxx on Newegg is a 7600GS for $125.
Reply With Quote