Hardware Support: Discussions related to using various hardware setups with SageTV products. Anything relating to capture cards, remotes, infrared receivers/transmitters, system compatibility or other hardware related problems or suggestions should be posted here.
|
#1
VGA -> RGB SCART cable; flickery image
I've bought one of those VGA to RGB SCART custom cables for ATI graphics cards. After several heartaches I've managed to get PowerStrip to display a usable image on my TV using it; I'm in the UK and used the settings found here:
http://www.idiots.org.uk/vga_rgb_scart/index.html and left tiling enabled on my Radeon 9250 PCI. However, the image is very flickery. It's bearable on stationary screens like menus, but the moment it starts showing TV it's pretty terrible; it goes very ugly wherever there is movement on the screen. Just wondering if anyone has any suggestions as to what is causing this, and how I might fix it. I'm using the InterVideo decoder in Sage to display video.
#2
Sounds like interlacing artifacts...
Theoretically, a VGA->SCART setup with perfect scanline matching between the source video and the TV display would give the most solid image; in practice this is difficult to achieve for various reasons. There are lots of posts on the forums regarding deinterlacing, and a thread which discusses problems with VGA SCART cables... Edit: deinterlacing artifacts -> interlacing artifacts
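To see why motion makes interlacing artifacts so visible, here is a small illustrative sketch (not from the thread; the bar widths and motion offset are made-up values). It weaves two interlaced fields of a moving object into one progressive frame and counts the mismatched pixels between adjacent scanlines, which is the "combing" that shows up as ugly edges on anything that moves:

```python
# Sketch: "combing" when weaving two fields captured 1/50 s apart.
# A 4-pixel white bar moves 4 pixels between the two field captures.

WIDTH = 16

def field_line(bar_start):
    """One scanline with a 4-pixel white bar starting at bar_start."""
    return [255 if bar_start <= x < bar_start + 4 else 0 for x in range(WIDTH)]

even_line = field_line(4)   # even field: bar at x=4..7
odd_line = field_line(8)    # odd field, captured later: bar at x=8..11

# Weave: alternate even/odd lines into one progressive frame.
frame = [even_line if y % 2 == 0 else odd_line for y in range(6)]

# Adjacent rows disagree wherever the bar moved: jagged comb edges.
for y in range(5):
    mismatches = sum(a != b for a, b in zip(frame[y], frame[y + 1]))
    print(y, mismatches)
```

On a static menu the two fields are identical, so the mismatch count is zero and the picture looks solid; on moving video every adjacent line pair disagrees, which matches the symptom described in the first post.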
__________________
Check out my enhancements for Sage in the Sage Customisations and Sageplugins Wiki

Last edited by nielm; 08-23-2005 at 03:16 AM.
#3
There is lots of talk on this subject here: http://www.avforums.com/forums/showthread.php?t=136811 including a lot of cool PowerStrip timings. I am using the cable and have great TV quality, just a little shaky on small text, e.g. the desktop or internet.
#4
Thanks very much; I'm pretty sure it must be a powerstrip timing issue. I think I need to find a precise definition of what 4:3 UK TVs are expecting! There seems to be more than one version of PAL, and then there's the question of where teletext resides... nightmare.
#5
I am using widescreen timings so I can't give you mine, but try these basic 4:3 timings and play from there:
PowerStrip timing parameters: 720x576=720,42,72,134,576,3,28,18,15125,41
Generic timing details for 720x576: HFP=42 HSW=72 HBP=134 kHz=16 VFP=3 VSW=28 VBP=18 Hz=25
Linux modeline parameters: "720x576" 15.125 720 762 834 968 576 579 607 625 interlace +hsync +vsync
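For anyone tweaking these numbers, here is a small sketch that converts the PowerStrip parameter string quoted above into the equivalent X11 modeline, as a sanity check. The field order assumed is the one PowerStrip uses (hdisp, HFP, HSW, HBP, vdisp, VFP, VSW, VBP, pixel clock in kHz, flags); the function name is mine, not a real PowerStrip API:

```python
# Sketch: PowerStrip timing string -> X11 modeline sanity check.

def powerstrip_to_modeline(params):
    h, hfp, hsw, hbp, v, vfp, vsw, vbp, clk_khz, _flags = params
    # Sync start/end/total are cumulative sums of the porch/sync widths.
    hss, hse, htot = h + hfp, h + hfp + hsw, h + hfp + hsw + hbp
    vss, vse, vtot = v + vfp, v + vfp + vsw, v + vfp + vsw + vbp
    mhz = clk_khz / 1000.0
    # Frame rate = pixel clock / (total pixels per frame); with
    # interlace this is 25 frames/s, i.e. 50 fields/s for PAL.
    frame_hz = clk_khz * 1000.0 / (htot * vtot)
    modeline = (f'"{h}x{v}" {mhz:.3f} {h} {hss} {hse} {htot} '
                f'{v} {vss} {vse} {vtot} interlace +hsync +vsync')
    return modeline, frame_hz

ml, hz = powerstrip_to_modeline((720, 42, 72, 134, 576, 3, 28, 18, 15125, 41))
print(ml)  # reproduces the Linux modeline quoted in the post
print(hz)  # 25.0 frames/s (50 interlaced fields/s, PAL)
```

Working through it: 720+42=762, 762+72=834, 834+134=968 horizontally, and 576+3=579, 579+28=607, 607+18=625 vertically, which matches the 625-line PAL raster exactly.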
#6
Those are the ones from http://ryoandr.free.fr/english.html, aren't they? I started off with them, but got a bizarre image - the mouse pointer was OK, but the rest of the screen, though static, was a total mess. Sounded like the tiling issue, but turning tiling off made things worse; my 9250 really doesn't like it, as with tiling off even my VNC image was corrupted!
Only going to the 720x540 resolution suggested on http://www.idiots.org.uk/vga_rgb_scart/index.html has given a usable screen so far. I'll carry on playing.
#7
Got it to work in 4:3 by messing about with the timings. Since I'm using a Hauppauge to capture an analogue signal, everything I capture is captured in 4:3; however, the actual source is often designed for 16:9. When I was using the Radeon's S-Video out I was able to deal with this by manually setting the aspect ratio to 16:9 in Sage - then it evidently does something clever to compress the image to 16:9 dimensions so it looks right on the screen. Well, now when it does that I get the interlacing weirdness again. Curiously, the interlacing weirdness seems to be worse when the recording quality is higher, and it's particularly bad in sports.
I'm not clear as to whether this is a sign that the PowerStrip timings are still less than perfect, whether it's an inevitable consequence of playing an MPEG-2 file at a different aspect ratio to the one it was recorded at, or whether it's something to do with the codec I'm using. It doesn't seem to happen when using the Sage decoder, but that doesn't use hardware decoding, and on my ancient machine Girder can't get any CPU time while it's decoding, which is frustrating! I suppose I'd expect it still to be a PowerStrip timing issue; after all, in terms of what's getting to the screen, the computer must just be outputting the specified resolution, and if the screen can't put that together properly, that's the fault of the resolution, right?
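One plausible reason the 16:9 setting brings the weirdness back (a hedged guess, not something established in the thread): displaying 16:9 content inside a 4:3 frame forces a vertical rescale, which breaks the one-capture-line-to-one-TV-line mapping that interlaced output depends on. A quick sketch of the geometry for a 576-line frame, with a hypothetical helper of my own naming:

```python
# Sketch: vertical size of 16:9 content letterboxed in a 4:3 frame.

def letterbox_height(frame_h, frame_aspect=(4, 3), content_aspect=(16, 9)):
    """Active picture height when content_aspect is shown inside frame_aspect."""
    fa = frame_aspect[0] / frame_aspect[1]   # 4:3  -> 1.333...
    ca = content_aspect[0] / content_aspect[1]  # 16:9 -> 1.777...
    return round(frame_h * fa / ca)

active = letterbox_height(576)
bars = (576 - active) // 2
print(active, bars)  # 432 active lines, 72 black lines top and bottom
```

Squeezing 576 captured lines into 432 displayed lines means the scaler mixes lines from both fields, so motion between fields smears into exactly the kind of artifact described above; at 1:1 (no aspect correction) each field maps cleanly onto the TV's own fields.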