Poor quality over HDMI and BluRay

Poor quality over HDMI and BluRay

Author
Discussion

Davie_GLA

Original Poster:

6,646 posts

205 months

Thursday 27th August 2009
quotequote all
Hello all.

I have noticed that my TV is showing what should be HD films as being 'grainy'.

I've noticed it on my DVD player, which to be fair upscales the DVDs, but I have the same problem when playing Blu-rays on my PS3.

Now the Blu-ray was Ghostbusters, so I suppose that would have had to be converted at some point, but the picture quality was crap.

Also, the cables I'm using aren't exactly top quality - could this be the issue?

Thanks in advance,

David.

Plotloss

67,280 posts

276 months

Thursday 27th August 2009
quotequote all
Probably the cable.

Despite what people who know no better will tell you, digital isn't just digital.

Davie_GLA

Original Poster:

6,646 posts

205 months

Thursday 27th August 2009
quotequote all
Yeah, I was thinking the cable.

And I'm ready to learn why digital isn't just digital.

marctwo

3,666 posts

266 months

Thursday 27th August 2009
quotequote all
Digital signals have error correction. Contrary to popular belief, loss of data does not equate to no picture. If some data is missing/corrupted then you still get a picture but you'll get some noticeable degradation and artefacts. A better cable will mean less missing/corrupted data and less error correction = better picture.
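As an aside, the difference between merely detecting an error and actually correcting one can be sketched with toy codes (this is purely an illustration of the general idea, not the actual HDMI wire format):

```python
# Toy illustration: error detection vs error correction.
# A parity bit can only tell you something went wrong; a repetition code
# carries enough redundancy to repair a single flipped bit.

def detect_parity(bits):
    """Even parity: last bit is a check bit. Returns True if no error detected."""
    return sum(bits) % 2 == 0

def correct_repetition(bits):
    """Repetition code: each data bit is sent three times in a row; a majority
    vote recovers the original even if one copy was flipped."""
    return [1 if a + b + c >= 2 else 0
            for a, b, c in zip(bits[0::3], bits[1::3], bits[2::3])]

word = [1, 0, 1, 1]
with_parity = word + [sum(word) % 2]                 # [1, 0, 1, 1, 1]
with_repeat = [b for b in word for _ in range(3)]    # each bit tripled

# Flip one bit of each in transit:
with_parity[1] ^= 1
with_repeat[4] ^= 1

print(detect_parity(with_parity))       # False: error detected, but unfixable
print(correct_repetition(with_repeat))  # [1, 0, 1, 1]: error corrected
```

The trade-off is bandwidth: the repetition code triples the data sent, which is why real links use much cleverer codes.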

Davie_GLA

Original Poster:

6,646 posts

205 months

Thursday 27th August 2009
quotequote all
Cool, every day's a school day.

So the cheap £10 cable I bought in HMV ain't gonna cut it?

OldSkoolRS

6,832 posts

185 months

Thursday 27th August 2009
quotequote all
Try another disc before you dismiss the cable; some discs show more film grain than others. Whether it should be visible can become a source of heated debate, so I'll stay out of that argument... Also, you may need to adjust the contrast and brightness settings on your TV to match the Blu-ray player's output. Most modern TVs allow different settings for each input, so no worries about upsetting your existing TV picture.

Personally, I've only found one long (10m) HDMI cable that was poor: it simply didn't work at 1080p, but was fine at lower resolutions. Its replacement was only about £50 (10 metres long, remember) and has worked perfectly at 1080p up to 60Hz (the highest data rate). I'm not a believer in different cables making a difference to picture quality, having tested some short ones with my Blu-ray player going into my projector on a very large screen.

Roop

6,012 posts

290 months

Thursday 27th August 2009
quotequote all
Cable generally makes a difference over longer runs. Try a recent movie. Some of the Pixar type ones are quite good for showing off Blu-Ray. If it still looks crap, check your settings in player and TV. Failing that, try another cable.

6655321

73,668 posts

261 months

Thursday 27th August 2009
quotequote all
Do you actually have your TV set up properly, as well as the PS3?

derestrictor

18,764 posts

267 months

Thursday 27th August 2009
quotequote all
The Ghostbusters Blu-ray is inherently noisy/grainy. It has ZILCH to do with the lead quality.

If your viewing distance is inappropriate it will look poor even on a Pioneer Kuro or Panny G/V series plasma. On the other hand, at a viable distance, the effect becomes a non-issue and you are left with excellent clarity/focus and good colours (as opposed to Die Hard, for example - no grain, but all the revelation [relatively speaking] of a mud pond).

I run 10m HDMIs in the primary vegetation chamber and there is NO subjective diminishment of quality.

Lead quality on such runs IS pertinent, granted.

This is all before considering the calibre of signal processing at hand. To wit, which TV are ye using?

Davie_GLA

Original Poster:

6,646 posts

205 months

Thursday 27th August 2009
quotequote all
Thanks for the replies.

Good point on viewing distance - my room is a little small for a 42" plasma.

The telly is a Samsung QE42 something or other, and as far as setup goes, no, I haven't made any changes - should I have? Can someone walk me through the basic setup?

And granted, the Ghostbusters film wasn't the best benchmark, as I suppose it's been 're-mastered'.

Thanks again,

D.

6655321

73,668 posts

261 months

Thursday 27th August 2009
quotequote all
I know when I got my PS3, I had to go into system settings and tell it what HD capability I had, etc., and despite it being on auto by default on my TV, you can change that setting on my TV as well.

Plotloss

67,280 posts

276 months

Thursday 27th August 2009
quotequote all
derestrictor said:
It has ZILCH to do with the lead quality.
A bold statement, given that we know nothing about the cable in question.

HDMI is the devil's work, and it's an emerging standard, which doesn't help matters.

For instance, the £10 HDMI cable from HMV could be a standard speed cable passing only the 1.3 Cat 1 HDMI test, thus supporting only 720p/1080i, not 1080p. For 1080p support, a cable would need to pass the high speed 1.3 Cat 2 test.
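For a rough sense of why 1080p is the dividing line, the pixel clocks of the common modes can be compared against the two HDMI 1.3 cable-test clocks (74.25 MHz for Category 1, 340 MHz for Category 2; the timings below are the standard CEA-861 figures including blanking):

```python
# Back-of-envelope TMDS pixel-clock check against the HDMI 1.3 cable categories.
# Pixel clock = total pixels per frame (including blanking) * frame rate.

CAT1_MHZ = 74.25   # "Standard" cable test clock
CAT2_MHZ = 340.0   # "High Speed" cable test clock

# (total_h, total_v, frames_per_second) per the CEA-861 timings
modes = {
    "720p60":  (1650, 750, 60),
    "1080i60": (2200, 1125, 30),   # interlaced: 30 full frames per second
    "1080p60": (2200, 1125, 60),
}

for name, (h, v, fps) in modes.items():
    clock_mhz = h * v * fps / 1e6
    verdict = "Cat 1 is enough" if clock_mhz <= CAT1_MHZ else "needs Cat 2"
    print(f"{name}: {clock_mhz:.2f} MHz -> {verdict}")
```

720p and 1080i both land exactly on the 74.25 MHz Cat 1 test clock; 1080p60 doubles it to 148.5 MHz, which is why a cheap cable can look fine right up until you feed it 1080p.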

Obviously check with another disc first, as you've probably got one and it's an easy test. If however the problem persists, then I'd start with the cable, as there is a LOT of misinformation out there.

ETA: Just seen that it's a Samsung, LED backlit (possibly). I've not come across degradation issues with Samsungs, only HDCP handshaking issues, which are common, but those would drop the picture entirely, not introduce artefacts. Their LCDs, however, aren't great with motion processing, so it may be a 'screen feature'.


Edited by Plotloss on Thursday 27th August 19:37

OldSkoolRS

6,832 posts

185 months

Thursday 27th August 2009
quotequote all
Davie_GLA said:
The tele is a Samsung QE42 something or other, and as far as set up goes no i haven't made any changes, should i be? Can someone walk me through the basic set up?
You might find that the brightness, contrast, colour and sharpness settings are not appropriate for your player. The default settings on many TVs have everything up to the max and all of the 'Advanced' settings on (many of which will actually make a good quality source worse, BTW). You need to find a test disc such as HD Digital Video Essentials on BluRay:

http://www.amazon.co.uk/Digital-Video-Essentials-B...

It will explain what you need to adjust and why. Essentially you need to set the brightness so that you have deep blacks without crushing dark greys down to black (or having black shown as dark grey), and the contrast so that the whites don't crush and full white test patterns remain 'white', not tinted red, blue or green. I'd also recommend that you turn off all the advanced features such as 'contrast enhance', 'super white' and 'black corrector', and set noise reduction/sharpness to off or '0'. If you find out the exact model number, you could read up on the appropriate 'Owners thread' on AVForums (you might find some recommended settings to start off with); searching starts here:

http://www.avforums.com/forums/plasma-televisions/

You need to get the basics right before worrying about 'better cables' and other possibilities. With the greatest respect to Plotloss: if you are getting an image on the screen that is not breaking up or showing coloured flashes, then I doubt changing the cable is going to resolve your issue.
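To picture what 'crushing' means, here's a toy levels transform (purely illustrative numbers, nothing specific to any particular TV): push the settings too far and distinct shadow detail collapses into a single value that no cable change can bring back.

```python
# Toy "levels" transform: why bad brightness/contrast settings lose detail.
# Roughly, gain ~ contrast and offset ~ brightness; the numbers are made up.

def apply_levels(pixel, gain, offset):
    """Map an 8-bit pixel value and clip to the displayable 0-255 range."""
    return max(0, min(255, round(pixel * gain + offset)))

shadows = [5, 10, 15, 20]   # four distinct near-black details in the source

# Brightness set far too low: everything clips to black ("black crush")
crushed = [apply_levels(p, gain=1.0, offset=-20) for p in shadows]
print(crushed)   # [0, 0, 0, 0] - four different values become one

# Sensible settings keep the steps distinct
ok = [apply_levels(p, gain=1.0, offset=0) for p in shadows]
print(ok)        # [5, 10, 15, 20]
```

The same clipping happens at the top end with contrast set too high, which is why a test disc walks you through both ends of the range.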

Edited by OldSkoolRS on Thursday 27th August 19:40

Plotloss

67,280 posts

276 months

Thursday 27th August 2009
quotequote all
How would contrast and brightness adjustments account for screen artefacts/sparkle?

If they did then there would be a fundamental flaw in the engineering of the panel and chassis as over time artefacts would be introduced as the backlighting dimmed.

This is either a source issue (a poor quality transfer) or a signal issue (interference or a poorly specified cable).

headcase

2,389 posts

223 months

Thursday 27th August 2009
quotequote all
OldSkoolRS said:
Davie_GLA said:
The tele is a Samsung QE42 something or other, and as far as set up goes no i haven't made any changes, should i be? Can someone walk me through the basic set up?
You might find that the brightness, contrast, colour and sharpness settings are not appropriate for your player. The default settings on many TVs have everything up to the max and all of the 'Advanced' settings on (many of which will actually make a good quality source worse, BTW). You need to find a test disc such as HD Digital Video Essentials on BluRay:

http://www.amazon.co.uk/Digital-Video-Essentials-B...

It will explain what you need to adjust and why. Essentially you need to set the brightness so that you have deep blacks without crushing dark greys down to black (or having black shown as dark grey), and the contrast so that the whites don't crush and full white test patterns remain 'white', not tinted red, blue or green. I'd also recommend that you turn off all the advanced features such as 'contrast enhance', 'super white' and 'black corrector', and set noise reduction/sharpness to off or '0'. If you find out the exact model number, you could read up on the appropriate 'Owners thread' on AVForums (you might find some recommended settings to start off with); searching starts here:

http://www.avforums.com/forums/plasma-televisions/

You need to get the basics right before worrying about 'better cables' and other possibilities.
A quick google doc to do the basics, http://docs.google.com/Doc?docid=0AQwyoj0hghSKZGZu... .

My experience of HDMI is that cable quality counts at the higher resolutions, BUT a poor cable doesn't really affect picture quality (not to the extent where I have noticed it, anyway). Generally a poor cable will work fine at a lower resolution but will start with intermittent handshake issues, pixelation, blocking etc. when you exceed what the cable will do. And obviously this is multiplied by cable length.

As for PS3 setup, the latest ones actually do it themselves, but it's worth checking in your PS3 picture menu that the output resolution is set to 1080p.

I too have noticed a graininess on certain Blu-ray films which looks like it has been 'mastered in' to me.

It reminds me of a passage written into an old Grundig VCR service manual 'Due to the high quality nature of the components used, you will see noise bars on the picture' biggrin

Edited by headcase on Thursday 27th August 19:52

OldSkoolRS

6,832 posts

185 months

Thursday 27th August 2009
quotequote all
Plotloss said:
How would contrast and brightness adjustments account for screen artefacts/sparkle?

If they did then there would be a fundamental flaw in the engineering of the panel and chassis as over time artefacts would be introduced as the backlighting dimmed.

This is either a source issue (poor quality transfer) or a signal issue (intereference or poor specification cable)
The OP complained of a grainy image (not sparkles, IIUC) when watching Blu-rays, which could be caused by a number of factors: the specific disc, TV settings, player settings, his expectations and, as you state, the cable. As he has stated that he hasn't adjusted the TV at all, I would suggest this as a starting point; it won't cost him anything (unless he pays to have it calibrated, of course) and it may well improve the image to his satisfaction, especially judging by the out-of-box settings some TVs have: just having the brightness set too high could wash out shadow detail and exaggerate noise in the picture, for example.

Or, as another poster suggested, the particular Blu-ray was a poor example...

Davie_GLA

Original Poster:

6,646 posts

205 months

Friday 28th August 2009
quotequote all
OK. Some great info; I've saved the Google doc and will try that.

I will get the exact TV model and look it up on AVForums, as well as post it on here.

Thanks for all the replies, it's very much appreciated.

D.

derestrictor

18,764 posts

267 months

Saturday 29th August 2009
quotequote all
It's a combo of disc quality, TV (and its settings) and viewing distance.

You can analyse this all you want, but the software (be it broadcast or stored data) is primarily the biggest cause of PQ woes.

Having said that, budget LCDs are far from the ideal route to ameliorating such effects - no HDMI cable will counter such fundamental issues.




WeirdNeville

5,998 posts

221 months

Monday 31st August 2009
quotequote all
I would say it's most likely your TV setup.
It may well still be on its "Shop" settings, which are very vibrant and overwrought to wow the punters.
Try playing a little with the contrast and colour settings to tone it down a bit, and also sharpness. Sharper isn't always better! It can "over edge" and lead to artefacts and graininess. My TV on "vibrant" looks pretty horrid; it's far too bright and edged and makes live action look cartoony. Soften it down a bit and it's beautiful though. I have a "cinema" setting which is actually very subdued, and a "normal" one for games and TV.

People who immediately say "HDMI - buy a more expensive cable": please, do a blind test. There may be a measurable difference in cable quality, but one of the joys of digital is that this doesn't lead to a loss in image quality. You've been sucked in by marketing.

For the record, I have tried £100 5-metre cables but have settled on a £10 10-metre cable. It's perfect!

scorp

8,783 posts

235 months

Monday 31st August 2009
quotequote all
marctwo said:
Digital signals have error correction. Contrary to popular belief, loss of data does not equate to no picture. If some data is missing/corrupted then you still get a picture but you'll get some noticeable degradation and artefacts. A better cable will mean less missing/corrupted data and less error correction = better picture.
I thought HDMI didn't have error correction? It has error detection afaik, so it will probably drop data packets it knows got damaged. Still, I wouldn't have thought you'd get grain like you would with analogue - more missing parts of the picture and sound, or random blanking out.
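A toy model of that point (purely illustrative, not the actual TMDS coding): on an unprotected digital link, each raw bit error corrupts one pixel outright while the rest arrive perfectly, which looks like isolated specks rather than the uniform, film-like grain of analogue noise.

```python
# Toy model: a digital video link with no error correction. A flat grey frame
# is sent; a few raw bit errors each corrupt exactly one pixel. The result is
# a handful of bright/dark specks ("sparkle"), not an even wash of grain.

WIDTH, HEIGHT = 16, 4
frame = [[128] * WIDTH for _ in range(HEIGHT)]   # uniform mid-grey source

# Hypothetical bit errors in transit: (x, y, bit position in the 8-bit value)
bit_errors = [(3, 1, 7), (10, 2, 0), (14, 0, 6)]

for x, y, bit in bit_errors:
    frame[y][x] ^= 1 << bit   # one flipped bit -> one visibly wrong pixel

hit = [(x, y) for y in range(HEIGHT) for x in range(WIDTH) if frame[y][x] != 128]
print(hit)          # [(14, 0), (3, 1), (10, 2)] - three specks, rest untouched
print(frame[1][3])  # 0: the top bit flipped, so mid-grey became black
```

Analogue noise, by contrast, perturbs every pixel a little, which is why it reads as grain across the whole image rather than a few wrong dots.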