Plasma - Will I really notice the difference
Discussion
Hi all, after much deliberation I have narrowed my selection down to 3 plasmas. I have found a Panny TXP50x10b for £739.01, or a TX50g10b and TX50Pz80b both for £1179.01 (although I can't seem to see the difference between the last two), all on the DER site (these guys seem to be much recommended on here).
The difference in price seems to be that the more expensive sets are Full HD (1080p) whereas the cheaper one is HD Ready (1080i). I think I have got that right.
My question is: as I will be using the set for Sky HD, PS3 and Blu-ray, and will be sitting approx. 12ft from the screen, is the extra £440 really going to make a difference? I do want to make sure that I am "future proofed", so would welcome any advice.
Cheers
Sky doesn't pump out 1080p yet, so on that no, you won't notice the difference.
I'd probably get the cheaper one now and in a few years consider going to 1080p when they're half the price and TV chucks out video at that quality and relegate the old one to another room. Disks and games wouldn't be enough to justify 1080p to me, but your usage (and wallet!) may be different.
Edited by cs02rm0 on Sunday 17th May 17:03
cs02rm0 said:
Sky doesn't pump out 1080p yet, so on that no, you won't notice the difference.
I'd probably get the cheaper one now and in a few years consider going to 1080p when they're half the price and TV chucks out video at that quality and relegate the old one to another room. Disks and games wouldn't be enough to justify 1080p to me, but your usage (and wallet!) may be different.
So would the quality of games/discs be that much inferior on 1080i? Would it be a noticeable difference at that distance?
Steve H said:
cs02rm0 said:
Sky doesn't pump out 1080p yet, so on that no, you won't notice the difference.
I'd probably get the cheaper one now and in a few years consider going to 1080p when they're half the price and TV chucks out video at that quality and relegate the old one to another room. Disks and games wouldn't be enough to justify 1080p to me, but your usage (and wallet!) may be different.
So would the quality of games/discs be that much inferior on 1080i? Would it be a noticeable difference at that distance?
On a 50" screen sitting 12' away, you would not be able to tell the difference between 720p and 1080p. The difference only begins to become noticeable on that size panel at 10', so I recommend going for the HD Ready 720p panel.
If you plan to plug your PC into it, however (which you haven't mentioned, but for thoroughness I thought I'd cover it), then go for the 1080p set.
HTH
allgonepetetong said:
On a 50" screen sitting 12' away, you would not be able to tell the difference between 720p and 1080p. The difference only begins to become noticeable on that size panel at 10', so I recommend going for the HD Ready 720p panel.
If you plan to plug your PC into it however, which you haven't mentioned but for thoroughness I thought I would, then go for the 1080p set.
HTH
Thanks for all of the help, I will not be using it for PC and so I may just be saving myself some money. Am I right in thinking that a 1080i is actually a 720p panel? (It does get confusing)
Cheers
Steve H said:
allgonepetetong said:
On a 50" screen sitting 12' away, you would not be able to tell the difference between 720p and 1080p. The difference only begins to become noticeable on that size panel at 10', so I recommend going for the HD Ready 720p panel.
If you plan to plug your PC into it however, which you haven't mentioned but for thoroughness I thought I would, then go for the 1080p set.
HTH
Thanks for all of the help, I will not be using it for PC and so I may just be saving myself some money. Am I right in thinking that a 1080i is actually a 720p panel? (It does get confusing)
Cheers
I still maintain that to display 1080i properly you need a 1920x1080 screen; anything less requires the TV to modify the picture to fit.
Steve H said:
Thanks for all of the help, I will not be using it for PC and so I may just be saving myself some money. Am I right in thinking that a 1080i is actually a 720p panel? (It does get confusing)
1080i and 720p are signal resolutions, not display resolutions. Older TVs were only able to handle those signal resolutions, but current models (for the past 2-3 years) have been able to accept a Blu-ray player's 1080p output, and more recently 24Hz/fps playback to eradicate motion judder.
Standard HD plasma TVs are typically 1366x768 at 50"+, 1280x768 at 42", and 1280x720 at 37".
LCDs have been 1366x768 almost from day one.
Full HD is 1920x1080, but contrary to what Flossy suggests, a 1080i signal on either type will be scaled up or down due to being an effective 810 lines of resolution when deinterlaced.
But there aren't many instances where setting the Sky/Virgin HD box to that output is preferable to 720p, and it certainly isn't for a Blu-ray movie or the PS3/360 consoles.
Edited by PJ S on Tuesday 19th May 16:24
PJ S said:
Full HD is 1920x1080, but contrary to what Flossy suggests, a 1080i signal on either type will be scaled up or down due to being an effective 810 lines of resolution when deinterlaced.
Could you please point me in the direction of the real specification of 1080i? If each frame has 540 lines, why does deinterlacing lose so many?
FlossyThePig said:
PJ S said:
Full HD is 1920x1080, but contrary to what Flossy suggests, a 1080i signal on either type will be scaled up or down due to being an effective 810 lines of resolution when deinterlaced.
Could you please point me in the direction of the real specification of 1080i? If each frame has 540 lines, why does deinterlacing lose so many?
1080p is pretty much a waste of time. All of the HD content you will watch on Sky HD, for example, will be produced at 1080i; 1080p baseband video for live sports is still some way off.
So any processing between I and P will be done by the set, when the actual footage is made in I, and I can't imagine that anything in a £1500 set is going to do a very good job of real-time standards conversion. There is a reason why we try to keep everything at the same level/spec/resolution at the point of production: even with the kit that we use, the difference is noticeable.
So, from my point of view, 720p or 1080i: if I sit close enough and watch a lot of native 1080i footage (Blu-ray etc) I'd plump for 1080i. The argument between I and P is a waste of time in my eyes; there is no reason why P is worth a penny more.
PJ S said:
FlossyThePig said:
PJ S said:
Full HD is 1920x1080, but contrary to what Flossy suggests, a 1080i signal on either type will be scaled up or down due to being an effective 810 lines of resolution when deinterlaced.
Could you please point me in the direction of the real specification of 1080i? If each frame has 540 lines, why does deinterlacing lose so many?
PJ S on the other thread said:
Signal resolutions: 1280x720p and 1920x1080i
In the case of the former, each frame of the picture content has 1280 x 720 (921,600) pixels of information
In the case of the latter, it's 1920 x 1080 x 0.5 (1,036,800) pixels of info
Dividing the smaller into the larger gives 1.125, which when multiplied by 720 gives 810.
This is the EFFECTIVE resolution a 1080i signal presents when deinterlaced properly/best method.
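The quoted arithmetic can be sanity-checked in a few lines of Python. This is a sketch that takes the post's own simplification at face value (each 1080i frame carrying half the pixel information of a full 1920x1080 frame); it isn't taken from any formal broadcast specification:

```python
# Sketch of the "effective resolution" arithmetic from the post above.
# Assumption (the post's, not a spec): a 1080i frame carries half the
# pixel information of a full 1920x1080 progressive frame.

pixels_720p = 1280 * 720          # 921,600 pixels per 720p frame
pixels_1080i = 1920 * 1080 // 2   # 1,036,800 pixels of info per 1080i frame

ratio = pixels_1080i / pixels_720p   # 1.125
effective_lines = ratio * 720        # the "effective" line count

print(ratio, effective_lines)  # 1.125 810.0
```

So on this reckoning a 1080i signal carries only about 12.5% more information per frame than 720p, which is where the "effective 810 lines" figure comes from.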
I've no idea what a lower resolution display does to the frames before displaying them. I'm sure different manufacturers use different algorithms probably depending on the processing power available.
FlossyThePig said:
PJ S said:
FlossyThePig said:
PJ S said:
Full HD is 1920x1080, but contrary to what Flossy suggests, a 1080i signal on either type will be scaled up or down due to being an effective 810 lines of resolution when deinterlaced.
Could you please point me in the direction of the real specification of 1080i? If each frame has 540 lines, why does deinterlacing lose so many?
PJ S on the other thread said:
Signal resolutions: 1280x720p and 1920x1080i
In the case of the former, each frame of the picture content has 1280 x 720 (921,600) pixels of information
In the case of the latter, it's 1920 x 1080 x 0.5 (1,036,800) pixels of info
Dividing the smaller into the larger gives 1.125, which when multiplied by 720 gives 810.
This is the EFFECTIVE resolution a 1080i signal presents when deinterlaced properly/best method.
I've no idea what a lower resolution display does to the frames before displaying them. I'm sure different manufacturers use different algorithms probably depending on the processing power available.
If you bothered to check the links I went to the trouble of posting in the last post of that thread, you can read in more detail about the hows and whys of 1080i being not a whole heap more resolution than 720p. If you think about it for a moment, that explains why the broadcasters adopted it as a format: the transmission costs aren't much more than broadcasting 720p.
As you'll see from those links, 1080p requires a serious amount of extra bandwidth and data rate, hence the likelihood of it remaining a disc-based medium for quite a long while yet, even if the Japanese are trialling Ultra HD or Super HD at 3840x2160 or something like that.
I appreciate it is hard for people to get their head around how an interlaced version is vastly reduced in effective resolution once deinterlaced, but it's not hard when you understand the principle used to deinterlace.
I don't profess to fully understand it myself, but those who do have explained why, and once explained it makes perfect sense why broadcasters accepted it as a feasible option.
I think because the numbers are the same, with only a letter's difference, people think it's a cheap way of getting Full HD, when in fact it's nothing like that, and it favours the broadcasters more than the consumer.
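A rough way to see the bandwidth point is to compare raw pixel throughput per second. This is illustrative only: real broadcast bit rates depend on the codec, chroma subsampling and compression, none of which is modelled here, and the 50Hz field/frame rates are assumed for UK broadcast:

```python
# Raw (uncompressed) pixel throughput for common HD formats.
def pixels_per_second(width, height, rate, interlaced=False):
    """Pixels delivered per second; 'rate' is fields/s if interlaced."""
    per_image = width * height // 2 if interlaced else width * height
    return per_image * rate

p720 = pixels_per_second(1280, 720, 50)           # 720p50
i1080 = pixels_per_second(1920, 1080, 50, True)   # 1080i50
p1080 = pixels_per_second(1920, 1080, 50)         # 1080p50

print(i1080 / p720)   # 1.125 -> 1080i only ~12.5% more raw data than 720p
print(p1080 / i1080)  # 2.0   -> 1080p50 doubles the raw data of 1080i50
```

On these raw numbers, 1080i costs broadcasters little more than 720p, while full 1080p50 would double the data before compression even starts, which is consistent with it staying a disc-based format for now.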
theboyfold said:
from my point of view, 720p or 1080i, if I sit close enough and watch a lot of native 1080i footage (blue-ray etc) I'd plump for 1080i. The argument between I and P is a waste of time in my eyes, there is no reason why P is worth a penny more.
You are getting your p and i mixed up: Blu-ray is 1080p; broadcast high-definition TV is 1080i/720p.
I don't believe that you would be able to tell the difference between a 1080i and a 720p picture. 1080p, however, when viewed at the correct distance for the given panel size, is stunning.
Edited by allgonepetetong on Wednesday 20th May 11:59
Steve H said:
I am more confused now than when I started
The G10 is a 1080p set; it's essentially the new version of the PZ81 (Freesat). The S10 is 1080p without the Freesat.
The X10 is 1080i and the equivalent of the old PX80.
1080p will not be applicable to 95% of television users.
So it's a choice of Freesat or not, essentially. If you're not going to use Freesat, get the X10.
Gassing Station | Home Cinema & Hi-Fi