So, how ‘good’ is the picture on a 1080p screen versus the older 720p and the even older, original analog Australian TV signal?
Some simple maths.
- Original analog TV was 576 visible lines. At a 4:3 picture ratio, the width in pixels (px) would be 576 × 4/3, which is 768 (but see the footnote)
- 720p (720 rows of pixels) at 16:9 is 720 × 16/9, which is 1280 pixels wide, i.e. 1280 columns
- 1080p is also 16:9, so 1920 pixels wide
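The width sums above can be sketched in a few lines of Python (a rough illustration only; `width` is just a helper name I've made up):

```python
# Width in pixels from the row count and the aspect ratio
def width(rows, aspect_w, aspect_h):
    return rows * aspect_w // aspect_h

print(width(576, 4, 3))    # 768
print(width(720, 16, 9))   # 1280
print(width(1080, 16, 9))  # 1920
```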
Some more simple maths:

| Type | Rows | Columns | Rows × Columns (total pixels) |
| --- | --- | --- | --- |
| Analog | 576 | 768 | 442,368 |
| 720p | 720 | 1280 | 921,600 |
| 1080p | 1080 | 1920 | 2,073,600 |
So, as a nice rule of thumb, each jump more than doubles the resolution – the number of pixels – of the previous format.
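You can check the rule of thumb directly (a quick sketch; the figures match the table above):

```python
# Total pixels per format, and the size of each jump
formats = {"Analog": (576, 768), "720p": (720, 1280), "1080p": (1080, 1920)}
totals = {name: rows * cols for name, (rows, cols) in formats.items()}

print(totals)  # {'Analog': 442368, '720p': 921600, '1080p': 2073600}
print(totals["720p"] / totals["Analog"])  # ~2.08
print(totals["1080p"] / totals["720p"])   # 2.25
```

Both jumps come out comfortably above 2×.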
Some quick comments, then a footnote:
- There weren’t that many actual 720p TVs, at least here in Australia. Most were 1366 × 768, very much a PC screen resolution – nicely giving away the game that they were essentially big PC screens. But the ‘double’ rule is good enough here too: 1,049,088 px total, which is almost exactly half of the 1080p figure, and that’ll do me
- 1080p is sold as High Definition, but it’s barely 2 megapixels (!). If I’d tried to sell you a camera in 2010 as being 2 MP, you’d probably have laughed at me
Footnote: For technical reasons to do with the pixel shapes not being square, the equivalent digital resolution of our old TV is actually a bit less than this: 576 × 720 pixels (414,720 px total). That makes the double rule slightly more obvious.
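For completeness, the footnote’s version of the sums (again just a sketch in Python):

```python
# Analog TV with non-square pixels: 576 rows x 720 columns
analog = 576 * 720
print(analog)            # 414720
print(921_600 / analog)  # ~2.22, so the 720p jump is even more clearly a doubling
```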