Whether or not the broadcasters are going HD, TV sets and projectors that claim to support higher resolutions than regular telly are arriving thick and fast. But HDTV is far from a single standard – it encompasses two different resolutions, two different scanning modes, and a number of different frame rates. In this article, we present a guide to what all the terms actually mean.
All lined up
Although you might come across ‘480p’ quoted as a form of HDTV with some plasma screens, the International Telecommunication Union Radiocommunication Sector (ITU-R, the standards body for telecommunications) defines HDTV as 720 or more lines of vertical resolution – so 480p doesn’t qualify. In fact, there are two basic HDTV resolutions: 720 lines and 1080 lines. The corresponding horizontal resolutions are 1280 and 1920 pixels respectively, although some non-standard HD formats, such as the HDV system used in camcorders, use lower horizontal resolutions (ie 1440 x 1080).
These two basic HDTV resolutions – 1280 x 720 and 1920 x 1080 – are standard across the world, although some countries favour one over the other for broadcasting. In contrast, regular, or ‘standard definition’, TV (SDTV) operates at 720 x 576 in Europe, and 720 x 480 in the US. So HDTV offers at least twice as many pixels as SDTV, or as many as five times as many at its top resolution. This is a big leap, but it’s not the only change in store with HDTV.
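The pixel arithmetic behind those ‘twice’ and ‘five times’ figures is easy to check for yourself – here’s a throwaway Python sketch using the resolutions quoted above:

```python
# Pixel counts for SDTV and the two basic HDTV resolutions (figures from the article).
sd_pixels = 720 * 576                      # European standard-definition TV
formats = {
    "SDTV (US)": 720 * 480,
    "720p HDTV": 1280 * 720,
    "1080 HDTV": 1920 * 1080,
}
for name, pixels in formats.items():
    print(f"{name}: {pixels:,} pixels ({pixels / sd_pixels:.1f}x European SD)")
# 720p works out at roughly 2.2x European SD, and 1080-line HD at exactly 5.0x.
```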
In the computer world, we’ve got used to there being just one scanning mode for screens – progressive, where pixel updates start at the top left of the screen and scan down line by line, left to right. But TV standards were created decades ago, before LCD panels were invented, when CRTs were the norm. When TV first arrived, the electron beam that writes the image onto the phosphor dots inside a CRT couldn’t be scanned very fast.
In order to cover enough ground for a high-resolution image at a fast enough frame rate, the beam had to split each frame into two halves, called fields, and scan them one after the other. Each field contains half the lines in the image – one the odd-numbered lines, the other the even-numbered ones – and the two are interleaved on screen, alternating line by line down the image. This is called interlacing, and all standard-definition TV transmissions still use it as a legacy of those early years, even though TV technology no longer requires it.
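The split-and-interleave idea can be illustrated with a toy Python sketch (lists of scan lines standing in for real video data):

```python
def split_into_fields(frame):
    # One field takes lines 0, 2, 4, ... and the other takes lines 1, 3, 5, ...
    return frame[0::2], frame[1::2]

def weave_fields(top, bottom):
    # Interleave the two fields back into a full frame, alternating line by line.
    frame = []
    for a, b in zip(top, bottom):
        frame.extend([a, b])
    return frame

frame = [f"line {n}" for n in range(1080)]
top, bottom = split_into_fields(frame)
assert len(top) == len(bottom) == 540        # 1080i: two 540-line fields
assert weave_fields(top, bottom) == frame    # weaving restores the original frame
```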
Strangely, interlacing has persisted into HDTV. Modes which use it are denoted by an ‘i’, so 1080i is an interlaced HDTV format with its 1080 lines scanned as two 540-line fields. In contrast, 720p is a progressively scanned format with 720 lines of resolution; there is no interlaced 720-line format in wide use. The pinnacle of HDTV quality is therefore 1080p – 1080 lines of resolution, progressively scanned. But there is one more factor to consider with HDTV.
Although the resolutions of HDTV are now the same worldwide, the frame rates still aren’t, as these are based on the frequency of AC power and look set to remain that way for the foreseeable future. In the UK, AC power cycles at 50Hz, so our TV operates at 25 frames per second (made up of 50 fields for standard-definition TV). In the US, AC power cycles at a nominal 60Hz (the exact figure is 59.94Hz), so TV operates at a nominal 30 frames per second – strictly 29.97 – in America.
This gives you a range of variations on the basic three HDTV modes of 720p, 1080i and 1080p, based on the frame or field rate, which is denoted by a number at the end. So 720p25 is 25 frames at 1280 x 720 progressively scanned, but 1080i50 is 25 frames at 1920 x 1080 divided into 50 interlaced fields. Since one of the reasons for going HD is to watch movies in a format more like the cinema, HDTV is also available with 24 frames per second, again progressively scanned. This gives you 1080p24, and is a format you might come across with some HD optical formats.
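The naming scheme described above – lines, then scanning mode, then a trailing rate that counts fields for interlaced modes and frames for progressive ones – can be captured in a small Python decoder (a toy parser for illustration only):

```python
import re

def describe(mode):
    """Decode an HDTV mode string such as '1080i50' or '720p25'."""
    m = re.fullmatch(r"(\d+)([ip])(\d+)", mode)
    lines, scan, rate = int(m.group(1)), m.group(2), int(m.group(3))
    if scan == "i":
        # For interlaced modes the trailing number counts fields;
        # two fields make up one frame.
        return f"{lines} lines, {rate} fields/s = {rate // 2} frames/s, interlaced"
    return f"{lines} lines, {rate} frames/s, progressive"

print(describe("1080i50"))  # 1080 lines, 50 fields/s = 25 frames/s, interlaced
print(describe("720p25"))   # 720 lines, 25 frames/s, progressive
```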
The very pinnacle of HDTV is where a full frame is scanned for every cycle of AC power – 50 frames per second in Europe, or 60 in the US and Japan – which results in much smoother motion. If your TV supports 1080p50, it can display 50 progressively scanned frames per second. However, the circuitry needed to handle this amount of video information is expensive, making such TVs prohibitively costly. At the time of writing, no broadcasters are planning to use this format, nor will it provide any improvement to films shot at 24 frames per second.
If you look at the specifications for your HDTV, you might be surprised to see it has a native resolution along the lines of 1366 x 768, which doesn’t fit any of the formats we’ve described so far. However, the TV will contain scaling circuitry which allows it to scale up a 1280 x 720 image, or scale down one at 1920 x 1080 – although usually only if the latter is interlaced (ie 1080i). The results can be good, but never as good as having a real screen pixel for every pixel in the image. So, for the best results with 1080i or 1080p material, an HDTV with a native 1920 x 1080 resolution is optimal.
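Why a non-native panel can never show every source pixel becomes obvious with a crude resampling sketch – this uses nearest-neighbour selection purely for illustration (real TV scalers use far more sophisticated filtering):

```python
def scale_line(src, dst_width):
    # Nearest-neighbour resample of one scan line: each output pixel
    # picks the closest source pixel, so when dst_width < len(src)
    # some source pixels are simply never shown.
    src_width = len(src)
    return [src[i * src_width // dst_width] for i in range(dst_width)]

hd_line = list(range(1920))                 # one 1920-pixel HD scan line
panel_line = scale_line(hd_line, 1366)      # squeezed onto a 1366-pixel panel
assert len(panel_line) == 1366
# 1920 - 1366 = 554 source pixels per line are discarded by this crude scaler.
```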