The era of high-definition television is in full swing. A few years ago, only early adopters owned HDTV sets, and they had very little access to high-definition content. Today, HDTV sets line store shelves and fill Amazon's warehouses. Prices have fallen, and many families find that an HDTV fits their budget. But much about HDTV technology remains a mystery to the average consumer.
The two dominant technologies on the market right now are LCD sets and plasma screens. Each had its own advantages and disadvantages when it first became available. Today, technological advances have all but erased the differences in performance between them.
When shopping for an HDTV, you'll notice a lot of odd terms and numbers. You may see that the television's resolution is either 720 or 1080 -- that number refers to how many horizontal lines of pixels the screen displays, from top to bottom. A 720 set typically draws 1,280 pixels across each of its 720 lines, while a 1080 set draws 1,920 pixels across each of its 1,080 lines. In general, more pixels mean a sharper, clearer picture. But a 1080 resolution isn't always necessary -- on television sets 32 inches or smaller, you may not notice a difference.
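To put those numbers in perspective, here's a quick back-of-the-envelope calculation (using the standard 1,280 x 720 and 1,920 x 1,080 HD frame dimensions) showing how many pixels each format draws per frame:

```python
# Standard HD frame dimensions (width x height, in pixels)
pixels_720 = 1280 * 720    # pixels in one 720-line frame
pixels_1080 = 1920 * 1080  # pixels in one 1080-line frame

print(pixels_720)                        # 921600
print(pixels_1080)                       # 2073600
print(round(pixels_1080 / pixels_720, 2))  # 2.25
```

In other words, a 1080 picture packs 2.25 times as many pixels as a 720 picture -- a difference your eyes can pick up on a large screen, but not necessarily on a small one.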
To make matters more confusing, there's the issue of progressive scan versus interlacing. When HDTV sets debuted, your two options were a 720p (progressive scan) set or a 1080i (interlaced) set. Later, manufacturers began to offer 1080p sets, leaving consumers to wonder which was better. Progressive scan delivers a sharper picture -- the television draws every line of pixels in sequence, top to bottom, for each frame. Interlacing instead alternates between the odd lines of pixels and the even lines, redrawing only half the screen on each pass -- but switching so quickly that the viewer sees a cohesive image.
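The difference between the two scan methods can be sketched in a few lines of code. This is purely illustrative (a toy six-line "frame," not any real video API): an interlaced signal carries only the even lines in one field and the odd lines in the next, and weaving the two fields back together recovers the same full frame that progressive scan draws in a single pass.

```python
# A toy "frame" of six horizontal lines of pixels
frame = ["line0", "line1", "line2", "line3", "line4", "line5"]

# Progressive scan: every line is drawn in order, once per frame
progressive = list(frame)

# Interlaced scan: each pass (field) carries only half the lines
even_field = frame[0::2]  # lines 0, 2, 4
odd_field = frame[1::2]   # lines 1, 3, 5

# Weaving the two fields back together rebuilds the full picture;
# on a real screen this alternation happens too fast to notice
woven = [None] * len(frame)
woven[0::2] = even_field
woven[1::2] = odd_field

assert woven == progressive
```

The trade-off this illustrates: interlacing sends half as much data per pass, which is why 1080i could fit more lines into the same broadcast bandwidth, while progressive scan redraws everything every time for a smoother, sharper result.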
At first, screen technology, resolution and television size were the primary factors shoppers needed to consider as they looked for a good HDTV. But now there's another element to consider: the refresh rate. A television's refresh rate refers to how many times per second the television redraws the image on the screen. But why is that important?