To show moving images, a television has to swap out the pixels displayed on the screen. This is what we mean when we say the television refreshes the image -- it has to redraw the picture so quickly that the human eye can't detect the process. If televisions didn't refresh the pixels, they could only display a still image. That's not good TV.
The standard television refresh rate in North America is 60 hertz (in Europe and much of the rest of the world it's 50 hertz). That means the screen displays an image 60 times every second. An interlaced television splits each refresh into two passes: it redraws the odd-numbered lines in one pass and the even-numbered lines in the next, so each set of lines refreshes 30 times a second in an alternating pattern. Even at this rate, we don't notice the screen refreshing because it's too fast for us to detect.
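The interlacing arithmetic can be sketched in a few lines of Python. The field rate and line count below are illustrative assumptions (60 fields per second, 480 visible lines), not figures from this article:

```python
FIELD_RATE_HZ = 60      # fields (half-frames) drawn per second, NTSC-style
VISIBLE_LINES = 480     # assumed number of visible scan lines, for illustration

def field_lines(field_index, total_lines=VISIBLE_LINES):
    """Return the scan lines drawn during one field.

    Even-numbered fields draw the odd lines (1, 3, 5, ...); odd-numbered
    fields draw the even lines (2, 4, 6, ...). Because the two sets
    alternate, each set of lines is redrawn only half as often as the
    field rate.
    """
    start = 1 if field_index % 2 == 0 else 2
    return list(range(start, total_lines + 1, 2))

odd_lines = field_lines(0)    # lines 1, 3, 5, ... 479
even_lines = field_lines(1)   # lines 2, 4, 6, ... 480

# Each set of lines refreshes at half the field rate:
per_set_rate = FIELD_RATE_HZ / 2   # 30 times per second
```

Running this confirms the figure in the text: each half of the picture is redrawn 30 times a second, while the screen as a whole shows a new field 60 times a second.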
Early LCD high-definition televisions had great resolution but experienced some problems when displaying fast-moving images on screen. Action movies and sporting events in particular gave early LCD sets problems. The images tended to blur as they moved across the screen. Plasma screens didn't have the same problem, giving that format the advantage when it came to high-speed television content.
The solution to the LCD problem was to increase the refresh rate. A few years ago, the first 120 hertz sets demonstrated to consumers that doubling the refresh rate could reduce the blurring effect. By early 2009, sets with a 240 hertz or higher refresh rate were either on store shelves or scheduled for release.
A higher refresh rate simply means the television redraws the screen more often each second. Whether the faster rate has a noticeable effect on the viewer's experience is subjective. A viewer may not be able to tell the difference between a set refreshing at 120 hertz and one with a 240 hertz refresh rate.
Now let's take a quick look at the relationship between an HDTV set's refresh rate and film.