When CDs were first introduced in the early 1980s, their single purpose in life was to hold music in a digital format. In order to understand how a CD works, you need to first understand how digital recording and playback works and the difference between analog and digital technologies.
In this article, we will examine analog and digital recording so that you have a complete understanding of the difference between the two techniques.
Thomas Edison is credited with creating the first device for recording and playing back sounds in 1877. His approach used a very simple mechanism to store an analog wave mechanically. In Edison's original phonograph, a diaphragm directly controlled a needle, and the needle scratched an analog signal onto a tinfoil cylinder:
You spoke into Edison's device while rotating the cylinder, and the needle "recorded" what you said onto the tin. That is, as the diaphragm vibrated, so did the needle, and those vibrations impressed themselves onto the tin. To play the sound back, the needle moved over the groove scratched during recording. During playback, the vibrations pressed into the tin caused the needle to vibrate, causing the diaphragm to vibrate and play the sound.
This system was improved by Emil Berliner in 1887 to produce the gramophone, which is also a purely mechanical device using a needle and diaphragm. The gramophone's major improvement was the use of flat records with a spiral groove, making mass production of the records easy. The modern phonograph works the same way, but the signals read by the needle are amplified electronically rather than directly vibrating a mechanical diaphragm.
What is it that the needle in Edison's phonograph is scratching onto the tin cylinder? It is an analog wave representing the vibrations created by your voice. For example, here is a graph showing the analog wave created by saying the word "hello":
This waveform was recorded electronically rather than on tinfoil, but the principle is the same. What this graph is showing is, essentially, the position of the microphone's diaphragm (Y axis) over time (X axis). The vibrations are very quick -- the diaphragm is vibrating on the order of 1,000 oscillations per second. This is the sort of wave scratched onto the tinfoil in Edison's device. Notice that the waveform for the word "hello" is fairly complex. A pure tone is simply a sine wave vibrating at a certain frequency, like this 500-hertz wave (500 hertz = 500 oscillations per second):
You can see that the storage and playback of an analog wave can be very simple -- scratching onto tin is certainly a direct and straightforward approach. The problem with the simple approach is that the fidelity is not very good. For example, when you use Edison's phonograph, there is a lot of scratchy noise stored with the intended signal, and the signal is distorted in several different ways. Also, if you play a record repeatedly, it eventually wears out -- each time the needle passes over the groove, it changes the groove slightly (and eventually erases it).
In a CD (and any other digital recording technology), the goal is to create a recording with very high fidelity (very high similarity between the original signal and the reproduced signal) and perfect reproduction (the recording sounds the same every single time you play it no matter how many times you play it).
To accomplish these two goals, digital recording converts the analog wave into a stream of numbers and records the numbers instead of the wave. The conversion is done by a device called an analog-to-digital converter (ADC). To play back the music, the stream of numbers is converted back to an analog wave by a digital-to-analog converter (DAC). The analog wave produced by the DAC is amplified and fed to the speakers to produce the sound.
The analog wave produced by the DAC will be the same every time, as long as the numbers are not corrupted. The analog wave produced by the DAC will also be very similar to the original analog wave if the analog-to-digital converter sampled at a high rate and produced accurate numbers.
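This round trip -- analog wave in, numbers stored, analog wave back out -- can be sketched in a few lines of Python. This is only an illustration of the idea; the function names, the 500-hertz test tone and the mapping of values to integer levels are all invented for this sketch, not taken from any real CD hardware:

```python
import math

def adc(signal, sample_rate, levels, duration):
    """Sample a continuous signal and quantize each sample to an integer.

    `signal` is a function of time (in seconds) returning values in [-1, 1].
    """
    samples = []
    for i in range(int(sample_rate * duration)):
        t = i / sample_rate
        v = signal(t)                                       # analog value
        samples.append(round((v + 1) / 2 * (levels - 1)))   # nearest of 0..levels-1
    return samples

def dac(samples, levels):
    """Turn the stored integers back into analog values in [-1, 1]."""
    return [s / (levels - 1) * 2 - 1 for s in samples]

# A 500 Hz pure tone, like the sine wave mentioned earlier in the article
tone = lambda t: math.sin(2 * math.pi * 500 * t)

stored = adc(tone, sample_rate=44100, levels=65536, duration=0.001)
rebuilt = dac(stored, levels=65536)
```

With 65,536 levels, each rebuilt sample can differ from the original by at most half a gradation, which is why the numbers "are" the music for all practical purposes.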
You can understand why CDs have such high fidelity if you understand the analog-to-digital conversion process better. Let's say you have a sound wave, and you wish to sample it with an ADC. Here is a typical wave (assume here that each tick on the horizontal axis represents one-thousandth of a second):
When you sample the wave with an analog-to-digital converter, you have control over two variables:
The sampling rate - Controls how many samples are taken per second
The sampling precision - Controls how many different gradations (quantization levels) are possible when taking the sample
In the following figure, let's assume that the sampling rate is 1,000 samples per second and the precision is 10 gradations:
The green rectangles represent samples. Every one-thousandth of a second, the ADC looks at the wave and picks the closest number between 0 and 9. The number chosen is shown along the bottom of the figure. These numbers are a digital representation of the original wave. When the DAC recreates the wave from these numbers, you get the blue line shown in the following figure:
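The figure's setup -- 1,000 samples per second, with each sample snapped to the closest number between 0 and 9 -- is easy to act out in Python. The 100-hertz tone below is a made-up stand-in, not the actual wave in the figure:

```python
import math

rate = 1000          # samples per second, as in the figure
levels = 10          # gradations 0 through 9
samples = []
for i in range(10):  # ten samples = ten milliseconds of sound
    t = i / rate
    v = math.sin(2 * math.pi * 100 * t)              # analog value in [-1, 1]
    nearest = int((v + 1) / 2 * (levels - 1) + 0.5)  # pick the closest of 0..9
    samples.append(nearest)

print(samples)       # one digit per sample, like the numbers under the figure
```

Each printed digit is one green rectangle: the wave's height at that instant, forced to the nearest of ten allowed values.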
You can see that the blue line lost quite a bit of the detail originally found in the red line, and that means the fidelity of the reproduced wave is not very good. This is the sampling error. You reduce sampling error by increasing both the sampling rate and the precision. In the following figure, both the rate and the precision have been improved by a factor of 2 (20 gradations at a rate of 2,000 samples per second):
In the following figure, the rate and the precision have been doubled again (40 gradations at 4,000 samples per second):
You can see that as the rate and precision increase, the fidelity (the similarity between the original wave and the DAC's output) improves. In the case of CD sound, fidelity is an important goal, so the sampling rate is 44,100 samples per second and the number of gradations is 65,536 (each sample is stored as a 16-bit number, and 2^16 = 65,536). The sampling rate is slightly more than twice the 20,000-hertz upper limit of human hearing, which, by the Nyquist sampling theorem, is fast enough to capture any frequency we can hear. At this level, the output of the DAC so closely matches the original waveform that the sound is essentially "perfect" to most human ears.
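You can measure this improvement directly: quantize a pure tone at a given precision, convert it back, and record the worst mismatch. The sketch below does exactly that; the 500-hertz tone and the 0.01-second window are arbitrary choices for illustration:

```python
import math

def max_quantization_error(levels, rate=44100, freq=500, duration=0.01):
    """Worst-case gap between a pure tone and its quantized samples."""
    worst = 0.0
    for i in range(int(rate * duration)):
        t = i / rate
        v = math.sin(2 * math.pi * freq * t)       # original analog value
        level = round((v + 1) / 2 * (levels - 1))  # ADC: closest gradation
        rebuilt = level / (levels - 1) * 2 - 1     # DAC: back to [-1, 1]
        worst = max(worst, abs(rebuilt - v))
    return worst

for levels in (10, 20, 40, 65536):
    print(levels, max_quantization_error(levels))
```

The worst-case error is bounded by half a gradation, so each doubling of the precision roughly halves it; at 65,536 gradations the error is smaller than about 1/65,535 of the wave's full swing. (This sketch measures only the precision error at the sample instants; the sampling rate controls how well the wave is tracked between samples.)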
CD Storage Capacity
The CD's sampling rate and precision produce a lot of data. On a CD, the digital numbers produced by the ADC are stored as bytes, and it takes 2 bytes (16 bits) to represent 65,536 gradations. There are two sound streams being recorded (one for each of the speakers on a stereo system). A CD can store up to 74 minutes of music, so the total amount of digital data that must be stored on a CD is:

44,100 samples/channel/second x 2 bytes/sample x 2 channels x 74 minutes x 60 seconds/minute = 783,216,000 bytes
That is a lot of bytes! Storing that many bytes on a cheap piece of plastic tough enough to survive the abuse most people put a CD through is no small task, especially when you consider that the first CDs came out in 1982. Read How CDs Work for the complete story!
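The capacity arithmetic is simple enough to check in a few lines of Python, using only the numbers given in the text:

```python
sample_rate = 44100          # samples per second, per channel
bytes_per_sample = 2         # 16 bits cover 65,536 gradations
channels = 2                 # stereo: left and right
seconds = 74 * 60            # 74 minutes of music

total_bytes = sample_rate * bytes_per_sample * channels * seconds
print(f"{total_bytes:,} bytes")
```

That works out to 783,216,000 bytes -- roughly 747 megabytes of raw audio data.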
Does Digital Sound Better Than Analog?
Some audiophiles believe that digital recordings fall short when it comes to reproducing sound accurately. They use an intricate language filled with jargon to describe an audio system's capabilities or shortcomings. Most of their criticisms deal with sound frequency.
Humans can hear sounds ranging from 20 hertz (Hz) to 20 kilohertz (kHz) [source: Hyperphysics]. A sound wave's frequency corresponds to our perception of a sound's pitch. The higher the frequency, the higher the pitch we hear.
Audiophiles use terms like full, warm and airy to describe how well an audio system reproduces different frequencies. A full or warm sound comes from a system that reproduces low frequencies well. An airy sound gives the listener the impression that the instruments are playing in a spacious environment, and usually refers to sounds in the high frequency range.
Some audiophiles say that vinyl albums perform better in the lower frequencies, meaning they provide a warm sound. They argue that compact discs aren't as accurate at reproducing sounds at this range. Other people insist that there is no detectable difference between a well-produced digital file and an undamaged vinyl record.
An audiophile would likely point out that your sound system, not the medium you play on it, is the most important factor in how music sounds. But assuming you've put together a really strong system that can handle both analog and digital formats, which format should you choose when shopping for a new album?
It depends on the recording method. If the recording artist used an analog format to create the master recording, audiophiles would argue that an analog copy of the music is best. That's because there would be no need to convert the sound from analog to digital. The copy should be an accurate representation of the original track.
But if the artist used digital recording, then it would be best to buy the album on CD. In order to press a vinyl album from a digital recording, audio engineers must first convert the music from a digital signal back into an analog sound wave. Any time engineers have to convert a recording from one format to another, there's a chance that the quality will suffer.
In the end, the perception of musical quality is somewhat subjective. Two people standing in the same room listening to the same music might have very different opinions regarding the quality of the recording. One might describe the music as warm and airy, while the other could say it was harsh and flat. That can happen whether the listeners use digital or analog media.
So bottom line: Which is better? After much research and subjecting ourselves to hours of listening to music, we've come up with an answer. We're going to have to call this one a tie.