Unless you are doing your recording old-school-style using magnetic tape, you will need to be able to convert analog signals into digital signals.
This is because microphones and musical instruments produce analog audio signals, and the recording software in your computer can only deal with digital signals.
The piece of equipment that converts analog signals to digital signals is called an analog-to-digital converter (ADC), but what does it actually do?
What does an analog-to-digital converter do? An analog-to-digital converter converts analog signals from a microphone or instrument into digital signals that can be understood by computer software. The two most important aspects of analog-to-digital conversion are the sample rate and the bit depth. The sample rate refers to the number of times the analog signal is sampled each second. The bit depth refers to the amount of information, or detail, that each sample captures.
The computer you use for music production or other recording will be equipped with an audio interface, also known as a sound card. This may be inside your computer, but for music production it is likely to be an external audio interface.
As outlined above, musical instruments and microphones usually produce an analog electrical signal that represents the audio sound produced.
The circuitry inside the audio interface that converts the analog audio signal to a digital signal is called an “analog-to-digital converter” (ADC). The circuit that converts digital signals into analog signals is called a “digital-to-analog converter” (DAC).
Even though this signal conversion goes both ways, when you hear or read about this process it’s often just referred to as analog-to-digital conversion (ADC), so I’ll stick with that here.
Two Important Factors in Analog to Digital Conversion (And Back Again)
There are two factors in the conversion of analog to digital signals (and vice-versa), which can have a big effect on the quality of the digital signal captured for recording, and the quality of sound output from the computer.
These factors are the sample rate and the bit depth, or resolution.
An analog-to-digital converter works by taking lots of tiny snapshots of the audio signal so that it can be represented digitally.
This is a bit like an old movie camera taking lots of still images of a moving scene on a long strip of photographic film. Although each image is static, if you run the film through a projector, displaying each image one after the other at high speed, you see the movement on the screen.
If you only take a couple of images each second you aren’t going to capture much movement and the projected film won’t really represent what was happening.
As you take more and more images each second you capture the original movement more faithfully, until you get to the point where more images don’t really add to the quality so it’s pointless taking them any more frequently.
It’s the same idea with an analog-to-digital converter. If you play back the snapshots of the sound one after another you can produce a digital version of the original analog signal.
If you don’t take snapshots of the audio signal very often then you aren’t going to capture the sound very faithfully. But as you increase the frequency with which the snapshots are taken, called the sample rate, then you get a more and more accurate representation of the original audio signal.
This is called the sample rate because each snapshot is called a sample, and the number of snapshots taken each second is called the sample rate. Sample rate is described in the same way as other frequencies, that is as Hertz (Hz), which refers to cycles per second.
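To make this concrete, here’s a small Python sketch (my own illustration, not taken from any audio library) that takes evenly spaced snapshots of a sine wave, just as an ADC does:

```python
import math

def sample_sine(freq_hz, sample_rate_hz, duration_s):
    """Take evenly spaced 'snapshots' of a sine wave, one per sample period."""
    n_samples = int(sample_rate_hz * duration_s)
    return [math.sin(2 * math.pi * freq_hz * n / sample_rate_hz)
            for n in range(n_samples)]

# One millisecond of a 440 Hz tone sampled at 44,100 Hz ("CD quality")
samples = sample_sine(440, 44_100, 0.001)
print(len(samples))  # 44 snapshots in that millisecond
```

Each entry in the list is one sample; play the samples back at the same rate and you reconstruct the tone.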
For low frequency sounds you don’t have to take snapshots of the audio signal very often to get an accurate representation. With high frequency sounds, each wave cycle is so short that the number of snapshots taken each second has to be much higher.
Since the highest frequency sound that humans can hear is about 20kHz, the snapshot rate, or sample rate, needs to be higher than that to make sure the analog audio signal is captured accurately.
It has been shown (this is the Nyquist–Shannon sampling theorem) that the sample rate needs to be at least twice the highest frequency in the signal being converted from analog to digital. The highest frequency we can hear is about 20kHz, so the sample rate needs to be at least 40kHz.
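You can see why “at least twice” matters with a quick sketch (mine, with hypothetical frequencies): if you sample a tone above half the sample rate, the samples you capture are identical to those of a lower-frequency “alias”, so the converter literally cannot tell the two tones apart.

```python
import math

SAMPLE_RATE = 40_000  # sampling at exactly twice a 20 kHz limit

def sample(freq_hz, n_samples):
    """Snapshot a sine wave of the given frequency at SAMPLE_RATE."""
    return [math.sin(2 * math.pi * freq_hz * n / SAMPLE_RATE)
            for n in range(n_samples)]

# A 30 kHz tone is above half the sample rate, so it "folds back":
# its samples match those of a (phase-inverted) 10 kHz tone exactly.
too_high = sample(30_000, 8)
alias = sample(10_000, 8)
for a, b in zip(too_high, alias):
    assert math.isclose(a, -b, abs_tol=1e-9)
```

That folding effect is called aliasing, and it’s the reason converters filter out frequencies above half the sample rate before sampling.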
The most widely used sample rate, as used in CDs, is 44.1kHz. The exact figure is a historical quirk (early digital audio was stored on video recorders, and 44.1kHz fitted neatly into their format), but in practice you can think of it as 40kHz plus a safety margin.
If you check the settings when you have an external audio interface connected to your computer you will probably see that it can also provide a sample rate of 48kHz and maybe even higher. We’ll talk about why you might need these higher sample rates another time.
The other important factor in the conversion of analog audio signals to digital, and back again, is bit depth.
Each of the snapshots, or samples, of the sound waves captured in the process described above contains a certain amount of information. If you think about the individual images in an old movie film, the more detailed each image is the more information you have captured.
When the film is run through the projector the moving image looks rich and detailed (and runs smoothly because of the high snapshot, or sample, rate we talked about earlier).
In an analog-to-digital converter the more bits of information you can use in each sample to represent the audio signal (I’m talking computer bits here – 1s and 0s), the more accurately you can capture the audio.
If you use one bit to represent the signal in each sample it can only be on or off. If you use two bits you can represent four levels of the height of the wave, with three bits you can represent eight, four bits can do sixteen and so on.
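As a sketch (my own, and simplified: real converters use more sophisticated encoding than this), here is what snapping a sample to the nearest of those levels looks like in Python:

```python
def quantize(value, bit_depth):
    """Snap an analog value in [-1.0, 1.0] to the nearest of 2**bit_depth levels."""
    levels = 2 ** bit_depth
    step = 2.0 / (levels - 1)            # spacing between adjacent levels
    return round((value + 1.0) / step) * step - 1.0

print(quantize(0.3, 1))    # 1.0  -- one bit: the sample is simply "on"
print(quantize(-0.3, 1))   # -1.0 -- or "off"
print(quantize(0.3, 16))   # very close to 0.3: the steps are tiny
```

With one bit the sample collapses to full-on or full-off; by 16 bits the error introduced per sample is already minuscule.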
So how many digital bits do you need to use to represent a sound wave in an acceptable way?
The minimum bit depth that seems to be used by most audio interfaces is 16 bits. This means that the analog-to-digital converter is taking samples of the sound that can represent 65,536 different points in the height of the analog audio signal’s waveform.
Most external audio interfaces used for music production now offer bit depths higher than 16-bit; for example, 24-bit and 32-bit resolutions are common.
24 bits lets the analog-to-digital converter represent 16,777,216 different points and 32 bits provides scope for 4,294,967,296. You can see that the resolution of the samples being taken from the sound increases significantly as the bit depth is increased.
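Those level counts are just powers of two, which you can verify directly:

```python
# Each extra bit doubles the number of levels: levels = 2 ** bit_depth
for bits, quoted in [(16, 65_536), (24, 16_777_216), (32, 4_294_967_296)]:
    assert 2 ** bits == quoted
    print(f"{bits}-bit: {quoted:,} levels")
```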
Sound Cards and Audio Interfaces
People sometimes ask if they really need to buy an external audio interface for recording and music production. They point to the sound properties for their computer, which shows that the computer’s internal sound card is capable of achieving a perfectly adequate sample rate and bit depth.
This is true, but it’s the quality of the analog-to-digital converter that makes the difference when you use an external audio interface.
These interfaces have circuits dedicated to analog-to-digital conversion. The audio interface inside your computer is connected to the rest of the computer’s circuitry and has to compete for resources with other processes running at the same time.
This can lead to poor quality conversion of analog signals on the way into the computer, which results in poor quality recordings. When digital signals are converted to analog signals on the way out of the computer, you may find distortion or missing frequencies in the sound.
Companies that produce audio interfaces want to make them as useful as possible. You now have a range of amazing all-in-one input/output devices to choose from that can do more than just convert your analog signals into digital ones.
However, it’s analog-to-digital conversion, and back again, that is the most important function of your external audio interface, and you only get what you pay for.