Always striving for a unique sound, modern music is built on techniques that have been developed, modified and twisted by countless recording artists, producers and studio engineers. While some were invented by accident and others were developed over generations of technology, these audio effects have shaped music as we know it.
For example, rather than buying an actual vintage amplifier, software plug-ins and affordable digital boxes let you replicate its signature sound, distortion and even room reverb.
Plenty of purists prefer certain original/hardware FX gear and recording tools (which can sell for inflated retro prices), but it’s great to have choice.
One of the oldest yet most recognisable music effects, the vocoder is typically used to alter the sound of speech. From harsh robot voices to warped and bent vocals, the vocoder has been used in countless songs. It’s not just music either: the vocoder has regularly been used to create voice effects for movies and TV shows. Notable examples include the voice of the Transformers character Soundwave and the Cylons in Battlestar Galactica.
The vocoder started off in the late 1920s as an experiment in creating synthesised human speech. Developed by Homer Dudley of Bell Labs, the vocoder concept was later used as a voice encryption system in World War II.
Over the following years various musicians experimented with the technology, and in 1968 Robert Moog developed one of the first solid-state musical vocoders. In 1971 the vocoder was used in the soundtrack for A Clockwork Orange.
It wasn’t until 1974, though, that the vocoder was used in a widely successful song: Kraftwerk’s Autobahn. Other notable bands such as Pink Floyd, Queen and, more recently, Coldplay have all used the vocoder.
Perhaps more extensively than any other act, Daft Punk has used the vocoder effect across its albums. One of the most recognisable uses is Around the World, where the lyrics are entirely vocoder-processed.
While diehard loyalists are quick to claim nothing sounds like an original analogue vocoder, you can now create the effect with software. The vocoder is still used for encryption too, as well as in telephone systems to compress speech to lower data rates.
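The classic channel-vocoder idea behind all of this is to split the voice (the modulator) into frequency bands, follow each band's loudness envelope, and use those envelopes to shape the same bands of a synth carrier. Here is a minimal Python sketch of that idea; the band frequencies, simple state-variable filter and smoothing constant are illustrative choices for the example, not Dudley's original design or any particular hardware's values:

```python
import math

def bandpass(samples, fc, sample_rate=44100, q=1.0):
    """State-variable band-pass filter centred on fc (Hz)."""
    low = band = 0.0
    f = 2 * math.sin(math.pi * fc / sample_rate)  # tuning coefficient
    out = []
    for x in samples:
        high = x - low - q * band
        band += f * high
        low += f * band
        out.append(band)
    return out

def envelope(samples, smoothing=0.99):
    """Track the amplitude envelope of a signal (rectify, then smooth)."""
    env, out = 0.0, []
    for x in samples:
        env = smoothing * env + (1 - smoothing) * abs(x)
        out.append(env)
    return out

def vocode(modulator, carrier, bands=(300, 800, 2000), sample_rate=44100):
    """Channel vocoder: scale each carrier band by the modulator's
    envelope in that band, so the carrier 'speaks'."""
    n = min(len(modulator), len(carrier))
    out = [0.0] * n
    for fc in bands:
        env = envelope(bandpass(modulator[:n], fc, sample_rate))
        car = bandpass(carrier[:n], fc, sample_rate)
        for i in range(n):
            out[i] += env[i] * car[i]
    return out
```

A real vocoder uses many more bands (and better filters), but the structure is exactly this: analysis envelopes from the voice, applied band by band to the carrier.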
An electronic sound-processing technique, phasing splits a signal into copies with different phase delays and mixes them back together, creating interference and a sweeping audio effect.
Phaser effects are often used with electric guitar, where a pedal controls an out-of-phase copy of the main signal, adding interesting wah-wah-like movement to an otherwise clean sound.
A phaser can also be used to make a sound more synthetic, for example turning a human voice into something robotic or alien.
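As a concrete (and much simplified) illustration of the split-shift-mix idea, here is a phaser sketch in Python: cascaded first-order all-pass filters shift the phase, a slow LFO sweeps them, and the result is mixed with the dry signal. The stage count, LFO rate and coefficient range are arbitrary choices for the example:

```python
import math

def phaser(samples, sample_rate=44100, lfo_hz=0.5, stages=4, mix=0.5):
    """Minimal phaser: sweep cascaded first-order all-pass filters with
    an LFO, then mix the phase-shifted copy back with the dry signal.
    Interference between the copies creates moving notches."""
    x1 = [0.0] * stages  # per-stage previous input
    y1 = [0.0] * stages  # per-stage previous output
    out = []
    for n, dry in enumerate(samples):
        # The LFO slowly varies the all-pass coefficient, which moves
        # the notch frequencies up and down.
        lfo = math.sin(2 * math.pi * lfo_hz * n / sample_rate)
        a = 0.3 + 0.25 * lfo  # kept in a stable range
        wet = dry
        for s in range(stages):
            y = -a * wet + x1[s] + a * y1[s]  # first-order all-pass
            x1[s], y1[s] = wet, y
            wet = y
        out.append((1 - mix) * dry + mix * wet)
    return out
```

Pedal phasers work the same way, just with the sweep shaped by analogue circuitry rather than a loop.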
Phaser has been used by many musicians, but one of its most prominent users was Eddie Van Halen in the 1980s. Tracks such as Eruption and Atomic Punk made heavy use of the effect. In more recent times, Daft Punk used phaser on their 2001 album Discovery.
More recently we have this great phasing example from Australia’s Tame Impala. You can hear the band’s signature psychedelic phased guitars from the very start of Solitude is Bliss. Band leader Kevin Parker once joked, “if I could put my breakfast through a phaser pedal, I promise you I would.”
Flanging is created by duplicating an audio signal, delaying one copy slightly and mixing the two back together. The result is an ethereal, swirling harmonic sound, and varying the delay makes the effect sweep up and down.
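The duplicate-delay-mix recipe maps almost directly to code. A minimal Python sketch, where the delay range and LFO rate are illustrative values rather than anything from a specific unit:

```python
import math

def flanger(samples, sample_rate=44100, max_delay_ms=5.0,
            lfo_hz=0.25, mix=0.5):
    """Minimal flanger: mix the signal with a copy whose delay is
    swept between 0 and max_delay_ms by a slow LFO."""
    max_delay = int(sample_rate * max_delay_ms / 1000)
    out = []
    for n, dry in enumerate(samples):
        # LFO (0..1) sweeps the delay time up and down.
        lfo = 0.5 * (1 + math.sin(2 * math.pi * lfo_hz * n / sample_rate))
        delay = int(lfo * max_delay)
        delayed = samples[n - delay] if n >= delay else 0.0
        out.append((1 - mix) * dry + mix * delayed)
    return out
```

The very short, moving delay is what distinguishes flanging from an echo: the two copies interfere rather than being heard separately.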
Flanging supposedly originated from playing back two identical recordings on tape, with one physically slowed slightly by pressing on the tape reel’s flange. Les Paul and other musicians experimented with flanging in the 1940s and ’50s, but it wasn’t until the ’60s that it was further developed and more widely used.
The Beatles used flanging on a number of songs, one of the most famous being Tomorrow Never Knows. In the ’70s it became possible to create the flanging effect with digital circuits, and these days it can easily be created on a computer.
Incidentally, The Beatles also helped pioneer intentional guitar feedback, close-miking of acoustic instruments, sampling, backwards tapes, the music video, the concept album, stadium concerts and more! At The Beatles’ request, engineers at Abbey Road Studios also invented artificial/automatic double-tracking (ADT), a tape-delay technique for more naturally doubling (thickening) voices or instruments on a recording.
Created by Antares Audio Technologies, Auto-Tune is a processing effect originally developed to help hide off-key vocal mistakes. It slightly shifts notes to the nearest true semitone, but can also be used more aggressively to modify and distort vocal pitch. (These days Celemony’s Melodyne software gives Auto-Tune a serious run for producers’ money.)
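The underlying correction step, snapping a detected pitch to the nearest equal-tempered semitone, is a one-line formula. A small Python sketch of just that step (real pitch correction also has to detect the pitch and resynthesise the audio, which is the hard part and is omitted here):

```python
import math

A4 = 440.0  # reference pitch in Hz

def snap_to_semitone(freq_hz):
    """Return the nearest equal-tempered semitone frequency to freq_hz.
    Hard snapping like this, with no smoothing, is what produces the
    exaggerated 'Believe'-style effect; gentle correction would glide
    toward the target note instead."""
    semitones = 12 * math.log2(freq_hz / A4)   # distance from A4 in semitones
    return A4 * 2 ** (round(semitones) / 12)   # back to Hz at the nearest note

# A slightly flat A4 (435 Hz) snaps back to 440 Hz:
print(round(snap_to_semitone(435.0), 1))  # prints 440.0
```

The speed of that snap is the key setting: instant snapping sounds robotic, while slower correction is (ideally) inaudible.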
For many, Auto-Tune was first made famous by Cher’s Believe, which used it as an effect in itself, not just a way to improve the vocals. The effect was used at its most extreme setting, correcting the pitch of each note in a harsh, artificial way to create a unique sound.
R&B singer T-Pain also helped popularise Auto-Tune as an audio effect in its own right. The phone app I Am T-Pain simulated the Auto-Tune effect on your phone and was downloaded over 300,000 times in the three weeks after launch.
Using Auto-Tune has become fairly standard practice in the industry, especially for live performances, though many musicians are vocally opposed to it. Also see: every other track from Kanye West, and Auto-Tune the News.
Even animals are getting in on the action, with Mishka the Auto-Tuned singing husky.
Most notably used with electric guitars, the wah-wah effect alters tone by sweeping a peak of emphasis up and down the frequency range.
While the concept itself was not new (dating back to trumpet players in the 1920s), the wah-wah pedal as we know it was not invented until 1966. It was actually an accident, created during a redesign of the Vox Super Beatle guitar amplifier.
The wah-wah pedal can also be held in one position to boost specific frequencies and emphasise an instrument’s ‘sweet spot’, a favourite technique of Jimi Hendrix.
The wah-wah effect is also used beyond guitars, on everything from trumpets to electric pianos and violins.
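In signal terms, a wah-wah is a resonant band-pass filter whose centre frequency is swept, by the pedal or, in an 'auto-wah', by an LFO as below. A minimal Python sketch using a state-variable filter; the frequency range, sweep rate and Q are illustrative values:

```python
import math

def wah(samples, sample_rate=44100, lfo_hz=2.0,
        f_min=400.0, f_max=2000.0, q=0.5):
    """Minimal auto-wah: a state-variable band-pass filter whose centre
    frequency is swept by an LFO, mimicking rocking the pedal."""
    low = band = 0.0  # filter state
    out = []
    for n, x in enumerate(samples):
        # Sweep the centre frequency between f_min and f_max.
        lfo = 0.5 * (1 + math.sin(2 * math.pi * lfo_hz * n / sample_rate))
        fc = f_min + (f_max - f_min) * lfo
        f = 2 * math.sin(math.pi * fc / sample_rate)  # tuning coefficient
        high = x - low - q * band
        band += f * high
        low += f * band
        out.append(band)  # band-pass output gives the vocal 'wah' character
    return out
```

Holding `fc` fixed instead of sweeping it reproduces the parked-pedal 'sweet spot' trick mentioned above.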
When any sound is produced, it reflects off different surfaces in different ways, bouncing around until it’s absorbed. This is called reverberation. It differs from echo in that the reflected signals arrive within about 50ms of the original, blending into it to give a warmer, more spacious sound.
Reverberation gives music a specific tone and feel depending on the environment. Performance areas such as concert halls are designed so reverberation improves the sound, rather than letting it interfere with itself or create echoes.
Reverberation is also a factor in instrument and speaker design, where the natural resonance can create different effects.
Rather than relying on the sound of a specific space, reverberation can be generated artificially. Classic reverb systems place a transducer and a pickup at each end of a spring or metal plate, capturing its vibrations as the reflections. Reverb effects were used by bands such as The Beatles and Pink Floyd in the 1960s.
In more recent times digital systems can create reverb and mimic the sound of specific environments. Running on a computer, reverb can be simulated for famous locations such as the Sydney Opera House, or for weirder spaces such as the cockpit of a 747.
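Digital reverbs can also be far simpler than full room simulations. A classic starting point is the Schroeder design: several parallel feedback comb filters with mutually prime delay lengths, so the repeating echoes interleave into a diffuse tail. A minimal Python sketch; the delay lengths and feedback amount are illustrative, not from any particular unit:

```python
def reverb(samples, mix=0.3):
    """Minimal Schroeder-style reverb: sum several parallel feedback
    comb filters with mutually prime delay lengths, so the echoes
    smear into a diffuse tail rather than one distinct repeat."""
    delays = [1687, 1601, 2053, 2251]   # delay lengths in samples
    feedback = 0.7                      # echo decay per pass
    buffers = [[0.0] * d for d in delays]
    out = []
    for n, dry in enumerate(samples):
        wet = 0.0
        for buf, d in zip(buffers, delays):
            i = n % d                    # circular buffer index
            y = buf[i]                   # signal delayed by d samples
            buf[i] = dry + feedback * y  # feed input plus decayed echo back
            wet += y / len(delays)
        out.append((1 - mix) * dry + mix * wet)
    return out
```

Real designs add all-pass stages after the combs for extra diffusion, but even this skeleton produces a recognisable reverb tail.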
While most bands use reverb in some form or another, some put it to greater effect, such as Pink Floyd in Sorrow. The Phil Collins song In the Air Tonight uses gated reverb to create its unique punchy drum sound.