
10 Beginner Mistakes to Avoid When Mixing Music
by Nick Messitte, iZotope Contributor June 19, 2018

So you have an interest in mixing music, and you’re looking to avoid the pitfalls that swallow up
lesser engineers. That, dear reader, is very wise of you.

What follows are some of the biggest traps I’ve seen bedevil nascent engineers. If you find yourself
guilty of any of these, fear not, I’ve been guilty of them too! So have we all. Read on, and learn from
our mistakes.

1. Employing too much processing on a track


Piling oodles of plug-ins on individual tracks is by far the biggest mistake I see in beginners. We’ve
all fallen victim to this, and it’s hard, because we see our idols do it on tutorials, in blog posts, and
sometimes in person, if we’re lucky.

But our idols have a method to their madness. When Dave Pensado displays a six plug-in chain on a
YouTube video, he’s showing you a step-by-step blueprint. He knows why one EQ might be right
for the high-pass on a track, why the next is appropriate for boosting the high mids, why such-and-
such compressor has the perfect complementary tone, and most importantly, how all the plug-ins
are going to work together.

Yes, it’s important to search for the right moves, to experiment with new tones, but experimentation
can be detrimental—in the mixing phase at least—if you don’t have an idea of the sound you’re
going for in your head, if not a concrete reference of the sound on hand.

2. Trying to turn a sound into something it isn’t


I once watched a talented engineer who had just graduated from audio school sit with a classical
recording. Little by little, he turned the sound of a baritone vocalist into something overly
reverberant, a bit harsh, and altogether unnatural. Eventually, he turned to me and said “it’s not
working, is it?”

So he did something daring: he took off all the processing and limited himself to two plug-ins and
one send effect. To his surprise, it turned out quite well. I’ve since heard classical recordings this
man engineered, and they are as excellent as anything I’ve listened to.

I take no credit for his evolution—I didn’t offer any advice or criticism; limiting himself to two
concrete plug-ins was his idea. I only watched him learn that you don’t have to work so hard,
especially if you don’t fight the natural sound.

It comes back to this: Whenever you’re reaching for a new plug-in, do you know what you’re trying
to achieve with this next move? Are you serving or fighting the sound?
Of course, the answer to this question is dependent on maintaining a clear picture of what you want
to accomplish, which brings us to our next pitfall:

3. Not having a clear idea of what you want to do


This is an affliction that doesn’t just affect beginners—my peers and I get caught up in this one all
the time. It can be enthusiasm as much as anything else: We just want to keep working, so we often
don’t let ourselves stop to wonder what we’re trying to accomplish in the first place.

Now, a contractor would never build a house with only a loose idea of where the bedroom is. So
why do we think we can fashion a mix with little idea of how we want the bass to end up?

It may be suitable for artists and producers to mess around, but I’d wager our job, in the mixing
phase, is more like artfully executing blueprints than painting a landscape. If you agree, then
it’s best to have a clear idea in mind—even if it’s only a glint—of what you want to do before you
set out doing it.

4. Not paying attention to gain-staging


You’d think that in this world of floating-point mathematics, conventional gain staging could be laid
to rest. However, many plug-ins, especially analog emulations, respond to the strength of the signal
in the manner of yesterday’s gear. If you use analog gear/emulations on an aux track, juicing the level of
the instruments feeding that aux might distort the sound past what you’d want. This can become
problematic, especially in large sessions, where you need to pay attention to the levels of many
moving parts.

Also, you’re stuck in your own system if you don’t pay attention to good gain structure. What do I
mean by this? Live mixing, working off a producer’s session, using an analog board in any
recording capacity—these money-making tasks are much harder to execute when your modus
operandi doesn’t play well with these formats, some of which call for proper, analog-style gain-staging.

That’s why I tend to treat signal flow within a DAW as I would an analog console. It keeps me
more mobile, should I need to be. It also helps keep my sessions more organized, in case I decide I
need to assign channels or whole groups of channels to new/different busses.

Signal Flow Chart
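To make the idea concrete, here’s a minimal Python sketch of analog-style gain staging: measure a track’s average level and compute the trim needed to hit a nominal operating point. The -18 dBFS target is an assumption, a common stand-in for analog 0 VU; check what your particular emulation expects.

```python
import math

def rms_dbfs(samples):
    """RMS level of a block of float samples (full scale = 1.0), in dBFS."""
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return 20 * math.log10(rms) if rms > 0 else float("-inf")

def trim_to_nominal(samples, target_dbfs=-18.0):
    """Gain in dB to apply so the block averages around target_dbfs,
    roughly where many analog-modeled plug-ins expect to see signal."""
    return target_dbfs - rms_dbfs(samples)
```

A full-scale sine measures about -3 dBFS RMS, so this would suggest pulling it down roughly 15 dB before it hits an analog-modeled plug-in.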

5. Not paying attention to phase relationships


When you’re just starting out, it’s hard to know if two signals are out of phase, especially if no
one’s around to teach you how to recognize the predicament.

I remember one song sent to me by an old friend; he had complained about a generally washy
feeling to the drums in the rough mix. The overheads, it turned out, were out of phase with each
other—just flipping the polarity on the right overhead greatly tightened up the mix.

Indeed, drums are often problematic, so here’s what I do when presented with a multi-miked kit: I
check the phase of the overheads against each other, flipping one of the overheads to see which
gives me a more cohesive, solid picture. It’s usually a night and day difference.

Then I test the other mics against the overheads in solo. I listen for which combination has more
body in the lows and low-mids. I also watch the meters—chances are, the polarity arrangement that
yields the higher level will be the one that’s in phase.
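The meter trick above can be sketched in a few lines of Python: sum one mic against the other at both polarities and keep whichever combination is louder. Treat it as a rough heuristic, not a substitute for your ears.

```python
import math

def rms(x):
    """RMS of a block of float samples."""
    return math.sqrt(sum(v * v for v in x) / len(x))

def best_polarity(mic_a, mic_b):
    """Sum mic_b with mic_a at both polarities; the louder sum is
    (usually) the more in-phase combination."""
    normal = rms([a + b for a, b in zip(mic_a, mic_b)])
    flipped = rms([a - b for a, b in zip(mic_a, mic_b)])
    return ("normal", normal) if normal >= flipped else ("flipped", flipped)
```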

But equally troubling for beginners, I’d wager, can be knowing when to leave elements out of phase
—or knowing when to manipulate phase relationships for intentional effect.

Drums don’t usually apply here, but multi-miked guitar cabs do. In this case, you can think of the
phase relationship between two mics as an opportunity for tonal variation—an EQ, almost. Keep in
mind that the quality of sound will change depending on the relational level of the tracks too.

To sum up: when mixing elemental instruments like drums and bass, check for phase, and try to
favor the cohesive picture. When mixing elements are not so foundational to the track, learn to use
phase relationships to create the best tonal picture.

6. Putting reverb on every track


I remember my early fear of a dry signal. Out of this fear, I’d slap reverb on nearly everything. But
in my early days, this approach yielded nothing but a pea soup of sound.

I was not yet cognizant of how differing reverbs signify particular trends or genres, or how some
sounds might have been recorded with reverb already—guitars being a good example, but also
synths given to you by a producer.

Once again, learning grew from limitation. So if putting reverbs on every track sounds like your
bag, I invite you to try the following: Limit yourself to only four or five reverbs across a whole mix
—and shoot for fewer if possible. Maybe apply some verb to the drums, the vocals, a touch of
“exploding snare,” and a spring verb on an otherwise dry guitar.

The same goes for other effects. In our efforts to make everything interesting we can dull the overall
impact of the entire mix. Therefore, learn the intentionality behind modulation, delay, and
conventional pitch variance. Understand what, exactly, a phaser will get you, as opposed to a
flanger or a chorus. Learn how delays can expand the spaciousness of a sound (a synced, low-level
stereo delay), or establish a genre (a rockabilly slap, for example).

For related reading, check out our blog “9 Common Reverb Mistakes Mixing Engineers Make.”

7. Working in solo for prolonged periods of time


I’ve written about this at length in other articles, but it’s a classic mistake, one not ameliorated by
the plethora of YouTube tutorials out there. Sure, plenty of big names have great tips to offer, and
to demonstrate these tips, they’ll often play their results in solo so you can better hear them.
However, these engineers don’t always remember to warn you about working in solo; if you didn’t
know, and if you stumbled on the video, it might convey the wrong idea.

So let’s dust off the old saw and reiterate that, generally, it’s not good to mix a single sound in solo, as
you lose perspective quite quickly. Still, there are caveats:

For brief moments in time where it’s necessary to home in on a problematic part of the sound—like
a resonant snare drum—soloing is appropriate. Also, there is nothing wrong with soloing a group of
tracks. To mix the drums, the bass, and vocal in solo in order to achieve a better micro balance is
useful in short intervals.

8. Not paying attention to timing and tuning


So many times I’m presented with a mix and asked, why doesn’t this sound like the real thing? Half
of the time there are sonic issues, but often it’s the editing: If something is off key or out of time, it
falls upon our shoulders to fix it as best we can—but always in line with the artist’s intentions.
Nobody is going to autotune Bob Dylan (at least, I hope not), but Justin Bieber is another story.

Similarly, a band like the White Stripes would get more off-the-grid leeway than an outfit like
Imagine Dragons. It behooves you, again, to vet the intentions, to check the references, and to make
the changes.

9. Adding tons of bass and treble


Ah, the dreaded smile curve! It’s brought many a frown to budding engineers the world over. Rest
assured, we’ve all been there, piling on bottom end and treble as though they’re ingredients that can
never go sour.

This is one of the larger mistakes, leading to ear fatigue and poor translation across speaker systems
(you’re already applying a “Beats” curve to music that might very well be played on Beats
headphones—and Double Beats is never good!).

I don’t know if there’s a remedy for this other than time and referencing. In my experience, the longer I
engineer, the less I feel the need to exaggerate bass and treble. As with the earlier pitfalls,
discipline in accordance with referencing is the way out: don’t do it unless you know, and have
verified with a suitable reference, that the track calls for it.

10. Going for a mastered sound


This pitfall is easily understandable, because everyone wants to have a fully polished record right
from jump. Who among us hasn’t fantasized about the mastering engineer saying “this needs
nothing”? Making the desire more tantalizing is the plethora of tutorials where Grammy-winning
mixers show us their fully-limited mix-bus chains.

You must remember, at your beginnings, that these are seasoned engineers—and one day, you will
be too!

When you’re just starting out, it’s good to match a master in terms of timbre, sure, but not in terms
of loudness or level, because the tools to secure those higher levels are harder to hone.
Instead, bring the reference down to give you headroom, so you don’t have to fight with the digital
ceiling. Give yourself the chance to learn how signals play off of each other with reasonable
headroom before you worry about shaving off the peaks. Otherwise, you’re in danger of fashioning
a harsh mix, and what is worse, prolonging the learning process into a series of plateaus. It’s like
that old phrase: you’ve got to learn to walk before you can run.

Conclusion
Having written this article on ten common beginner mistakes, I’d like to invite you to do something
counterintuitive: Dive head first into every one of them. Take a project you’ve already mixed and
start from scratch. Spend an hour on a weekend working on everything in solo, applying reverb to
every track—and doing all of it with a complicated master chain in place.

This isn’t meant to be snide, negative reinforcement. Rather, I think two things may come of this
enterprise: you may hear for yourself the detrimental effects of these pitfalls, or conversely, you
may stumble upon something unique and amazing. Both outcomes are great, and they both serve the
larger goal of experimenting.

There is nothing wrong with experimentation, though you may find controlled experimentation to
be of better use.
11 Mixing Tips for Panning Music with Intention
by Nick Messitte, iZotope Contributor July 25, 2018

Plenty of articles cover the logistics of panning—we even provide a good one here, which gets into
panning music for frequency separation and practical considerations for panning in electronic
music.

What we aim to do in this post, however, is to cover panning with intention—put another way, why
(or why not) you’d choose to pan something. Below you’ll find practical tips to intentional panning,
starting on an instrument-by-instrument level and expanding into headier concepts.

1. Pan overheads for a centered snare


If your intention is to have drums with a clearly defined snare—one that sits in the center of the mix
—that decision often starts not with the snare mic, but with the overheads.

Bring them up in solo—it’s okay, I won’t tell anyone—and pan them left and right at equal volume.
Listen in headphones for a second: does the snare lean toward the left of the picture? If so, bring the
left overhead channel closer to center by degrees until the snare is located in the middle. How’s
your kick doing now? Is it off to the side a smidge? You can compensate with minimal panning of
the right channel, or you can leave it: the kick drum mic will draw more focus to the center than the
close-snare mic, whose fundamental frequencies compete more with what the overheads are likely
to capture.

I prefer this audio panning technique to time-aligning plug-ins most of the time, because it feels
more organic to my ears. Plug-ins like Auto-Align can do wonders for a picture-perfect image, but
you might find they flatten out the sound, or snuff out the energy a bit.

Post script: if you don’t want a centered snare, that’s fine (a slightly off-center snare works well in
the Tool song “Ænema,” for example) but all the positioning recommendations above still apply,
and that’s because of this next tip:

2. Pan the close mic drums to match the overheads


If your intention is to create naturalistic, acoustic drums, make sure your close mics match the aural
positioning of your overheads. You may want the widest tom sound possible, but it is quite difficult
to negate the inclinations of the overheads and not sound immediately different from them.

This can be your intent, sure, but intention is everything: the snap and ambiance of close-mic’d
drums are frequently found in stereo overheads, so tread carefully, lest you induce disorienting
phasing mixups.

3. For hard rock guitars, try hard panning


This is a basic technique, but it bears repeating. If you have two hard rock guitars doubling each
other, the quickest way to that classic hard-rock sound is to pan them all the way left and all the
way right. That’s the convention most of the time, and if your intention is to fit or play within this
convention, that’s your move.

Examples of this technique in practice include “Sweet Leaf” by Black Sabbath, “Back in Black” by
AC/DC, “Master of Puppets” by Metallica, “Undone - The Sweater Song” by Weezer, “Spoonman”
by Soundgarden, and many, many, many more.

4. Try to get rid of fake stereo keyboards


Many times a software keyboard comes my way as a stereo track and, after listening to it, the first
thing I do is trash one of the sides. Why? Because the mix has so many other things going on—
some of them mic’d in stereo—that instruments made wide with delays and modulations only
muddy up the image.

Sometimes panning music with intention involves knowing when to collapse a stereo source to
mono, so you don’t obfuscate the general picture.

Indeed, good directionality in mixing comes not from stereo elements, I’d wager, but from excellent
positioning of mono sounds across the stereo spectrum. That’s how you achieve a mix that’s cleanly
defined, and easily beheld by the listener.

5. Don’t always go wide with background vocals


Left to my own devices, I’ll pan background vocals like this:

That’s a sample from my pet project, a work of audio fiction called Salmon’s Run, and sure, this is
a valid technique for stacks of background vocals. But last year I was blown away by the vocal
stack of a song from Super City, one of my favorite indie bands. The vocals sounded lush, full, and
amazing, but to my head-phonic surprise, they were panned thus:

These vocals are put largely in the center to make room for individual elements on the side. Yet
they feel full, stacked as they are in other ways—mostly with EQ and level.

Sometimes, especially for background vocals, the decision not to pan them can be even more valid,
especially in an already crowded mix. This is also a case for using references, because without
hearing a mix I like, I wouldn’t have anything to model my future work upon.

6. Use auto-panning as a genre indicator


Auto-panning mechanisms not only create interesting soundscapes, but also signify genres. A Leslie
on a vocal is a good example: slap that on a stack of voices and you’re instantly transported to Blue
Jay Way (even if the Beatles didn’t use stereo Leslies on that track).

Auto-panning electronic hi-hats and glitch vocals can automatically evoke classic 90s Electronica or
Big Beat (Check out “Kick That Man” by Meat Beat Manifesto, for example). Put some electric
guitars in reverse, slow down the panning speed, and you’re in psychedelic territory, as D’Angelo
did on Voodoo’s “The Root.” Add some eighties-style synthetic auto-panning, such as the type
found at the end of Styx’s “Mr. Roboto,” and you’re speaking the language of eighties prog-pop,
the sort of stuff that fits right into the textures of an indie artist like Bon Iver.

Listen to your favorite genres and learn what audio panning moves are typical—or what
autopanners are employed—and this will go a long way toward following or breaking convention.

7. Try LCR
LCR stands for left, center, right. If you find your mix lacks distinctive width or space, I want you
to try the following: with the sole exception of acoustic drums, try panning every element in your
mix to one of these three places. Got a bass? Put it up the center. Got a piano? Only on the right.
Double tracked guitars? We covered that already. Violin? You’re starting to get the idea.

You’d think this exercise would create a rather rigid, regimented stereo picture. But go ahead, try it,
and tell me what you notice. I’ll bet you’ll hear a cleaner stereo image, one with less clutter and
more vibrancy. Now, you can select which standouts, which quirky instruments in your mix,
deserve the interstitial places, the 3 o’clocks and 5 o’clocks. They become soloists in the mix’s
ensemble.

If you haven’t tried it, you’ll find this exercise might teach you a lot about how to create space with
minimal panning moves, allowing reverb or special instruments to occupy the space between the
corners. As someone who routinely mixes both in the box and with analog gear, I can also tell you
that an LCR approach to an ITB mix helps me achieve a feeling of width that I don’t otherwise get
when staying wholly digital.
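Under the hood, a pan pot is just a pair of gains, and LCR simply restricts you to the three extremes. Here’s a Python sketch using a constant-power pan law (an assumption on my part; DAWs offer several pan laws, so yours may differ):

```python
import math

def pan_gains(position):
    """Constant-power pan law: position -1.0 (L) .. 0.0 (C) .. +1.0 (R).
    Returns (left_gain, right_gain); center sits 3 dB down in each speaker."""
    angle = (position + 1) * math.pi / 4  # maps -1..+1 onto 0..pi/2
    return math.cos(angle), math.sin(angle)

# An LCR scheme only ever uses the three extremes:
LCR = {"L": pan_gains(-1.0), "C": pan_gains(0.0), "R": pan_gains(1.0)}
```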

Here’s an example of what a mix might look like with an LCR-like panning scheme. Other than the
drums (which are panned naturally to reflect the overheads), there’s only one standout element
given an “in-between” spot. That would be the viola.

LCR panning example in Neutron 2 Visual Mixer

8. Play with balanced and unbalanced panning schemes


There is panning for symmetry, where every element feels balanced. A shaker in one location can
counterbalance a 16th-note funk guitar part panned in the opposite direction, because the two
complement each other. This is a balanced approach.

Asymmetrical panning, on the other hand, leads to a feeling of imbalance—and this feeling can be
your friend.

Check out the song “A Distorted Reality Is Now a Necessity To Be Free” by Elliott Smith. Note
how the drums exist only in one headphone—yet this is no hastily panned stereo mix from the
sixties. This is for a jumbled effect, to denote the sad state of the singer’s song and of his life.

But keep listening and you’ll hear that during the climax, more drums come up in the opposing
speaker, playing counter rhythms and Ringo-like fills. As we grow into cacophony, the mixing
arrangement goes from unbalanced to balanced, and there is a lesson here:
Going from balanced to unbalanced and back again can create an emotional journey for the listener,
so give it a try. This applies not just to similar elements, but to complementary ones. Contrapuntal
rhythms panned in similar directions can lead to a thin feeling, while opposing them lends a
balanced width. Play with this balance not just statically, but with automated gestures as well.

9. Close your eyes and visualize


Every time I feel a soundstage getting away from me, I like to stop the playback, shut my eyes, and
picture the space I’m creating, down to the color. Is it a ballroom? If so, is it Roseland or
Hammerstein? Is it a black, all encompassing void? Well, would that be the void of a dark basement
—with all the concrete/cinder block reflections such a basement suggests—or the void of outer
space, which is soundless?

The specificity of this exercise allows me to make better decisions, not just with panning, but with
other aspects of stereo dimensionality. If I’m imagining seeing the band at the Prudential Center,
that evokes a much different landscape than watching them play a house party. From there, I can
make decisions about how far apart the band members are, how far away they seem, what their EQ
is like, and how reverberant elements of the mix might be. Which brings us to our next point:

10. Pan front to back, and behind your head


Don’t let anyone tell you that EQ and reverb can’t create a sense of physical location. They
absolutely can: they create another axis of panning, your front-to-back perspective. The further
away an element is, the less high end and the less level you’ll hear. Depending on the environment,
you’d want to filter the lows (far away but still indoors) or boost them (far away but outside).
You’ll also hear more late reflections and fewer early reflections.
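Those distance cues can be roughed out in Python. The 6 dB drop per doubling of distance follows the inverse-square law for a point source in open space; the cutoff curve is purely illustrative, not an acoustic measurement.

```python
import math

def distance_cues(distance_m, cutoff_at_1m=16000.0):
    """Rough front-to-back cues: level falls ~6 dB per doubling of
    distance, and the suggested lowpass cutoff drops as air absorbs
    highs (illustrative numbers only)."""
    level_db = -6.0 * math.log2(max(distance_m, 1.0))
    cutoff_hz = cutoff_at_1m / max(distance_m, 1.0) ** 0.5
    return level_db, cutoff_hz
```

So an element you want to sit four times farther back might get roughly 12 dB of attenuation and a gentle lowpass, before you even touch the reverb send.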

Don’t forget this vital axis, for then you are truly panning music with intention. That’s how you can
put a drummer behind a band, rather than alongside it.

You can even put elements outside the speakers, or behind your head, with polarity inverting
techniques. A simple one involves creating an auxiliary channel, reversing the stereo direction (so
that right becomes left and vice versa), and inverting the phase of the right channel. If you have an
element you wish to appear outside the speakers, you can bus it to this auxiliary channel with a
send. At certain levels, it may give you the perception of appearing behind your ears. Do not be too
liberal with this effect however; it can wreak havoc on the phase coherence of your mix.
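The routing described above is nothing more than channel math, sketched here in Python; the blend level against the dry signal is left to taste, and as noted, a little goes a long way.

```python
def outside_speakers_aux(left, right):
    """Build the 'outside the speakers' aux described above:
    swap the channels, then invert the polarity of the new right
    channel. Blend this return under the dry signal at low level."""
    aux_left = list(right)
    aux_right = [-s for s in left]
    return aux_left, aux_right
```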

11. Automate movement tastefully


Excitement can issue from dragging sounds around the stereo space. This is where automation
comes into play—not only from left to right, but from front to back, with EQ and the manipulation
of early/late reflections. As always, taste is the key: too much uncalled for panning can be
disorienting, or could date your mix in unexpected ways.

However, tasteful movement, done with intention, can add dimensionality to a mix. You could
move a rhythmic element around the headphones, as on the song “Radiator Sabotage” by Leland
Sundries. As the shaker moves around the left sphere of the mix during the sparse third verse, the
tune heads towards an exciting drum fill; the decision to move this element around contributes to
the build-up.
Another example is Radiohead’s “Identikit,” where the cascading vocals of “broken hearts make it
rain” show off front-to-back panning: suddenly an aloof Thom Yorke stands out from the
crowd on the third repeat, obliterating the other vocals in a phrase that reinforces the tune’s
loneliness. Examples of active panning can be heard all throughout D’Angelo’s Voodoo, from the
opening traverses of reverse cymbals in “Playa Playa” to the backwards guitar solo of “The Root.”
Indeed, that record is an exemplar when it comes to tasteful automated panning, with too many
examples to enumerate.

Conclusion
Doubtlessly we can find more panning scenarios and more possible solutions to all your panning
woes. Since we’ve given enough to get you started, we’re going to stop here and dwell again on
intention. Intention is the most vital thing in a mix—it dictates your every decision. It’s the why
behind your what, how, or when. It applies to all aspects of the mix, and panning is no exception. It
is our hope, now, that the tips we’ve provided will get you closer to the mix that you’re intending—
one that critics will have no reason to pan.
10 Tips for Better Mixes Through Panning
August 1, 2016

Panning helps to determine how wide our mix ends up sounding to the listener. It can be used to
create space in a mix, enhance existing space, and create a more immersive musical experience for
the audience.

In the image above, the mix is panned very narrowly, and the Stereo Vectorscope tells us the
listener will not hear much spatialization, or width.

However, in this image, the mix is panned very widely, and you can see the result: a much wider
mix without any processing required beyond panning.

Using panning musically


How do we know what to pan, and where? In much of today’s popular music, the backbeat and lead
vocal are the focal points of the mix. Because of this, the kick, snare, and lead vocal are usually
panned center, often referred to as C or 0 by most DAWs. The other elements of the mix are what
the mix engineer typically uses to create a stereo image of your song. Our ears tend to focus on the
signals in a mix that are panned center or panned extreme left or right, while the points in between
are less distinct.

The idea here is that we’re creating an audio picture for our listeners to experience. In some cases
this will mean that, if you close your eyes and hear your mix, you can picture all of the musicians
playing their instruments as if they were positioned on a stage. In other cases, it just means that
you’re trying to create movement and excitement by having newer instruments pop up in your
stereo field for the ear to focus on.

There are no hard-and-fast rules for this, just guidelines, but here are a few tips:

1. Double-tracked guitars: When recording guitars, double-tracking (recording the same part
twice) as well as panning one recording extreme left and the other extreme right can create a
much fuller-sounding mix without overloading the instrumentation of the arrangement.
2. Complementary panning: If there are two instruments in your mix that occupy a similar
frequency range, try panning them opposite of one another. You don’t have to pan them to
the extreme. For instance, a guitar panned slightly to the left will complement a keyboard
panned slightly to the right. This will create a better balance throughout your mix, as the
listener won’t perceive all the instruments to be coming at their ear from exactly the same
position—which can be fatiguing and make it hard to know what the ear should focus on.
3. Snare on or off center: Panning a snare dead center can immediately make it sound
punchier, while panning it slightly to one side might cause the listener to focus slightly more
on the lead vocal or kick drum.
4. Narrow verses, wider choruses: Try keeping a narrower image across your whole mix
during the verses of your songs and then widening that image by panning the elements that
appear in the choruses further away from center. Having certain elements pop out like this,
or even just move temporarily to a more extreme pan setting, will create excitement.
5. Check in mono: Be sure to occasionally listen to your mix in mono to ensure you aren’t
losing too much in the translation. It’s possible to spend a long time panning everything,
only to go too far and realize your mix sounded more impactful before you even began!
6. Consider the club: If you’re mixing any form of electronic music that’s likely to be played
back in a club setting, bear in mind that most playback systems are mono. Having identical
audio signals panned both to the left and the right can cause phase cancellation when the
mix is collapsed to mono, particularly in the low end. You should still mix a nice-sounding
wide mix, but keep checking it in mono to make sure you aren’t losing anything when the
mix is collapsed to mono.
7. Check in headphones: Check your mix in headphones to make sure it doesn’t sound too
disjointed or off balance. Your monitor speakers might be excellent, but since headphones
lack the crosstalk (audio information from the right speaker reaching the left ear and vice
versa), the experience can sound different. Remember, much of your audience might be
listening to music in headphones!
8. Don’t overdo it: Make sure that the elements you pan don’t make the left or right side too
rhythmically busy. For example, when mixing two instruments that occupy a similar higher-
end frequency range, such as an acoustic guitar and a hi-hat, you can pan each instrument to
opposite sides. Since these two instruments are usually playing a similar rhythm (8th or 16th
notes), keeping them opposite of each other maintains a similar timbre and rhythmic feel in
both speakers. Panning a lot of rhythmic elements to one side could be quite distracting.
9. Vintage vibe: With that said, sometimes older recordings, or modern recordings mixed with
nostalgic, vintage methods, might pan the drums almost all the way to the right, and the bass
opposite on the left. Doing this will require more effort and attention on the part of the
listener, but it can result in interesting textures.
10. Less is more: Sometimes the widest-sounding mixes don’t come from panning everything;
they come from panning just a couple of interesting elements while maintaining a strong,
balanced center. This also tends to translate very well to mono. Try making just one
element of your mix wide and spacious, like doubled guitars, a stereo piano track, or
overheads, and make everything else work around center with careful level setting and
judicious EQ. You’ll be surprised how powerful this can be!
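Several of the tips above come back to mono compatibility, which you can actually quantify: compare the energy of the mono fold-down against the stereo original. A minimal Python sketch:

```python
import math

def mono_fold_loss_db(left, right):
    """Compare stereo energy to the mono fold-down (L+R)/2; a large
    negative number warns of phase cancellation when a club system
    or phone speaker sums the mix to mono."""
    stereo = sum(l * l + r * r for l, r in zip(left, right)) / 2
    mono = sum(((l + r) / 2) ** 2 for l, r in zip(left, right))
    if mono == 0:
        return float("-inf")
    return 10 * math.log10(mono / stereo)
```

Identical channels fold down with no loss (0 dB), while opposite-polarity channels cancel completely, exactly the failure mode to watch for in the low end.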
10 Common Delay Mistakes Mixing Engineers Make
by Nick Messitte, iZotope Contributor July 24, 2018

A little more than a year ago, we began a series on common mistakes in mixing, covering
compression, EQ, and reverb. Today, delay has its turn. This is a list of mistakes meant to educate,
not reprimand. So if you find yourself guilty of some of the following, don’t worry—we all have.
Let’s have at it!

1. Not knowing why you’re using delay in the first place


Delays can provide ambiance. Delays can provide emphasis. Delays can change the perceived
location of the signal, either in front-to-back perception or in panning (the Haas effect). Echoes like
slapbacks can indicate genre. Timed stereo delays can reinforce the groove or help to create a new
feel for the song.

These are all different use-cases for delay, and not knowing which one you’re trying to elicit does
your mix a great disservice, because that way sloppy accidents lie. Know why to delay before
knowing when and how.
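The Haas effect mentioned above, for instance, takes nothing more than a short inter-channel delay: duplicate a mono signal, delay one copy by roughly 5-35 ms, and the image shifts toward the undelayed side. A Python sketch of the idea:

```python
def haas_widen(mono, sample_rate=44100, delay_ms=15.0):
    """Haas-effect panning sketch: pad one channel with leading zeros
    so it arrives delay_ms later; the image pulls toward the earlier
    (left) channel even though levels are identical."""
    d = int(sample_rate * delay_ms / 1000)
    left = list(mono) + [0.0] * d   # undelayed, padded to equal length
    right = [0.0] * d + list(mono)  # delayed copy
    return left, right
```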

2. Putting delay on every track


Similar to putting reverb on every track, overuse of delay is a big offender in the early career of a
mixing engineer—especially when combined with overuse of reverb. Sometimes a track needs to be
dry. That’s fine! It comes back to understanding the basic uses of delay: the simulation of an
acoustic space, the ability to add a sense of direction to a signal, the evocation of a specific genre, or
the fortification/emphasis of a particular rhythm.

This isn’t to say you need to limit your use of delay all that much; Dave Pensado has repeatedly
mentioned how Jaycen Joshua piles on delays that, when the mix playback stops, last for days.
Same goes for STL GLD’s rapper, Moe Pope, who requested that the producer, The Arcitype, add
six bars of delay to his vocals. It’s part of his sound.

3. Badly timed rhythmic delay


The secret to mixing, like the secret to comedy, is (forgive the joke) timing. Delays, which are discrete repeats of the initial signal, are rhythmic by nature, and thus exert real influence over a mix’s timing.

Timing can be used in several ways here. To reinforce the spaciousness of a drum bus, you can use
a stereo delay perfectly aligned to a subdivision of the drum’s tempo, edged ever so slightly into the
mix. Conversely, to stand out more, an element can make use of a delay intentionally unaligned—a
millisecond value of a prime number like 23, 29, 31 or 37 can be particularly helpful in this
endeavor.
But discretion is called for. If using unconventional timing, you must take great care with the exact
value of the delay—you must time it by ear, lest it feel utterly wrong. Likewise, the subdivision of a
timed delay is equally important: a triplet subdivision might, for example, add too much swing to an
otherwise straight tune.
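If you want a concrete starting point before fine-tuning by ear, the common subdivision times fall straight out of the tempo: a quarter note lasts 60,000 / BPM milliseconds, a dotted value is 1.5 times its straight counterpart, and a triplet value is two-thirds of it. Here’s a minimal sketch of that arithmetic; the function name and defaults are my own, not from any particular DAW or plug-in:

```python
def delay_ms(bpm: float, subdivision: float = 0.25, feel: str = "straight") -> float:
    """Delay time in milliseconds for a note subdivision.

    subdivision is the note length as a fraction of a whole note
    (0.25 = quarter note, 0.125 = eighth note).
    """
    quarter = 60_000.0 / bpm               # one quarter note in ms
    ms = quarter * (subdivision / 0.25)    # scale to the requested note value
    if feel == "dotted":
        ms *= 1.5                          # dotted = 1.5x the straight value
    elif feel == "triplet":
        ms *= 2.0 / 3.0                    # triplet = 2/3 of the straight value
    return ms

# At 120 BPM: quarter = 500 ms, eighth = 250 ms, dotted eighth = 375 ms
print(delay_ms(120), delay_ms(120, 0.125), delay_ms(120, 0.125, "dotted"))
```

Treat these numbers as a starting grid, not gospel; as noted above, the final value still has to be judged by ear against the track.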

For example, take a listen to this drum loop:

This is a processed drum loop, taken out of a mix. As you can hear, it has a straight feel. The
straight feel is accentuated in the following example by some stereo delay:

I’ve put the delay higher than I usually would in a mix so you can hear what’s happening. Notice how the whole feel of the drum loop changes if I switch the delay to dotted or triplet repeats on either side.

This is why you have to be quite careful when timing your delays.

4. Delaying leakage by accident


Particularly common in acoustic, live arrangements, this mistake involves not only delaying a sound
source, but also delaying its accompanying ambiance—ambiance that holds the sound of other
instruments. Imagine the drums bleeding into a vocal mic; when it comes time to add delay to that
vocal, the snare is going to echo as well. One must be careful not to induce this kind of effect unintentionally, or all sorts of weird timing issues can compound, muddying the image and otherwise diverting attention from the song.

The way I see it, you have two options for eradicating such issues. The first is smart filtering: EQ the delay send so that only the important frequencies are carried, filtering the offending sound out of the delay. If that’s not possible, you can duck it instead, sidechaining a compressor on the delay keyed to the offending instrument.

However, if a less filtered sound is called for, a bit of RX trickery can go a long way. All you need
is the original track, the offending track that’s causing the bleed, and RX 6; simply use the De-bleed
module to create a mult relatively free of offense, and base your delay off the mult, not as a send,
but as a separately effected track, with delay all the way wet.
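To make the ducking option concrete, here is a toy version of the idea: an envelope follower tracks the key (sidechain) signal and pulls the delay return down whenever the key is loud. This is a rough sketch of the concept only, not any particular compressor’s algorithm, and the threshold, depth, and release values are placeholders:

```python
import numpy as np

def duck_delay(delay_return: np.ndarray, key: np.ndarray,
               threshold: float = 0.1, depth: float = 0.7,
               release: float = 0.999) -> np.ndarray:
    """Attenuate the delay return wherever the key signal is loud.

    A crude one-pole envelope follower: instant attack, slow release.
    depth is the gain reduction applied while the key exceeds threshold.
    """
    out = np.empty_like(delay_return)
    env = 0.0
    for i in range(len(delay_return)):
        env = max(abs(key[i]), env * release)        # track the key's level
        gain = 1.0 - depth if env > threshold else 1.0
        out[i] = delay_return[i] * gain
    return out

# Loud key for 100 samples, then silence: the delay ducks, then recovers.
key = np.concatenate([np.ones(100), np.zeros(5000)])
ret = np.ones_like(key)
ducked = duck_delay(ret, key)
```

In practice you’d reach for your DAW’s sidechain routing rather than roll your own, but the gain-versus-key behavior is the same: the echoes sit down while the offending instrument plays and bloom back up in the gaps.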

5. Not processing delays


Preset diving is the way many of us learn in the digital epoch, and sometimes speed precludes experimentation with individual delay parameters. Still, too many engineers forget that they have the option to process the delay further, with external EQ, modulation, or dynamics processing.

Using EQ on the delay will help it sit in the mix, rather than reinforce bloat. If the delay intrudes on the primary source’s physical space, subtle stereo chorusing on the delay can help the two decoagulate.

The dynamics of a delay are equally important: I’ve gotten lots of mileage out of sidechaining
delays to a snare or kick for a sort of staccato bounce (you can also blend this in with the original
delay to create an even richer tapestry, if the mix calls for it). Sometimes you want the delay to
change depending on the dynamics of a given part, and for that, tools like iZotope’s DDLY are
quite adept.
6. Not automating delays
Sophistication comes from a musical approach to all elements, and for sophistication, nothing beats
the bespoke, hands-on touch of automation. It always surprises me when automating an echo
doesn’t occur to a friend or colleague. Why not speed it up as it fades out, if you want to give it
some scene-stealing character? Why not control its fadeaway to your exact specifications with level
automation? Why not freeze the delay, if that parameter is offered, to create sudden blasts of
interest? Why not manipulate a pre-delay EQ to change the overall timbre of the effect?

Of course, none of this matters if you don’t know why you’re using delay in the first place, but you
don’t want to miss an opportunity if one presents itself.

7. Not making use of manual delay


Automating a delay is great—until it’s too much work for the simple task at hand; who here has
automated a delay for a nice “throw” or single-played repeat? If that’s you, ask yourself the
following:

Why didn’t you just mult the throw to a new track, and then place/pan/EQ/render the phrase to your
exact predilections?

It might seem like too much work at the outset, but in the long run, manual delays can be a much
faster way to your end goal, and can even save on CPU. Check out how I processed a throw
manually in its own dedicated track with some Neutron EQ, followed by a little stock plug-in delay:

Manual Delay Throw Example

8. Stereo delays that flam in mono


Raise your hand if you’ve ever put on your DAW’s digital delay to create a sense of width—in
effect, to create a stereo track out of one element.

I don’t know why you fell for this gag, because now you’re raising your hand while reading an
article, and I don’t even know where you are. But I do know why you fell for the stereo delay trick:
it simply works.

Until you monitor in mono, that is; then you hear the unnatural delay slap, like bad flutter echo or comb filtering. What was full and lush suddenly sounds like bad acoustics.
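The comb filtering is easy to demonstrate numerically: when one channel carries a copy delayed by τ seconds, summing to mono cancels any frequency at odd multiples of 1/(2τ). A sketch with made-up values (a 10 ms Haas-style delay, so the first null falls at 50 Hz):

```python
import numpy as np

sr = 48_000                                 # sample rate (assumed)
tau = 0.010                                 # 10 ms delay on one channel
null_hz = 1.0 / (2.0 * tau)                 # first comb null: 50 Hz

t = np.arange(sr) / sr                      # one second of time
left = np.sin(2 * np.pi * null_hz * t)
right = np.sin(2 * np.pi * null_hz * (t - tau))  # same tone, delayed copy
mono = 0.5 * (left + right)                 # mono fold-down

# At 50 Hz the delayed copy arrives exactly out of phase:
# the mono sum cancels almost completely.
print(np.max(np.abs(mono)))                 # prints a value near zero
```

Every odd multiple of that null frequency (150 Hz, 250 Hz, and so on) cancels the same way, which is exactly the hollow, flutter-like character you hear when such a track folds to mono.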

What’s the fix here? Not using a stereo delay for panning? No! A bit of manipulation on one of the sides can do the trick. If it’s a melody, I may try a subtle pitch shifter that modulates just enough; done right, it won’t call attention to itself in the mix, and it won’t have that horrid flam effect in mono.

If I’m delaying a mono percussive element such as a drum loop, there’s no better tool in my box
than Neutron 2’s Transient Shaper, which allows me to manipulate the dynamic response of
percussion on a multiband basis; maybe a bit more sustain on the snare, or a bit more attack on the
kick, will avert the dreaded doubling.
9. Not putting delay on a reverb (or other effect)
Sometimes the biggest mistake lies not in how delay is used, but in how it is not used. A good example might be a ballad with a spacious lead vocal. Often, reverb alone is not enough, because it clings to the centered vocal and crowds the same spatial position. Here a little stereo delay can go a long way toward clearing out the middle, so the vocal can comfortably sit in all that space.

10. Not using delays to create ambiance


This error can often befall people working in sound design, but it also extends to music mixers
looking to evoke real-sounding spaces. The most concrete indicator of a space is the quality—and
location—of its reflections. To rob an important source of this indicator is to deprive it of the
vitality it could otherwise display. You could think of it in these metaphorical terms: The delay
provides the fundamental to the reverb’s harmonics.

To make sure you’re always attuned to the way sound delays in certain spaces, there’s no better
practice than investigating these places with your own ears. Go into churches of various sizes and
listen to the choirs. Hear the vocals as they balance and take note of how they bounce around.
Repeat this exercise with living rooms, bathrooms, barrooms, the great outdoors, and anywhere else you can think of. Pen and paper, as I’m quick to say, go a long way here toward remembering your determinations. You’ll find that when you have a specific sonic image of these
spaces engraved in your mind, setting delays will be much more gratifying—and take much less
time.

Conclusion
Doubtless, there are more mistakes I could enumerate here, but this covers the basic gamut. It is our
hope that thus educated, you can level up without delay.

You might also like