How To Use Audio Effects (Plugins)

Overview

Audio effects are some of the most essential tools for sculpting and designing sounds. They can be split into hardware, also referred to as outboard gear (compressors, delay units, guitar pedals, etc.), and software, also known as plugins (VST being a common format), which can be loaded into your DAW.

If this is new to you, I recommend reading this article from the eMastered Audio blog, which gives a great overview of the basics: https://emastered.com/blog/audio-effects-explained

Before you proceed to the next topic, a quick reminder: there are more plugins out there than you could use in a lifetime. So ignore the noise and focus on learning the ones that are native to your DAW. They are generally really good quality, and you can get excellent results using them. Then slowly expand your toolbox in the areas you want to invest time in.

Now let's dive into how to use each type of effect effectively.

Types Of Audio Effects & How To Use Them

Equalizers

Commonly shortened to EQ, an equalizer allows us to adjust the frequency content of sounds by boosting or cutting portions of the frequency spectrum.

Here is a great video from Aftertouch Audio explaining what an EQ does, and how to use one in more depth: https://youtu.be/RMLxzQr-08E
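To make the idea concrete, here is a minimal sketch of a single peaking EQ band in Python - assuming numpy and scipy are available, with coefficient formulas from the widely used RBJ "Audio EQ Cookbook". The function name and parameters are illustrative, not any particular plugin's implementation:

```python
import numpy as np
from scipy.signal import lfilter

def peaking_eq(signal, fs, freq, gain_db, q=1.0):
    """Boost or cut a band of frequencies centred on `freq` (RBJ biquad)."""
    a = 10 ** (gain_db / 40)                 # linear amplitude from dB gain
    w0 = 2 * np.pi * freq / fs               # centre frequency in radians/sample
    alpha = np.sin(w0) / (2 * q)             # bandwidth term
    b = [1 + alpha * a, -2 * np.cos(w0), 1 - alpha * a]       # feed-forward coeffs
    a_coef = [1 + alpha / a, -2 * np.cos(w0), 1 - alpha / a]  # feedback coeffs
    return lfilter(b, a_coef, signal)

# Example: cut 4 dB of boxiness around 300 Hz in one second of noise
fs = 44100
noise = np.random.randn(fs)
cleaned = peaking_eq(noise, fs, freq=300, gain_db=-4.0, q=1.4)
```

A full EQ plugin is essentially several of these bands in series, each with its own frequency, gain, and bandwidth.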

Compressors & Limiters

Compressors and limiters are dynamics tools that allow us to reduce the difference in volume between the quietest and loudest parts of a signal - making quieter sounds louder, and louder sounds quieter.

This can be used to bring out the small details in a sound or make sure that the transients (the short but very loud parts of a sound, like the crack of a snare drum) aren’t overpowering the listener.

Additionally, they can be used to gently reduce volume changes in a signal over a longer period of time, and give a more consistent overall loudness - this kind of use case is very common in dialogue processing.

A limiter is a type of compressor that acts much more aggressively, effectively stopping a sound from getting louder than a specific threshold. This helps prevent the sound from clipping, which causes digital distortion and artifacts.
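For the technically curious, here is a minimal sketch of the gain logic described above, assuming Python with numpy (the names and parameter values are illustrative, not how any particular plugin works internally):

```python
import numpy as np

def compress(signal, fs, threshold_db=-18.0, ratio=4.0,
             attack_ms=5.0, release_ms=100.0):
    """Reduce levels above the threshold according to the ratio (e.g. 4:1)."""
    attack = np.exp(-1.0 / (fs * attack_ms / 1000))    # fast-reacting coefficient
    release = np.exp(-1.0 / (fs * release_ms / 1000))  # slow-recovering coefficient
    env, out = 0.0, np.zeros(len(signal))
    for i, x in enumerate(np.asarray(signal, dtype=float)):
        # Envelope follower: track the signal level smoothly over time
        coeff = attack if abs(x) > env else release
        env = coeff * env + (1 - coeff) * abs(x)
        level_db = 20 * np.log10(max(env, 1e-9))
        # Only attenuate the portion of the level above the threshold
        over_db = max(level_db - threshold_db, 0.0)
        gain_db = -over_db * (1 - 1 / ratio)
        out[i] = x * 10 ** (gain_db / 20)
    return out
```

Setting the ratio very high (say 100:1) turns this into a crude limiter: almost nothing gets past the threshold.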

Check out these great videos and articles for some more technical explanations and examples:

Audio Compression Basics by Universal Audio:
https://www.uaudio.de/blog/audio-compression-basics/

No-Nonsense Guide: How To Use Compression by Aftertouch Audio:
https://youtu.be/7a-SJrfBu68

Limiters & How To Use Them by iZotope:
https://www.izotope.com/en/learn/an-introduction-to-limiters-and-how-to-use-them.html

Reverb & Delay

Reverb and delay are closely related effects, designed to add depth to a sound and replicate its behaviour in a physical space.

A delay works by duplicating a signal and playing it back after a specified amount of time, creating a distinct repeating echo of the original signal.

A reverb is a more complex combination of delays and other processes that creates a more sustained, indistinct sound, based on the input signal.

They can both also be used to achieve some crazy unnatural effects when pushed to their extremes, making them invaluable creative tools.
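As a rough illustration of the delay described above, here is a minimal feedback delay in Python (numpy assumed; the function name and default values are just for the sketch):

```python
import numpy as np

def feedback_delay(signal, fs, delay_ms=350.0, feedback=0.4, mix=0.5):
    """Play delayed copies of the signal on top of itself; each repeat is
    fed back into the delay line, so the echoes decay gradually."""
    d = int(fs * delay_ms / 1000)          # delay time in samples
    dry = np.asarray(signal, dtype=float)
    wet = dry.copy()
    for i in range(d, len(wet)):
        wet[i] += wet[i - d] * feedback    # echoes of echoes (feedback < 1 decays)
    return (1 - mix) * dry + mix * wet
```

A reverb is, conceptually, many short delays like this running in parallel and series, with filtering and diffusion smearing them into one continuous tail.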

This article is a great comparison of the two, and explains a lot of the terminology associated with delay and reverb nicely: https://www.musicianonamission.com/reverb-vs-delay/

Additionally, here is a fantastic video explaining how to use reverb by Aftertouch Audio:
https://youtu.be/a3SssOrB7z0

In Game Audio

Reverb in particular is incredibly important in game audio as a means of maintaining player immersion, ensuring that sounds feel like they’re happening in the environment shown on screen.

The sound files themselves will often not have reverb applied to them; instead, the reverb will be processed in real time by the engine, which avoids the need for a different version of the same sound for each unique environment in the game.

Delays might also be added to a sound by a game’s engine, but usually in more specific situations and environments, such as a canyon or a cave.

Pitch, Frequency & Formant Shifters

In a nutshell, we can make sounds higher or lower using pitch or frequency shifting. This is very useful for making sounds feel bigger or smaller, and it has a plethora of other creative uses.

For example, you might shift the pitch of an impact sound down to make the impact feel heavier. Formants are resonant frequencies that are naturally emphasised by the shape and size of the object producing a sound, such as an animal’s voice box and vocal cords. Instead of pitch shifting an entire signal, a formant shifter changes just these resonant frequencies. This can be useful in situations where you want to change the ‘voice’ of a sound without affecting its overall tonal balance or other frequency content.
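Here is a very naive pitch shifter in Python to illustrate the core idea (numpy assumed; real pitch shifters use granular or phase-vocoder techniques so that the duration doesn't change, which this sketch does not do):

```python
import numpy as np

def naive_pitch_shift(signal, semitones):
    """Resample the signal to change its pitch.
    Shifting up shortens the sound and shifting down stretches it -
    proper pitch shifters compensate for this, this sketch doesn't."""
    ratio = 2 ** (semitones / 12)            # +12 semitones = one octave = 2x speed
    positions = np.arange(0, len(signal) - 1, ratio)
    return np.interp(positions, np.arange(len(signal)), signal)

# e.g. drop an impact 7 semitones to make it feel heavier:
# heavy_impact = naive_pitch_shift(impact, -7)
```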

For a more detailed breakdown, check out this video on pitch shifting from the developers of FL Studio: https://youtu.be/Zm7KrENPIv0

Distortion & Saturation

Distortion (also known as saturation) is the process of altering a signal’s shape to add harmonics, usually by ‘driving’ the signal into a device or algorithm that amplifies it non-linearly (meaning the amount of gain applied changes with the input level, rather than scaling the whole signal uniformly).

The most common example of distortion is the guitar amplifier - the circuit tries to amplify the signal but runs out of headroom; the quieter parts of the signal keep getting louder, but the louder parts cannot be made any louder and so begin to distort as the amplifier tries to push them further and further.

As with a guitar amp, this process can be used to create harsh and aggressive sounds, but it can also be used more subtly, to add warmth and character to an otherwise dull sound.
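The non-linear behaviour described above can be sketched in a couple of lines of Python (numpy assumed; tanh is just one of many possible waveshaping curves):

```python
import numpy as np

def saturate(signal, drive=4.0):
    """Soft-clip with tanh: quiet parts pass through almost unchanged,
    loud parts get squashed, adding harmonics in the process."""
    return np.tanh(np.asarray(signal) * drive) / np.tanh(drive)

# Low drive (1-2) adds gentle warmth; high drive (10+) gets aggressive.
```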

Tyler from Aftertouch Audio has put together a brilliant video on the topic explaining it in detail:
https://youtu.be/3155ZG2sojs

Further Reading

The terms saturation and distortion are often used interchangeably, but while the two processes are very similar, they aren’t identical.

This distinction isn’t likely to matter in most situations, but if you’re interested in the technical details of audio processing, this article by Sage Audio explains the differences very well:
https://www.sageaudio.com/blog/mixing/whats-the-difference-between-distortion-and-saturation.php

Modulation (Phaser, Flanger & Chorus)

Modulation is a catch-all term for effects that alter certain properties of a signal in a regular, usually repeating, pattern.

Commonly modulated properties include pitch, phase, and stereo pan position - sometimes with multiple parameters modulated at once - and these effects are usually broken into sub-types such as chorus, phaser, flanger, and others. Because they are so similar to one another, they sit in the same category. They are especially useful for sci-fi sound effects, as well as for creating watery textures and whooshing/sweeping noises - but of course, the possibilities are endless.
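As an example of "altering a property in a repeating pattern", here is a minimal flanger in Python - a short delay whose length is swept up and down by a low-frequency oscillator (LFO). Numpy is assumed and the parameter values are illustrative:

```python
import numpy as np

def flanger(signal, fs, rate_hz=0.5, depth_ms=2.0, base_ms=1.0, mix=0.5):
    """Mix the signal with a copy delayed by a few milliseconds,
    sweeping the delay time up and down with a sine-wave LFO."""
    dry = np.asarray(signal, dtype=float)
    n = np.arange(len(dry))
    lfo = (np.sin(2 * np.pi * rate_hz * n / fs) + 1) / 2      # sweep from 0 to 1
    delay = (base_ms + depth_ms * lfo) * fs / 1000            # delay in samples
    read_pos = np.maximum(n - delay, 0)                       # where to read from
    wet = np.interp(read_pos, n, dry)                         # fractional-sample read
    return (1 - mix) * dry + mix * wet
```

A chorus is essentially the same idea with longer delays (around 20-30 ms), while a phaser sweeps a chain of all-pass filters instead of a delay line.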

This excellent video explains the processes in more detail and how they differ from each other:
https://youtu.be/4vRleMQdZZU

Vocoders

Vocoding is a synthesis/modulation effect that works by combining a modulator signal (an audio signal like the voice) with a carrier signal (a saw wave, for example): the frequency content of the modulator is analysed and imposed onto the carrier, creating a new signal as the two interact.

The vocoder is perhaps most strongly associated with the classic ‘robot voice’, and it is of course a useful tool for dialogue sound design, but vocoders can be used with any form of audio to create unique new sounds.
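To make the carrier/modulator relationship concrete, here is a minimal channel vocoder sketch in Python (numpy and scipy assumed, a 44.1 kHz sample rate, and illustrative names throughout):

```python
import numpy as np
from scipy.signal import butter, sosfilt

def vocode(modulator, carrier, fs, n_bands=16):
    """Split both signals into frequency bands, measure the modulator's
    loudness in each band, and use it to control the carrier's level there."""
    edges = np.geomspace(80, 8000, n_bands + 1)        # log-spaced band edges
    length = min(len(modulator), len(carrier))
    out = np.zeros(length)
    env_sos = butter(1, 50, btype="low", fs=fs, output="sos")  # envelope smoother
    for lo, hi in zip(edges[:-1], edges[1:]):
        sos = butter(2, [lo, hi], btype="band", fs=fs, output="sos")
        mod_band = sosfilt(sos, modulator)[:length]
        car_band = sosfilt(sos, carrier)[:length]
        envelope = sosfilt(env_sos, np.abs(mod_band))  # rectify, then low-pass
        out += car_band * envelope                     # carrier shaped by modulator
    return out / np.max(np.abs(out))                   # normalise to avoid clipping

# e.g. robot_voice = vocode(dialogue, sawtooth_drone, fs=44100)
```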

Vocoders are strange effects that benefit from experimentation, so I’d recommend watching this video by Tyler from Aftertouch Audio and then loading up a vocoder and playing around with it yourself:
https://youtu.be/UPzFoAbwagY

 

Want to learn more about game audio?
Check out our Learning Roadmap!
