It was a random comment made at the end of last fall’s guitar lesson. My teacher has been playing music for a long time and his band has had record deals with both big labels and independent labels. He loves the analog, the human, the vintage, and the imperfect. He still likes to mix songs with an old analog board, and he has had the same “mastering guy” finish the band’s albums for a long time, so that didn’t surprise me.
I was surprised when he said that he had been trying LANDR, an online music service that offers AI-powered mastering, for a while. You pay a monthly fee and upload a well-mixed track. Within a minute, the system sends back a song that meets current loudness standards and is pumped up with extra processing for clarity, EQ, stereo width, and dynamics.
The Art of Subtlety: Why Mastering Demands Human Expertise
When mastering is done well, the music should sound “more like a record.” But this part of the process has always demanded careful value judgments, since each change affects the entire stereo track. And trained humans are very good at making subtle value judgments.
I expected a line about how human ears and hands still held a slight edge over our machine masters. Instead, I heard: “In the last year, LANDR has improved so much that it now sounds as good as, or in some cases better than, things we’ve mastered professionally.”
AI-powered mastering systems let you revise endlessly, while mastering professionals usually allow only a set number of revisions. They return results in a minute, while human mastering engineers might take up to a week. And mastering engineers can charge anywhere from $30 to several hundred dollars per track, depending on their experience. Now the machines can sound better than the people? Coming from someone who won’t buy a guitar made after 1975, this was high praise.
I marked the comment as something I should look into later.
A few weeks after our conversation, Apple released version 10.8 of Logic Pro, its flagship digital audio workstation (DAW) and GarageBand’s big sibling. The update added Mastering Assistant, Apple’s take on AI-powered mastering. If you used Logic, you suddenly had the feature for free, on your laptop, desktop, or iPad.
In other words, both my guitar teacher and Apple seemed to agree that AI-powered mastering had become good enough to take seriously. So I had to give it a shot.
In 2023, I taught myself to track and mix music in a small home studio I built for less than a thousand dollars. Gear today is astonishingly cheap and high quality compared to the old analog era. What surprised me was how detailed the process is. For months, I didn’t know my 1176 from my LA-2A, or my dBFS from my LUFS. Getting mic placement right took plenty of trial and error, especially for complicated acoustic instruments like guitars.
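The dBFS-versus-LUFS confusion is worth a concrete illustration. A rough sketch (the variable names are mine, and RMS here is only a stand-in for LUFS, which adds K-weighting and gating per ITU-R BS.1770): the same signal gets two different “levels” depending on whether you measure its loudest sample or its average energy.

```python
import math

# One second of a 440 Hz sine tone at half of full scale.
sr = 44100
samples = [0.5 * math.sin(2 * math.pi * 440 * n / sr) for n in range(sr)]

# Peak dBFS: how close the single loudest sample gets to full scale (1.0).
peak = max(abs(s) for s in samples)
peak_dbfs = 20 * math.log10(peak)  # about -6.0 dBFS

# Average (RMS) level: closer in spirit to what LUFS measures,
# minus the frequency weighting and gating of real loudness meters.
rms = math.sqrt(sum(s * s for s in samples) / len(samples))
rms_dbfs = 20 * math.log10(rms)  # about -9.0 dBFS
```

Same tone, two numbers roughly 3 dB apart, and neither is the LUFS value a streaming service would report; that gap is exactly why beginners (me included) find level metering confusing.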
I started writing songs while I was in this engineering boot camp because I had a long-held desire to put on tight leather pants and bring real rock ‘n’ roll back from the dead. Some of my compositions also made me happy, which doesn’t always happen when you’re making art. The words were clever, and the tunes were easy to hum. I started to record.
From Okay to Amazing: Why “Radio-Pro Sound” Feels Out of Reach
But there was something in the way. Even after I learned the right techniques and my tracks went from “meh” to “now we’re talking,” they never quite had that radio-pro sound. How do you get it? People talked about “mastering” as if it were a secret art, as if some magic would “glue” your track together and give it a high-gloss finish. They said only experienced mastering engineers could perform this magic, which involved mastering compressors and EQs almost too complicated to understand.
This turned out to be a caricature of the truth, but it wasn’t entirely wrong. If someone handed you a mixed track and a mastering compressor like the one below, could you start turning knobs and be confident you were improving the sound? What about adjusting the track’s overall EQ balance? Adding a little tape coloration? Widening the stereo field without it sounding like a gimmick? Taming sibilance, or applying some other kind of dynamic compression? And what about all the technical details of limiting and loudness maximization needed to meet the major streaming services’ loudness standards?
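To make the loudness-standards piece concrete, here is a minimal sketch of what “normalize and limit” means, assuming a commonly cited streaming target of around -14 LUFS and using plain RMS in place of true LUFS metering. The function names are mine, and real mastering limiters use lookahead and smooth gain reduction rather than the hard clip shown here.

```python
import math

def rms_dbfs(x):
    """Average (RMS) level in dBFS, where full scale = 1.0."""
    return 20 * math.log10(math.sqrt(sum(s * s for s in x) / len(x)))

def master_to_target(x, target_dbfs=-14.0, ceiling=0.985):
    """Apply one static gain so the RMS level hits the target, then
    hard-clip any peaks at the ceiling -- the crudest possible stand-in
    for a real loudness-normalization-plus-limiting chain."""
    gain = 10 ** ((target_dbfs - rms_dbfs(x)) / 20)
    return [max(-ceiling, min(ceiling, s * gain)) for s in x]

# A quiet 440 Hz test tone, roughly -26 dBFS RMS.
sr = 44100
tone = [0.07 * math.sin(2 * math.pi * 440 * n / sr) for n in range(sr)]

louder = master_to_target(tone)
```

Even this toy version shows the tradeoff at the heart of the craft: push the gain far enough and the ceiling starts shaving off peaks, which is where the audible judgment calls begin.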
But I wasn’t sure I could get the sound I wanted without years more of ear training and practice. I was decent at tracking and mixing, but polishing felt like one task too many. So I had always planned to hire a mastering engineer once I had enough songs for an album. Now, though, some tools claimed they could save me time and money, demystify the “dark art,” and make me sound more like a real artist. Did they actually work?
To those who are about to rock
I tested these tools on a song of my own. It borrows its chord progression from Def Leppard’s acoustic ballad “Miss You in a Heartbeat” (whose video features some fantastic design choices). The guitar arpeggios were fun to play. I performed the bass parts and built the drum sections with Logic’s excellent Drummer tool. I chewed a pencil while staring at a blank page, and a story began to take shape.
It was about a likable loser who sings late-1980s hair metal, the best kind of karaoke, in front of a crowd of rowdy country music fans. He does all of this while wearing wigs and leather like an extra from Rock of Ages. Will the storm of angry boos push him off the stage? Or can he, for one night, end the eternal war between classic rock and modern country?
With the song written, it was time to record it with my producer (me) and my engineer (me again). This one was going to be a classic. I hung thick blankets and quilts on the walls of my spare-room studio to deaden the sound before recording vocals and acoustic instruments. I shut off the heat in my house to eliminate background noise and shivered my way through bass, guitar, keyboard, and vocal performances, which I then comped into single tracks.
All keyboard parts were played on a MIDI keyboard driving Logic’s built-in instruments. Vocals and acoustic instruments were recorded through a $180 Universal Audio Volt 2 interface and a $150 Audio-Technica AT2035 mic. That’s a cheap microphone, though it still costs more than the Audio-Technica that Billie Eilish used on her debut. Since I can’t really sing, I used Melodyne for pitch correction. I mixed in Logic, mostly with built-in instruments and effects plus Universal Audio compressor and tape-saturation plugins. Only after I was happy with the mix did the AI get involved.
Insiders will use mastering specialists
This wasn’t a test of what these tools could do for a $300,000 major-label project recorded at a big studio. That world is fading, and only a few people ever had access to it anyway. Those still inside it will keep using mastering specialists.
Nor was the goal to crown the “best” AI mastering service. A ranking wouldn’t mean much, since results depend on the style and settings you choose for the AI as well as the quality and genre of the track you submit.
The goal was to see whether these tools could serve as a “sonic accelerant” for the hobbyist, the amateur, the laptop producer, the opening act, the indie label, the famous artist making demos in a home studio, or the mix engineer with five minutes to send a band home with something they can play in the car.
More personally, I wanted to see whether, with minimal equipment in my home studio, I could make something that sounded like the love child of a Mountain Goats song and a Fountains of Wayne song. Would it sound “like a record”? And would it sound good enough that someone might listen to it twice of their own free will?
Just below is a version of my song “Hair Metal Renaissance” polished by AI mastering. Listen if you like, and then we’ll dig into the tech. At the end of this piece, I’ll reveal which tool mastered this version and why I chose it.
AI Is Everywhere
AI-powered mastering options abound. In addition to Apple and LANDR, audio plugin maker Waves offers an online AI mastering service. So, of course, does Germany’s Brainworx. Master Channel has one. So does Maastr. BandLab’s is free and functional, though light on options. And Boston-based iZotope makes Ozone, a pricey but sophisticated mastering suite that can generate impressive AI masters inside your DAW with one click.
Like so much AI work, this is controversial. Is it okay to train a machine on EQ curves, vocal levels, and dynamic ranges crafted by human engineers, then build a product that may put some of those people out of work? Then again, isn’t every apprentice “learning from humans” while studying the craft? Bedroom studios and independent artists clearly benefit from cheap, fast, and (at least) decent AI masters. Mastering engineers may not.
The companies behind these tools, naturally, argue that they might not be so bad for human audio engineers after all.
In a blog post, BandLab admits that “AI cannot fully replace a human mastering engineer. It doesn’t let you put your record in order, find mistakes like clicks and pops in the music, label tracks with metadata, or export them in the formats needed by different distributors. It also can’t run a lathe for making vinyl records…” Still, if you give it the right mix, it can do a good job on a single song, and it’s a great way for both mixing and mastering artists to learn.
iZotope strikes a similar note. In its 27,000-word (!) Ozone manual, the company says its AI mastering tool can give “beginners the confidence to share your work with the world” and “experts a valuable second opinion and a faster setup.”