I don't know if you should mix into a mastering chain, but I do. And I know this idea might offend people. I'll share my mastering chain methodology. You can decide if it appeals to you.

Almost nobody will ever hear an unmastered mix. The audience only ever hears the finished product. Mastering will flatten out a mix balance, smashing transients and bringing low-level information forward. So why should I wait until a song is mastered to find out if my mix still translates after it has been squashed in mastering? Mixing without mastering-chain processing gives me an incomplete picture. I already mix into compression on the mix bus for a bit of glue, but for a more complete picture I send that audio into a basic mastering chain. I reference commercial tracks and adjust the mastering compression and limiting as needed. This gives me an approximation of what a mastering engineer will do and helps me make better-informed mix decisions. Of course, if I send the mix to be mastered elsewhere, I will remove my master bus processing.

We have the opportunity to use simulations of proven tools in new ways, and this can be awesome. Tools that were invented 20 to 50 years ago in the analog domain often had limitations, not because they were best for the job, but because they were expensive or difficult to produce with more (or different) features. In the digital realm we can have unlimited bands of infinitely variable EQ. The only limitations in creating tools are our imagination, programming skills, and CPU horsepower. So why are we working with simulations of tools that were extremely limited in the analog domain? In part, it's because some engineers are used to them in their workflow. In part, it's because the audio community is nostalgic and wants to reproduce the sounds that succeeded in the past. In part, it's because too many choices can zap our productivity.

I am now avoiding EQ on the master bus. If there's a problem with the EQ, I fix it in the mix. That said, sometimes you just want to make a subtle tweak to the mix. If I ever want EQ on the master bus, I'll typically use the IK Multimedia Master EQ 432. It's not because of the stepped controls. Those are great in the analog world, but I prefer continuous controls in the digital realm. In this case, the stepped controls are fine. I really could use any modern EQ, but I admit I'm a sucker for skeuomorphism. Also, the 432 EQ offers a tiny bit of harmonic color, and the Q controls, with their five narrow bandwidth options, are limiting in a good way. Only the high and low shelf controls are broadband. The limitations are based on a popular design used in mastering studios, and they lead you to adjust things in a subtle way. If I ever needed more flexibility, such as for mastering a two-track mix, I might use the Nova EQ or even the stock Channel EQ in Logic.

Finally, I have Audified's MixChecker. I flip through the presets occasionally as I'm mixing.

With the LUFS target set in MLoudnessAnalyzer, I open Magic A/B and adjust the level of the reference tracks so they are all hitting the same LUFS target. This is an approximation, since LUFS readings vary quite a bit in most tracks. Since I have already adjusted the mix compression, mastering compression, and mastering limiter to sound best for my mix, I don't want to change any of these settings just to hit a LUFS target. If I have to turn the volume up, I'll use the peak limiter in Limiter No. 6 to avoid clipping, but if I have to turn my master down, I will adjust the Gain plugin after the master limiter. This way I don't alter the master processing. Now I can hear what Apple, Spotify, and YouTube are going to do to my track in a playlist, and I can compare my master against commercial masters at the same LUFS level.
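The loudness matching described above comes down to simple arithmetic: the gain offset you need is just the difference between the target and the measured integrated loudness, and that dB offset converts to a linear gain factor. Here is a minimal Python sketch of that math (the function name and example numbers are my own illustration; actually measuring true LUFS requires K-weighted gating per ITU-R BS.1770, which is the job of a meter like MLoudnessAnalyzer, not this snippet):

```python
def gain_to_target(measured_lufs, target_lufs):
    """Return (offset in dB, linear gain factor) needed to move a track
    from its measured integrated loudness to a target loudness.

    Illustrative only: this assumes the LUFS values were already
    measured by a proper BS.1770 loudness meter.
    """
    offset_db = target_lufs - measured_lufs
    linear_gain = 10 ** (offset_db / 20)  # dB-to-amplitude conversion
    return offset_db, linear_gain

# Example: a reference master measured at -9.5 LUFS, matched to a
# -14 LUFS target (roughly where streaming playlists normalize to).
offset_db, gain = gain_to_target(-9.5, -14.0)
print(round(offset_db, 1))  # → -4.5  (turn the reference down 4.5 dB)
print(round(gain, 3))       # → 0.596 (the equivalent linear gain)
```

A positive offset means the track needs to come up (the "peak limiter to avoid clipping" case), and a negative offset means it comes down (the "Gain plugin after the limiter" case), which is why the two situations are handled differently in the chain above.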