Mixing Discussion

@BDolzani if you seriously investigate Reaper, it will astonish you. Dan Worrall uses it because it does things that no other DAW does. One thing I love about Reaper is that a track is a track: it doesn't matter whether it contains audio, MIDI or even video. You can edit video in Reaper and take advantage of the audio capabilities of a DAW.

https://stash.reaper.fm/
 
another studio friend was saying bite the bullet and just get the industry standard PT to share sessions etc if I need to (with any/most studios say for mixing).
'Industry standard'. That is clever ProTools marketing talk - if enough people repeat those words, they keep the myth alive. Over the last five years or so I've seen and read about more and more professional studios and mastering engineers who (sometimes hesitantly) reveal their use of Cubase, Nuendo, Ableton or Reaper, and sometimes even state publicly that they did so to get away from Avid. ProTools is always late getting up to speed in DAW-land: it was one of the last, for example, to use floating-point maths for mixing, and even today its MIDI handling isn't up to scratch.

And I've collaborated on many projects over the years - and most of the time I didn't even know what DAW or recorder the other parties were using. Simply sending .wav files always works!
 
When I hear someone tell me 'PT is the industry standard', I immediately think 'hmmm, you have no skills but think that using PT will get you up to standard'.
 
'Industry standard'. That is clever ProTools marketing talk...

You are dead on target, Arjan. PT was one of the first and, for many, the only real choice. I myself used it to edit a single in the 90s, when "hard disk recording" (as it was called back then) was just becoming viable. The only other choice was Cool Edit Pro. So that's probably the basis of their position, but if I had to pick a DAW to call "industry standard" it would actually be two: Cubase and Logic.

And, as you said, audio is audio.
 
In defense of my friend, the last thing he wants is to use digital, so I think he's coming from a 'jump in quick, try it and forget about it' mindset. My profile pic is the doorway of his old studio, where we recorded 100% to tape in 2016. He reluctantly uses PT, as do many people. I'm fine with using indie tools for my very, very small setup, so I'll start with/stay with Reaper for now - or forever, who knows.
 
Thought y'all might find this interesting. Some good tips to be had, IMO.
This is free on YouTube, excerpted from the premium ($$) "Mix with the Masters" website.

Smoke Break - Mixing Carrie Underwood's Vocal - Chris Lord-Alge.
 
Learned a bunch.
Notice how CLA worked the EQ and FX, and had to look back at the board to see what the final settings were? CLA's video is the perfect example of what we mean when we tell people "use your ears" to "dial in" compression, EQ, and FX. The only "correct" setting is the one that works.

Mixing should never be a paint-by-numbers process. ;)
 
reluctantly uses PT as do many people
Got a chuckle out of this post…sounds like ME. I’d just as soon have a chain-saw pedicure as devote my musical efforts to PT or anything like it…I’m strictly all purpose-built hardware, old-school, home-based, hobbyist/amateur level, & am content that my gear setup and ability/talent levels reflect/reveal it.
Besides, it’s not like anyone actually listens to my recordings…I have songs posted 10 years ago that haven’t hit triple digits in views/listens. Whut-eva!!!
When I’m a pro like you guyz (perhaps in an alternate universe or future lifetime) I’ll probably consider it!!!:rolleyes:
 
I have songs posted 10 years ago that haven’t hit triple digits in views/listens

Links?
 
Notice how CLA had to look back at the board to see what the final settings were? ... Mixing should never be a paint-by-numbers process.

And this leads me to my preference for eyes-closed mixing on a physical console like the DM-4800. Having a knob or meter in a fixed physical location, and being able to reach over, adjust and confirm the sound without flipping through menus, is important for keeping my focus on my ears rather than my eyes.
 
One of the advantages of DAWs is being able to see the ‘audio frequencies’ of each track. That allows the user to adjust EQ etc. accordingly, so each instrument can be heard as clearly/separately as possible, thus helping to avoid muddy mixes.

I don’t have a DAW or want one, so without knowing what the frequencies are on each instrument, mixing effectively would likely depend on me having a wealth of mixing experience and/or excellent, fine-tuned hearing (mine is just about average).

Although I’ve done a lot of recording over the years, my experience is limited to playing and singing, rather than mixing or engineering.

I find that without a visual guide to the frequencies of each instrument, I don’t know which ones need EQ adjustment to be heard clearly in the mix. I do know which instruments need to be panned centrally - vocal, kick drum, snare, bass guitar - and that, for example, two rhythm guitars can be panned hard right and left, and where other instruments are usually placed on the panning spectrum.

However, try as I might, I’m still likely to get muddy mixes, as I can’t easily tell which instruments are operating on similar frequencies. (For instance, some keyboard ‘tones’ might have similar frequencies to electric or acoustic guitars.)

I’d rather not have someone else mix my music, so I’d like to know if there’s an app or computer program (not a DAW) that can tell me about the audio frequencies on every recorded track of a song. If so, then I can apply EQ adjustment and reverb/delay myself where needed and obtain clearer results with better separation.

(I apologise for not knowing all the correct technical terms, but I hope you get the gist of what I’m saying!)

Can anyone help please? Thanks!
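
For anyone comfortable with a little scripting, here is a minimal sketch of one way to get exactly this kind of per-track frequency readout without a DAW. It assumes Python with numpy, scipy and matplotlib installed; "track.wav" is just a placeholder filename.

import numpy as np
from scipy.io import wavfile
import matplotlib.pyplot as plt

# wavfile.read returns the sample rate (Hz) and the raw samples.
rate, data = wavfile.read("track.wav")
if data.ndim > 1:                        # stereo/multichannel: fold to mono
    data = data.mean(axis=1)
data = data / np.max(np.abs(data))       # normalise to the -1..1 range

# Magnitude spectrum of the whole track, in dB, on a log frequency axis
# so it reads like the front panel of a graphic EQ.
spectrum = np.abs(np.fft.rfft(data))
freqs = np.fft.rfftfreq(len(data), d=1.0 / rate)
level_db = 20 * np.log10(spectrum + 1e-12)   # tiny offset avoids log(0)

plt.semilogx(freqs, level_db)
plt.xlim(20, 20000)
plt.xlabel("Frequency (Hz)")
plt.ylabel("Level (dB)")
plt.title("Where this track sits in the spectrum")
plt.show()

Run it once per exported track and it shows at a glance which instruments are piling up in the same range.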
 
Hello Bugatti -
Condolences on this sticky problem. It's a considerable challenge to get good mixes, and I can empathize: I also do not use a computer/DAW approach to music production, and my hearing is well below average.
I do my best, but trust me - Geffen is not beating my door down. When it comes to production quality, I couldn't carry @Arjan P or @-mjk- or @Mark Richards's jock without a wheelbarrow. These are the guyz that probably have real answers for you.
You should also browse the "stickies" posted at the top of the DP forum category - TONS of great info.

ANYway. I was going to point out that there is a boatload of exactly the info you're looking for (the specific frequency ranges of various instruments, etc.) online - though it's worth noting that it's not quite what a DAW would give you (a DAW analyzes the specific thing you've recorded).
But such info would still help resolve the mixing challenges you're having, by giving you what you need to distinguish which ranges an instrument or voice occupies.
The good news is that I believe you're on the right track - you understand that different instruments/voices occupy different freq'y ranges, and that "cluttering" any freq'y range will make for "mud": lack of definition, separation, etc.

Basically - look on google, and there's a TON of content on y/tube on the topic. I googled "musical instrument frequency chart", and got a ton of results.

Good luck!
 
...without knowing what the frequencies are on each instrument...I can’t easily tell which instruments are operating on similar frequencies...likely to get muddy mixes...
This comprehensive chart may provide a good jumping off point.
  • [Dark Red] is the low fundamental range;
  • [Red] is the fundamental range;
  • [Yellow] is the range of overtones;
  • [Black] is the range of "air", breath sounds, etc.
  • The very bottom of the chart indicates the frequency adjustments available on a 1/3rd octave (31 band) equalizer, and how a range of frequencies impact various instruments.
  • The small graph on the right indicates the average ear's sensitivity to various frequencies at different dB levels.
Generally, "muddy" sound occurs in the 200 Hz to 400 Hz range (more or less) when various instruments step on each other in that range. The cumulative amplitude is most often the primary cause of the muddy sound.

BTW, there's a "Mixing Discussion" sticky thread in the "Recording 101" forum (located in the "Rock 'n Roll" area). The "Recording 101" forum is the best place to pursue any further questions you may have, as any discussions will benefit all members, not just those owning a DP-24/32/SD portastudio. (The moderators may decide to move your thread there for that reason.)

Music Frequency Chart (Color).JPG
 
Many thanks to you both! (Lovely Bernese, @shredd!)

First of all, I must apologise for not posting this query elsewhere in the forum, as I’m not au fait with all the topics covered here.

I turned to the forum after searching in vain for helpful content about suitable apps and programs on the net - though judging by your replies, I should have been searching for typical audio frequencies of musical instruments. So thanks to you both for putting me on the right track!

I’m now wondering whether adding reverb/delay or another effect to an instrument might alter its ‘frequency profile’.

As suggested, the chart should now provide me with a good basis for a more intelligent approach to mixing in future, rather than my present ‘suck it and see’ approach!

Anyway, thanks again for your help - it’s very much appreciated.
 
One of the advantages of DAWs is being able to see the ‘audio frequencies’ of each track. That allows the user to adjust EQ etc. accordingly, so each instrument can be heard as clearly/separately as possible, thus helping to avoid muddy mixes.

Not really. Unless you are looking at the audio spectrum, a waveform display doesn't give you a clue about the frequency content.

Mixing by eye is not very practical. You can hear the mud, and that's the major obstacle most people have, so you're already on your way to fixing it. But a question worth asking is: do you hear a muddy mix at your monitoring position, or only on other devices? If it sounds good while you're mixing but doesn't translate, then you have to fix that first.

Mix by ear. Find what is covering what you want to hear more of, and practice "unmasking" it by manipulating the part of the spectrum that the offending instrument occupies. Move guitar frequencies down a bit, for example. Often, inexperienced mixing engineers put too much bottom on the snare to make it pop out, but that can muddy up the guitars and vocals (especially in the reverb). Try adding 8k to the snare instead, so you can hear those hits while staying out of the way of the central portion of the mix. Mix by ear and keep at it until you get it where you want. You will get it.
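
For anyone who wants to hear those two moves outside a DAW, here is a rough sketch using the standard "audio EQ cookbook" peaking-filter formulas: a gentle cut in the guitar's low mids and a small boost at 8 kHz on the snare. Python with numpy and scipy is assumed, the filenames are placeholders for mono files, and the gain and Q values are only starting points; the advice above to use your ears still applies.

import numpy as np
from scipy.io import wavfile
from scipy.signal import lfilter

def peaking_eq(x, fs, f0, gain_db, q=1.0):
    # One peaking-EQ band (boost or cut), per the RBJ audio EQ cookbook.
    A = 10 ** (gain_db / 40.0)
    w0 = 2 * np.pi * f0 / fs
    alpha = np.sin(w0) / (2 * q)
    b = np.array([1 + alpha * A, -2 * np.cos(w0), 1 - alpha * A])
    a = np.array([1 + alpha / A, -2 * np.cos(w0), 1 - alpha / A])
    return lfilter(b / a[0], a / a[0], x)

fs, guitar = wavfile.read("guitar.wav")   # placeholder mono tracks
_, snare = wavfile.read("snare.wav")

# Dip the guitar slightly in the mud range instead of boosting the snare's bottom...
guitar_unmasked = peaking_eq(guitar.astype(float), fs, f0=300.0, gain_db=-3.0)
# ...and add a touch of 8k so the snare hits read without crowding the mids.
snare_present = peaking_eq(snare.astype(float), fs, f0=8000.0, gain_db=3.0)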
 
Sorry @-mjk-; I overlooked this.

My SoundClick and y/tube channels are linked in my signature...I useta use SoundClick for originals and y/tube for covers...but it's sort of mixed-up anymore. And I'm tending away from y/tube 'cuz I'm lame/lazy about making vids (and y/tube has basically devolved into a trailer-trash tiktok anyway).

Got some new stuff in there lately...mostly covers/re-do's. Some are projects I neglected for months, some even longer, for various reasons. Been on a "finish stuff & move on" tear lately.
 
I never look at frequencies in my DAW, because it adds nothing to my mixing decisions. But I'm aware that experience has made me confident in this area. I must also admit that my channel strip EQ and most EQ plugins do show the frequencies of the particular channel - but that's because the display is there and can't be switched off. And I only see it after I've already decided on the EQ.

One important thing I can add to the discussion: don't apply any EQ while soloing a certain channel/instrument, unless it is to remove low-frequency rumble from otherwise higher-frequency sources. In other words, no musical EQing in solo mode.

Another big one is to 'mix by arrangement': don't play the notes that conflict with other instruments. For example: you have a busy mix with drums, bass guitar, acoustic guitar, electric guitar and piano - great opportunities for mud! Examples of mixing by arrangement would be: 1. Don't play the left hand on the piano. 2. Don't play the lower two strings on the guitars. 3. Don't let the (melodic) bass go too high, or keep the bass simple when other instruments are busy. The idea is simply not to record what you would later want to EQ out.

Hope this helps!
 
I’m now wondering whether adding reverb/delay or another other effect to an instrument might alter the ‘frequency profile’ of an instrument.
No, reverb is more about placement: imagine yourself in a big hall where someone is 1 meter away from you, singing at you, with a drummer playing 20 meters in the background. Translated to mixing: they both have the same place in the mix (center), but the drummer has much more reverb 'on' his kit than the singer has 'on' his vocal. If the singer moved backwards toward the drummer, it would translate as adding reverb to the vocal.

You say 'or another effect', but that's way too broad: some effects do alter the 'frequency profile' of the instrument, others do not.
 
