
best audio driver for Mac

Posted: Wed May 29, 2019 8:18 pm
by eMko
Hi guys,
the question is: is there a better audio driver for Mac than Core Audio? Why I'm asking: I have a Roland TR-8S connected with a USB cable to Ableton so I can control the drum rack with Push 2, and I still have a problem with latency. The latency keeps getting bigger while I'm playing, and I've tried everything: setting the driver error compensation and all those things, and every time I only fix the problem for a few seconds/minutes; after that the beat is late again. The only thing I didn't try is changing the audio driver. And this happens on both my MacBook Pro and my Mac mini, with different audio soundcards.

Re: best audio driver for Mac

Posted: Wed May 29, 2019 10:59 pm
by oratowsky
Core Audio is the name of the audio architecture in macOS. It is the only audio driver.

Did you check Delay Compensation? It should be off. Options > Delay Compensation
Did you try making the buffer size smaller?
Is there any pattern as to when the latency gets better or worse?

Re: best audio driver for Mac

Posted: Thu May 30, 2019 12:51 pm
by doghouse
Did you install the Roland drivers? The user manual says they are required; none of the Roland instruments are USB class compliant.

Re: best audio driver for Mac

Posted: Thu May 30, 2019 1:32 pm
by eMko
Yep, I tried it without delay compensation and with a smaller buffer size... I also have the Roland drivers installed; without them you can't see the Roland in Ableton. What pisses me off is that my friend has a Windows laptop, a shittier one than mine, and he just set the MIDI clock sync delay in Ableton's MIDI settings and it works. If I do that, the delay still drifts :-/

Re: best audio driver for Mac

Posted: Thu May 30, 2019 11:48 pm
by TLW
I’m pretty sure there was someone on the forum who had what sounds like a very similar problem with a TR fairly recently.

I think in the end they tracked the problem down to a plugin which added latency.

If you’ve not done so I suggest creating a new empty Live project, add no plugins to it at all and see if the problem still exists in that project.

What I am pretty sure about is that CoreAudio/MIDI does not keep adding more and more latency. If it did lots of people, maybe everyone, with a Mac would have the problem and we don’t.

Having said that, it’s not impossible the Roland driver is misbehaving. Or that there’s some kind of conflict going on if you’ve configured the Roland and another audio interface as a composite device in CoreAudio or you’re using e.g. the Roland as Live’s input audio device and something else as the output audio device.

Re: best audio driver for Mac

Posted: Fri May 31, 2019 1:38 pm
by eMko
I tried to dig for answers through the internet, but no luck :-/ Obviously I'm the only one... I don't use any plugins; Ableton is a fresh install. I made a short video so you can see what's happening and see my setup... maybe that will help, and it'll turn out to be just some bullshit I didn't know about.

Re: best audio driver for Mac

Posted: Sun Jun 02, 2019 11:42 pm
by TLW
OK. I don’t have either a TR or a Behringer interface, which limits my experimenting, but I think there are at least two different things going on in that project. I hope my analysis is correct, but if it isn’t I’m more than happy for someone to correct me. Sorry this is so long, but there’s a lot involved.

How Live decides where to place audio recordings on the timeline when you are monitoring through the audio track you are recording to isn’t exactly intuitive.

Live assumes that if we are monitoring via an audio track then we want the audio placed as near as possible to when we heard it.

We will hear it only after the audio has gone through the interface into Live and back out through the interface - after the time dictated by the size of the audio buffer.

To compensate for that latency Live shifts the recorded audio back along the timeline by the audio latency setting in preferences. It’s not an exact process because it doesn’t take account of the time it takes for sound to get from monitors to our ears, but let’s keep things simple and pretend we’re wearing headphones.

The idea is to make things consistent between audio monitored through an audio track and other audio monitored direct via an interface or heard as an acoustic instrument, guitar amp, whatever, at the same time.

You have a fairly high audio latency setting showing a round trip of 25.3ms and are recording into tracks with the monitoring set to "in", so Live should be shifting the audio by the length of the audio buffer. I’m usually running a buffer of around 6-7ms and I see Live apply that amount of shift when recording while monitoring through the same track.
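For anyone wondering how a buffer size relates to a figure like that 25.3 ms: here’s a rough back-of-envelope sketch (it ignores converter and driver overhead, which is exactly why the figures interfaces actually report come out higher than the bare buffer maths).

```python
def buffer_latency_ms(buffer_size, sample_rate):
    """One-way latency contributed by a single audio buffer, in milliseconds."""
    return buffer_size / sample_rate * 1000.0

# A 256-sample buffer at 44.1 kHz:
one_way = buffer_latency_ms(256, 44100)   # about 5.8 ms
round_trip = 2 * one_way                  # about 11.6 ms before any overhead
print(round_trip)
```

The round trip is at least one input buffer plus one output buffer; converters and any "hidden" driver buffers get added on top of that.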

You also have a variety of MIDI clock adjustments set and you are using two different audio interfaces, which can make for unpredictable results - especially if either of the drivers involved doesn’t report its latency correctly, or either device has "hidden" internal buffers Core Audio doesn’t know about (a surprising number of interfaces have a small internal buffer the driver doesn’t report).

May I make a suggestion? Start with a clean slate. Switch on delay compensation. Clear the MIDI clock adjustments. Open the Live lesson called "audio i/o" and run that test on the Behringer. If the TR8s can send audio from a DAW through its outputs and accept an incoming audio signal also run that test on the TR8s. Then run it again using the TR8s as the input and Behringer as output interface. See how things line up with that - let’s be certain there’s no audio driver/interface hardware issue going on to affect things.

The way Ableton suggest to monitor through audio tracks without having the audio latency automatically applied to recorded audio is a bit messy, but works. If anyone has a better idea for how to do this I’d be pleased to hear it.

First create the required tracks as usual and set their monitoring to "in" - the "external plugin" instrument can be used for this as well. Then duplicate those tracks and set the monitoring to "off". You monitor via the tracks set to monitor their input and record into the tracks with monitoring off. That way Live assumes you are hearing the sound from a source other than Live and the automatic compensation for audio latency making you hear things "late" isn’t applied. The downside is you get a bunch of duplicated tracks, but once recording is done the tracks used just for monitoring can be deleted.

Or just record on the same track as the monitored one and shift the audio the required amount afterwards.

As for MIDI, hardware MIDI is very rarely exactly spot-on in timing. Polyphonic MIDI can’t be, because MIDI is a serial protocol: chords sent over MIDI are always slightly staggered since the notes go out one after another. MIDI clock isn’t always absolutely perfect either; it can wander by small amounts, and computers/DAWs often aren’t perfect sources of MIDI clock.
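To put numbers on that stagger: classic 5-pin DIN MIDI is a 31250-baud serial line with 10 bits per byte on the wire (start bit, 8 data bits, stop bit). The TR-8S here is on USB, which is much faster, but the DIN figures show why serial MIDI chords can never be truly simultaneous. A quick sketch:

```python
MIDI_BAUD = 31250        # classic DIN MIDI wire speed, bits per second
BITS_PER_BYTE = 10       # start bit + 8 data bits + stop bit

def midi_wire_ms(n_bytes):
    """Time for n MIDI bytes to cross a DIN MIDI cable, in milliseconds."""
    return n_bytes * BITS_PER_BYTE / MIDI_BAUD * 1000.0

# One Note On message is 3 bytes, so each note of a chord
# arrives roughly 1 ms after the previous one:
print(midi_wire_ms(3))      # about 0.96 ms per note
print(midi_wire_ms(3 * 4))  # last byte of a 4-note chord: about 3.8 ms in
```

Running status shaves a byte off subsequent notes, but the stagger never disappears entirely.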

If a hardware synth receives MIDI - even an internal MIDI instruction to play a note sent to the sound engine by its own sequencer - it takes some time for it to turn the MIDI into audio. Unfortunately that amount of time varies from synth to synth. Zooming in to tracks and searching for ways to configure software to automatically correct for a couple of milliseconds can drive people mad - it has driven me mad before now, on PCs and Macs and using more than two DAWs. :-) Fixing that kind of slop is what audio quantising and warp are for. Or, if comparing the synth’s recorded audio with the MIDI on the timeline shows a consistent shift between the two, using track delays or manually shifting the audio by the required amount fixes things.
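If you’d rather measure that consistent shift than eyeball it on the timeline, one rough approach is to record a click through the synth and cross-correlate it against the reference. This is just a sketch with toy data (the function name is my own, and a real recording would need loading from a file first):

```python
def best_lag(reference, recorded, max_lag):
    """Brute-force cross-correlation: find the lag (in samples) at which
    `recorded` lines up best with `reference`. Fine for short clips."""
    best, best_score = 0, float("-inf")
    for lag in range(max_lag + 1):
        score = sum(r * x for r, x in zip(reference, recorded[lag:]))
        if score > best_score:
            best, best_score = lag, score
    return best

SAMPLE_RATE = 44100  # use your project's actual rate

# Toy data: a click, and the same click delayed by 441 samples (10 ms)
click = [0.0] * 1000
click[100] = 1.0
delayed = [0.0] * 441 + click

lag = best_lag(click, delayed, max_lag=600)
print(lag, "samples =", lag * 1000.0 / SAMPLE_RATE, "ms")  # 441 samples = 10.0 ms
```

Once you know the lag in samples, a track delay or a manual clip shift by that many milliseconds lines things up.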

I hope this helps.