The elapsed time after applying a tempo automation in Ableton is not what I would expect.
Here's an example:
I created a track that slows down linearly from 150 to 50 BPM over 4 measures.
Both when exporting the audio file and when reading the sequence time, the results are not what I expected.
The total elapsed time at measure 5 differs from what I get when I try the same thing in Cubase, and when I calculate it mathematically, I get the same result as Cubase. After only 4 measures, the difference between Ableton and Cubase is already about 100 milliseconds.
This suggests that either Ableton is not doing true linear interpolation by design, or that it's a bug...?
See my results below:
Code:
Measure                1      2      3      4       5
Beat                   1      5      9     13      17
Ableton time (s)       0  1.740  3.867  6.604  10.447
Cubase time (s)        0  1.750  3.892  6.654  10.546
Calculated time (s)    0  1.750  3.892  6.654  10.547
To calculate the elapsed time myself, I used the following formula, where ln is the natural logarithm and endBPM is the tempo reached at the given beat:
Code:
time = beats * 60 / (endBPM - startBPM) * ln(endBPM / startBPM)
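For anyone who wants to check the numbers, here is a minimal Python sketch of that calculation. The names ramp_time and bpm_at and the measure/beat list are mine; the function just evaluates the closed form above, which comes from integrating 60 / BPM(b) over beats:

Code:
import math

def ramp_time(beats, start_bpm, end_bpm):
    # Seconds elapsed after `beats` beats of a tempo ramp that is linear
    # in beats, where `end_bpm` is the tempo reached at that position.
    # Closed form of the integral of 60 / BPM(b) db (hence the natural log).
    if end_bpm == start_bpm:
        return beats * 60.0 / start_bpm
    return beats * 60.0 / (end_bpm - start_bpm) * math.log(end_bpm / start_bpm)

def bpm_at(beats, start_bpm=150.0, end_bpm=50.0, total_beats=16.0):
    # Tempo after `beats` beats of the linear 150 -> 50 BPM ramp.
    return start_bpm + (end_bpm - start_bpm) * beats / total_beats

for measure, beats in [(1, 0), (2, 4), (3, 8), (4, 12), (5, 16)]:
    print(measure, round(ramp_time(beats, 150.0, bpm_at(beats)), 3))
# Prints 0.0, 1.75, 3.892, 6.654, 10.547 -- the "calculated" row above.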
Has anyone else encountered this inconsistency?
Update:
It seems that Ableton updates its tempo on every 16th note rather than continuously. When I model the tempo as changing once per 16th note, I get the time values reported by Ableton above.
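As a sanity check on that hypothesis, here is a sketch of the stepped model, assuming the tempo is sampled at the start of each 16th note and held constant until the next one. The helper name stepped_time is mine, and the per-16th sampling is my assumption about Ableton's behavior, not documented:

Code:
def stepped_time(beats, start_bpm=150.0, end_bpm=50.0,
                 total_beats=16.0, step=0.25):
    # Tempo is updated once per 16th note (step = 0.25 beats) and held
    # constant in between, sampled at the start of each step.
    t = 0.0
    b = 0.0
    while b < beats - 1e-9:
        bpm = start_bpm + (end_bpm - start_bpm) * b / total_beats
        t += step * 60.0 / bpm
        b += step
    return t

for measure, beats in [(1, 0), (2, 4), (3, 8), (4, 12), (5, 16)]:
    print(measure, round(stepped_time(beats), 3))
# Prints 0.0, 1.74, 3.868, 6.604, 10.447 -- matching the Ableton row
# to within a millisecond.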