Thank you, theophilus, for being considerate and thoughtful in all of your responses, even when we disagree. I would like to respond to three of your points in a like manner, if you will please bear with me.
theophilus wrote:think we are talking two different things... with External Instrument, yes, it shouldn't matter what you put after it, audio will be aligned
Yes, we are speaking of slightly different things. I don't mean placing effects on the External Instrument channel itself; I am referring to effects inserted on another channel of the project, which push that channel out of sync with Live's base PDC. The External Instrument will line up with Live's base PDC calculation, but it will still sound out of sync with the track that is out of sync with the grid. Make sense?
It doesn't matter whether the hardware instrument is in perfect sync with Live's base PDC, whether via External Instrument, an Inner Clock Systems device, or even an efficient direct monitoring setup (which is entirely doable), if several other tracks within Live are already out of sync with Live's own timing base. It just sounds wrong, and may be misinterpreted as the hardware being out of sync when it isn't.
theophilus wrote:well, we have to agree to disagree i guess, and i don't know why you are taking it so personally. there are probably some small class of sound design that is absolutely impossible...
Well, you touched upon it right there: why I take it personally. "Some small class of sound design" dismisses the importance of what I, and many others, are talking about, and makes it seem like some obscure issue with no real-world importance. For you, this appears to be something of a curiosity which doesn't really bother your process that much. For me, it has cost several thousands of dollars in lost studio time with clients, and caused headaches within my own projects, some of which had to be rebuilt in other hosts because of it, along with having to invest in other software tools and education materials to work around the issue. So yes, I get frustrated by the dismissiveness, whether inadvertent or intended, that is handed out by those who either don't really understand the issue, or who think it doesn't exist or is insignificant because they haven't experienced it directly. Some days, it can be infuriating.
Over the last few years, I have grown tired of explaining it over and over and over and over, whether here or in other arenas; yet I will give it one more try, in the hope that you, who have clearly demonstrated a genuine desire to understand it, will hear me.
In genres like uplifting trance and progressive house, which are very popular on a global scale (not some obscure art-house experimental electronica), it is extremely common for synths to be run through a considerable series of effects. Like this...
Synth > Reverb > EQ > Delay > Filter > Gate > EQ > Transient Design > Dynamics
Depending on the quality and type of plugin, and the order in which they are placed in the series, any or all of these can introduce latency to the signal. Often the Delay, Filter, and Gate are tempo-synced to the host clock, with all kinds of internal modulation happening in real time within each effect, independent of Ableton's effect automation, sometimes on 16th notes and sometimes across whole measures. Each effect feeds its own tempo-based processing into the next, which adds its own tempo-based processing on top of that, feeding into the next plugin, and so on.
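To make this concrete, here is a minimal sketch of how per-plugin latency compounds down such a chain. The plugin names mirror the chain above, but every latency figure is a hypothetical example, not a measurement from Live or any real plugin; the point is that each tempo-synced effect late in the chain "hears" audio that is already delayed, so its idea of the downbeat drifts off the host grid.

```python
# Illustrative sketch only: how per-plugin latency accumulates through a
# serial effect chain. Latency figures are hypothetical examples.

SAMPLE_RATE = 44100  # samples per second
BPM = 138            # a typical uplifting-trance tempo

# (plugin, reported latency in samples) for the chain described above
chain = [
    ("Reverb",              0),
    ("EQ",                 64),
    ("Delay",               0),
    ("Filter",              0),
    ("Gate",              128),
    ("EQ",                 64),
    ("Transient Design",  512),
    ("Dynamics",         1024),
]

# length of one 16th note, in samples
sixteenth = (60.0 / BPM / 4) * SAMPLE_RATE

running = 0
for name, latency in chain:
    # Everything feeding INTO this plugin is already `running` samples
    # late, so a tempo-synced LFO inside it is off-grid by that amount.
    print(f"{name:16s} hears audio {running:5d} samples late "
          f"({running / sixteenth:.2f} 16th notes)")
    running += latency

print(f"Total chain latency: {running} samples "
      f"({running / SAMPLE_RATE * 1000:.1f} ms)")
```

With these made-up numbers the chain accumulates 1792 samples (about 40 ms at 44.1 kHz), and crucially the drift is different at every position in the chain, which is why uncompensated tempo-synced modulation smears rather than simply shifting as one block.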
This type of effect chain can occur across dozens of instrument and audio tracks, and it is extremely unlikely that any two or three tracks will use the exact same synths and plugins in the exact same number and order. Sometimes the processing is applied to prerecorded audio tracks, or to external hardware synths being fed MIDI events from Live (which have been set up properly for accurate timing with the host, with or without direct monitoring).
Knowing what you now know about Live's PDC problem, do you see how this could get out of control? Dozens of tracks, with different latency offsets, running through several different combinations of tempo-based effects? Forget about the automation problem; set that aside for the time being. Just consider what happens to the sound itself when all of the tempo-based effects, running their own internal modulations, are off grid and feeding into each other.
Now take all of those tracks, plus any effect returns (which might themselves be sending to other returns), and route them all into a series of multiband sidechain busses (audio tracks) fed a sidechain signal that is in time with Live's base PDC grid, yet out of time with every tempo-effected track and every return using sends. If it seems complicated even to read and think about here, imagine what it is like in a 48+ track project with multiple returns and busses.
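A tiny sketch of that sidechain scenario, again with purely hypothetical offsets rather than anything measured from Live: each source carries a different residual latency after its chain, while the sidechain trigger sits on Live's base PDC grid at offset zero, so every source ducks at a different distance from the pump it should be locked to.

```python
# Illustrative sketch with hypothetical numbers: several sources, each
# left with a different residual latency after an under-compensated
# effect chain, all feeding a sidechain bus whose trigger is aligned
# to Live's base PDC grid (offset 0).

track_offsets = {   # residual offset per source, in samples (made up)
    "Lead":    1792,
    "Pluck":    640,
    "Pad":     2304,
    "Return A": 896,
}

trigger_offset = 0  # the sidechain trigger sits on the base PDC grid

for name, offset in track_offsets.items():
    drift = offset - trigger_offset
    print(f"{name:8s} arrives {drift:4d} samples after the trigger "
          f"it should pump against")

# No single manual track delay fixes this: each source needs its own
# correction, and swapping any one plugin changes the numbers again.
```

The design point is in the last comment: because every source has a different offset, a global nudge cannot realign them, which is why per-track manual compensation in a 48+ track project becomes unmanageable.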
Can it be managed? Sure. But the entire scenario I'm describing, which is a real-world situation, is handled at the sample-accurate level across all instruments, tracks, effects, and busses in at least two other popular DAWs that I am familiar with, and is reportedly more accurate and manageable in other hosts that I either no longer use or have never used.
theophilus wrote:but there is a larger class that, if you're willing to accept that there are many more or less interchangeable effects and some of those have zero or very low latency, and some have much more, you can still get it done in live. for example, if you have a large latency-inducing compressor at the beginning of your chain, you can replace it with a zero- or low-latency compressor - there are some decent ones on the list that can have zero- or low-latency. sure, compressors aren't exactly the same, but you can get close, and if you have a zillion tempo-based effects after it, how important are those small artifacts anyways. if you use zero- or very-low-latency effects, your tempo-synced effects won't be out of sync.
As a professional, this is another argument that drives me bonkers, and one I strongly disagree with.
Often an artist's or producer's sound is derived specifically from their instruments and effects. Suggesting that they simply use other instruments and effects is not practical. Maybe at the hobbyist level, but not professionally. Every synth's oscillators sound unique, and delays from different manufacturers can sound very different and offer vastly different feature sets. If an artist has invested in advanced, and often expensive, instruments and effects, and in the time to master them, there is no easily interchangeable replacement, and the "well, just use something else" remark becomes exasperating.
Same with Live. If one has invested in the Suite, the hardware, Max and Bome's and iOS apps, the tutorials and books, and given over a fifth of one's life to using it, "just use something else" is not a casual decision or process. I wish that some of the members here, as colleagues invested in Live themselves, were more sensitive to and understanding of this.