Thanks Amaury for the detailed response.
Before I get back into the theory, let me give you a concrete example.
Yesterday I recorded a piano overdub on the end of my song Distance of a Touch (you can hear the complete song here). To do this in Live I had an audio clip of a rough mix of the song on one track, with no FX on it. There were no other tracks in the Live set.
On a MIDI track I opened Steinberg's The Grand and played along. (I was wearing headphones, so there was no delay from the sound travelling through the air!)
This is the version as recorded, without any negative track delay added afterwards:
Something feels a bit weird.
And in this version the piano part has been given a negative track delay equal to the latency reported in Live's Audio tab:
In this version, the piano just feels right. At least to my ears.
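For anyone curious what that negative track delay is actually doing, here is a toy sketch in Python. This is obviously not Live's code, and the event format, function name, and delay value are all made up for illustration; the idea is simply that every recorded event gets shifted earlier by the reported latency:

```python
def apply_negative_delay(events, delay_ms):
    """Shift each (time_ms, note) event earlier by delay_ms.

    This is, in effect, what a negative track delay does to the
    recorded material: it compensates for the fact that everything
    was captured late by the system's output latency.
    """
    return [(max(0.0, t - delay_ms), note) for t, note in events]

# Hypothetical recorded notes, in milliseconds from the clip start:
recorded = [(1000.0, "C4"), (1500.0, "E4"), (2000.0, "G4")]

# Assume Live reported ~23 ms of latency:
print(apply_negative_delay(recorded, 23.0))
# → [(977.0, 'C4'), (1477.0, 'E4'), (1977.0, 'G4')]
```

The point is that the shift is uniform: every note moves earlier by the same amount, which is why the performance "just feels right" again once it is applied.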
Now, to answer your questions:
"Do you have lots of devices on the track, or on the Master track?"
No, just Steinberg's The Grand on the MIDI track; no other processors anywhere.
"And is your plugin buffer size the same as the audio buffer size?"
Yes, it is.
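As a rough illustration of why the buffer size matters here: one buffer's worth of audio at a given sample rate corresponds to a fixed delay, which you can estimate like this. This is a back-of-the-envelope sketch; the figure Live actually reports also includes driver and hardware delays, so the real latency is larger than one buffer:

```python
def buffer_latency_ms(buffer_samples: int, sample_rate_hz: int) -> float:
    """One buffer's worth of delay, in milliseconds."""
    return buffer_samples / sample_rate_hz * 1000.0

# A typical 256-sample buffer at 44.1 kHz:
print(round(buffer_latency_ms(256, 44100), 2))
# → 5.8 (ms per buffer)
```

So even at a modest buffer size, each buffer in the chain adds several milliseconds, which is why the reported latency is well within the range a player can feel.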
"What do you hear while playing? Do you listen to your performance? If so, do your fingers hit the notes on time, and do you hear the music delayed? If so, isn't it annoying? Is it workable at all?"
With apologies for sounding a bit obscure, I can only tell you my experience.
In my experience of recording music I do not deliberate at all about what I'm hearing and what I'm playing. I simply play. You can hear the results above.
Regarding the various comparisons about how musicians play live (the example earlier in this thread of the pianist and the orchestra, for instance) - I've been thinking more about this, and with respect I don't think it's comparable to what Live is doing to the MIDI data.
When people play in an orchestra everyone is watching the conductor. Thus they are all performing (moving their fingers, etc.) with the conductor's beat. They're not trying to play in time with each other, although they do hear each other. That sounds weird, but it's a very complex thing, involving muscle memory, psychoacoustics, hand-eye coordination, and all sorts of other behavioural magic.
Similarly when you play with a typical small rock band, everyone is listening to the drummer. Thus again everyone's moving in sync with the same beat.
Now, the sound of the whole orchestra or the whole band may itself be delayed to the audience; that depends on the size of the venue, the acoustics, and so on. But when the sound does reach the audience, its elements are still in sync with each other, because each element was performed to the same beat.
What Live is doing is moving individual elements away from the beat. To continue with the orchestral analogy, Live is moving one instrument according to how it is heard at the back of the concert hall. And that's different to the beat of the conductor.
The concert hall is the computer latency. The beat is Live's metronome or my other audio track. I perform according to the metronome, not the latency. It is as simple and as surprising as that.
I hope this explanation helps!