Yeah, like I said, it's doing 'something'! Can't find anything concrete except for general consumer Apple talk:
"The Process Buffer Range: setting determines how large a buffer Logic sets aside for its mixing engine. As with the I/O Buffer Size, smaller settings decrease latency, but increase CPU load. The default setting of Medium is usually fine, but if you experience frequent System Overloads, then try setting it to Large."
So there's a mixing buffer and an I/O buffer.
Well, if Apple won't give specifics... I'll find them out
I have a neato utility that lets me see what buffer size my UAD plugins are running at. With a 128-sample I/O buffer in Logic, the UAD plugs are still running at 1088 samples. Changing the I/O buffer in Logic has no effect on the plugins; changing the Process Buffer Range, however, does. Setting it to Large increases the buffer to a whopping 2112 samples.
So yes, Logic maintains a pseudo-hidden mixing buffer, and no, it doesn't only engage it on playback; it's always there.
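To put those numbers in perspective, here's a minimal sketch converting the observed buffer sizes into milliseconds of latency. The 44.1 kHz sample rate is an assumption for illustration; the actual figure depends on your session's sample rate.

```python
def buffer_latency_ms(samples: int, sample_rate: int = 44100) -> float:
    """Convert a buffer length in samples to latency in milliseconds."""
    return samples / sample_rate * 1000

# I/O buffer alone (128 samples)
print(f"128 samples  -> {buffer_latency_ms(128):.1f} ms")   # ~2.9 ms
# Observed UAD plugin buffer with Process Buffer Range at Medium
print(f"1088 samples -> {buffer_latency_ms(1088):.1f} ms")  # ~24.7 ms
# Observed with Process Buffer Range set to Large
print(f"2112 samples -> {buffer_latency_ms(2112):.1f} ms")  # ~47.9 ms
```

So the jump from Medium to Large roughly doubles the plugin buffer latency, which is why Large helps with System Overloads but feels sluggish when playing live.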
Now we've all learned something today!
Yeah, I think this also explains why there's a split-second delay the first time you play a soft synth from your keyboard after playing back a MIDI region on that track, and why moving an audio region away from the playhead takes a moment to actually take effect.
I think this is a really clever way of doing things, but, like it's been said, since Live is also meant for performance, they didn't take that approach. It would be cool if you could enable something like this in the preferences for the many people who use Live for production only rather than live performance.