Announcement: Max For Live workshop at Ircam.
-
- Posts: 127
- Joined: Fri Sep 11, 2009 7:54 am
- Location: Paris
- Contact:
Hey,
I'll be teaching (in French…) a workshop at Ircam soon. There are some seats available; more info over here.
Hope to see you there.
ej
Re: Announcement: Max For Live workshop at Ircam.
ej,
I would love to get to Ircam sometime. It sounds like such an interesting establishment. I will have to plan a trip to France. Good luck with your class.
Regards,
Mike
Websites:
Max For Live Community site:
http://www.max4live.info
http://www.noisemakers.info
Controllers: Lemur, Ohm 64, Monome, APC40, Launchpad
Daw: Live 8 Suite
Audio Interfaces: Apogee Ensemble & Duet
Monitors: JBL LSR 4300
Re: Announcement: Max For Live workshop at Ircam.
emmanuel,
what is the program?
Julien Bayle
____________________________________________________________________________________________________
art + teaching/consulting
ableton certified trainer
____________________________________________________________________________________________________
Re: Announcement: Max For Live workshop at Ircam.
julienb wrote: emmanuel,
what is the program?
Something like that.
ej
Re: Announcement: Max For Live workshop at Ircam.
emmanuel,
in your program overview I've noticed the item "threading: why send/receive is bad".
Can you recommend any other method for data communication between M4L devices?
Re: Announcement: Max For Live workshop at Ircam.
broc wrote: emmanuel,
in your program overview I've noticed the item "threading: why send/receive is bad".
Can you recommend any other method for data communication between M4L devices?
Not using multiple devices, and linking the sub-systems directly...
Julien Bayle
Re: Announcement: Max For Live workshop at Ircam.
broc wrote: emmanuel,
in your program overview I've noticed the item "threading: why send/receive is bad".
Can you recommend any other method for data communication between M4L devices?
I concur. If we're all using a bad method, it would be nice to get a heads up.
Could you please elaborate a little on that, Emmanuel?
Re: Announcement: Max For Live workshop at Ircam.
hoffman2k wrote: I concur. If we're all using a bad method, it would be nice to get a heads up.
Could you please elaborate a little on that, Emmanuel?
+1 (of course)
Julien Bayle
-
- Posts: 36
- Joined: Wed Mar 16, 2005 5:24 pm
- Location: Deep in 'it'.
Re: Announcement: Max For Live workshop at Ircam.
As I understand the issue:
Send and receive are quite useful constructs while programming, because units that were not designed together can be 'connected remotely' by a send/receive pair. But this comes at an execution-time cost in Max/MSP/Jitter, and in Max For Live it introduces an unpredictable latency to the values being 'sent': an audible one, much too often.
To 'directly connect' to another 'sub' piece of Max For Live, it will have to be in the same master patcher, and the values being 'sent' must travel over actual patch cords. Not very convenient, eh? But no ruinous latency in control! Well...
There is another, more complex method of remote communication, one that underlies the LiveAPI. It admits only very short networking latencies, definitely not audible, and can handle all sorts of data, up to and including complex spectral data structures: it is called OSC. Do check it out. With it you can control any control value in the running Live app, and if you can program it, even control other applications, including applications running on other machines on a shared network.
Max For Live provides a lot of ways to get at and work with the 'namespace', and the cool thing is that if you give your patcher's data objects recognizable 'scripting names', I think they'll probably be available to be controlled via the LiveAPI. Check it out: you didn't even have to do much beyond naming stuff! Think of Live as a single, very complex ur-instrument that can both contain and call a large set of internal instruments, and that exposes (most of) the 'control points' of those instruments in a unified hierarchical 'control space' or 'namespace'. This 'namespace' changes every time you add or remove a Live device or track. Hence the LiveAPI idea of the Live 'path': it's a path to the place in the namespace hierarchy where the device control value you're interested in lives, in the running Live application. It is way too cool.
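To make that 'path' idea concrete, here's a toy Python sketch (not Max code, and nothing to do with the real API internals: in a patcher you'd use live.path and live.object; the dict layout and device name below are invented for illustration):

```python
def resolve_path(namespace: dict, path: str):
    """Walk a toy nested-dict model of the Live namespace using a
    LiveAPI-style path string such as 'live_set tracks 0 devices 0'.
    Names index into dicts; numbers index into lists."""
    node = namespace
    for token in path.split():
        key = int(token) if token.isdigit() else token
        node = node[key]
    return node

# A toy namespace: one track holding one device with a made-up name.
live_set = {"tracks": [{"devices": [{"name": "MyM4LDevice"}]}]}
# resolve_path({"live_set": live_set}, "live_set tracks 0 devices 0")["name"]
# -> "MyM4LDevice"
```

The point is only that the path is a route through a hierarchy that reshuffles itself whenever tracks or devices are added or removed, which is why paths are resolved at runtime rather than hard-wired.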
As for remote audio connections: this is a DAW we're in, and it has a type of audio channel called 'auxiliary', if I'm not mistaken. Could be useful for routing audio around? They often come in different sizes (number of channels).
Good Luck!
just my tuppence.
YourMileageMayVary.
L&K,
J2K
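FWIW, the OSC traffic mentioned above is just UDP datagrams with a simple binary framing. Here's a minimal Python sketch of what a udpsend-style sender does under the hood; the host, port, and address pattern are made-up examples, and only float arguments are handled:

```python
import socket
import struct

def osc_pad(b: bytes) -> bytes:
    # OSC strings are null-terminated and padded to a multiple of 4 bytes
    return b + b"\x00" * (4 - len(b) % 4)

def osc_message(address: str, *floats: float) -> bytes:
    """Encode a minimal OSC message carrying only float32 arguments:
    padded address string, padded type-tag string, then big-endian floats."""
    type_tags = "," + "f" * len(floats)
    packet = osc_pad(address.encode()) + osc_pad(type_tags.encode())
    for value in floats:
        packet += struct.pack(">f", value)  # big-endian 32-bit float
    return packet

def send_osc(host: str, port: int, address: str, *floats: float) -> None:
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(osc_message(address, *floats), (host, port))

# e.g. send_osc("127.0.0.1", 9000, "/track/volume", 0.5)
```

In a patcher you would of course just use udpsend/udpreceive rather than build packets by hand; this only shows why the transport itself adds so little overhead.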
Re: Announcement: Max For Live workshop at Ircam.
julienb wrote: Not using multiple devices, and linking the sub-systems directly... :?
That's exactly it ;-)
ej
Re: Announcement: Max For Live workshop at Ircam.
emmanuel_2 wrote: That's exactly it ;-)
Julien Bayle
Re: Announcement: Max For Live workshop at Ircam.
josquin2000 wrote: As I understand the issue: [...]
Thanks for the advice. Does this mean the unpredictable latency happens even in devices which aren't remotely connected, e.g. devices that use send/receive for internal communication?
And in regards to your comment about OSC, are you talking about using udpsend/udpreceive instead of send/receive for all intents and purposes?
Re: Announcement: Max For Live workshop at Ircam.
I am also very interested in hearing the same information that hoffman2k is asking about. Currently I am running a complex set that relies on sends. If replacing the sends with OSC modules would improve the timing, I would surely implement it.
In fact, I am open to any advice on how to improve timing when using tens or hundreds of communicating pairs of objects between Live and M4L.
Re: Announcement: Max For Live workshop at Ircam.
Let me explain a little bit more. Each MfL device potentially lives its life in a different thread from the others, and you don't have any control over it (Live does the thread repartition/optimization). Furthermore, each device uses its own scheduler (synced to the Live transport). Using send/receive within a device is totally fine because everything stays in the same thread, but if you use send/receive across devices the timing is just unpredictable. OSC won't do a better job: udpreceive has to resynchronize what it receives with the scheduler of the device it's in.
ej
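For anyone wanting a feel for the jitter Emmanuel describes, here's a rough Python analogy (not Max code, and a simplification of what Live actually does): one thread timestamps values, another thread picks them up from a queue, and the hand-off delay is recorded per message. The message count and sleep interval are arbitrary.

```python
import threading
import queue
import time

def measure_cross_thread_delays(n_messages: int = 200) -> list:
    """Send timestamped values from one thread to another via a queue
    and record the hand-off delay of each one. The spread of these
    delays is the kind of scheduling jitter you can't control when
    values cross between devices running on different threads."""
    q = queue.Queue()
    delays = []

    def producer():
        for _ in range(n_messages):
            q.put(time.perf_counter())       # timestamp at 'send' time
            time.sleep(0.0005)               # crude stand-in for a control stream

    def consumer():
        for _ in range(n_messages):
            sent = q.get()                   # blocks until a value arrives
            delays.append(time.perf_counter() - sent)

    t1 = threading.Thread(target=producer)
    t2 = threading.Thread(target=consumer)
    t1.start(); t2.start()
    t1.join(); t2.join()
    return delays
```

Run it a few times and compare min(delays) to max(delays): the OS decides when each thread wakes up, so the spread varies from run to run, which is the analogue of the cross-device timing being unpredictable.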
Re: Announcement: Max For Live workshop at Ircam.
emmanuel_2 wrote: Let me explain a little bit more. [...]
Great! Thank you for clearing this up. Very, extremely helpful!