VST Tunnel - Internet Live Collaborations. Anyone try it yet?
http://www.vstunnel.com/en/
Seems pretty cool. I don't know what kind of bandwidth it would use, as I'm not somewhere I can test it.
So does anyone have any feedback on it yet?
3 GHz Pentium 4 (Prescott), XP SP2, 1 GB RAM, dual monitor with Matrox Millennium, MOTU Traveler, Event EZ8 ADAT card. Also IBM ThinkPad T40 1.6, 1 GB RAM.
- Posts: 8803
- Joined: Wed Mar 31, 2004 3:12 pm
- Location: www.fridge.net.au
haven't tried this one.
Machinate and I tried out digitalmusician.net's offerings and also Ninjam.
The problem with DML.net was that we could not both be in sync and hear each other in sync... but it's great for running a remote session.
Ninjam lets you sync up by inducing a buffer delay kinda like PDC where we were 1 bar apart. That was heaps of fun.
Real-time jamming over the net will probably not be possible within our lifetime.
The speed of light in a fiber is slower than in a vacuum, but let's take the speed of light in vacuum: 670,616,629 miles per hour, or about 186,282 miles per second. The circumference of the Earth is 24,901 miles; half of that -- the distance to the furthest possible point from me -- is 12,450 miles. It therefore takes the fastest possible signal about 0.067 seconds to get to my friend. And his signal has to get back to me, so the round trip is about 0.134 seconds. While a tenth of a second or so is OK for a telephone conversation, it's too long for musical sync.
Try setting your DAW's latency (buffer size) to 133 milliseconds and playing a softsynth: you'll see how even a seemingly short delay makes it nearly impossible to play. And in reality the number is even longer, since light in a fiber travels at only about two-thirds of its speed in vacuum.
And communicating through vacuum -- by radio via a geostationary satellite -- won't help any: those satellites orbit about 22,236 miles above the Earth, so the trip up to the satellite and back down, even to a friend right beside me, would be at least 44,472 miles, and to one on the other side of the world even further. That's more than three times longer than the trip under the ocean.
I notice the delay on my own music system, and that's only about a 12 ms *round trip*. That's 0.012 seconds: less than a tenth of the halfway-round-the-Earth round trip at light speed.
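Those numbers are easy to sanity-check. Here's a minimal Python sketch of the arithmetic above; the constants are rounded, and the two-thirds fiber factor is a typical assumption rather than a measured value:

```python
# Back-of-the-envelope latency figures for signals at light speed.
# All constants are rounded; the fiber factor is an assumption (~2/3 c).
C_VACUUM_MI_S = 186_282   # speed of light in vacuum, miles per second
FIBER_FACTOR = 0.67       # light in optical fiber travels at roughly 2/3 c
HALF_EARTH_MI = 12_450    # half the Earth's circumference
GEO_ALT_MI = 22_236       # geostationary orbit altitude

def one_way_ms(distance_mi: float, speed_mi_s: float) -> float:
    """Milliseconds for a signal to cover distance_mi at speed_mi_s."""
    return distance_mi / speed_mi_s * 1000

vacuum_ms = one_way_ms(HALF_EARTH_MI, C_VACUUM_MI_S)                  # ~67 ms
fiber_ms = one_way_ms(HALF_EARTH_MI, C_VACUUM_MI_S * FIBER_FACTOR)    # ~100 ms
sat_ms = one_way_ms(2 * GEO_ALT_MI, C_VACUUM_MI_S)                    # ~239 ms

print(f"vacuum one-way: {vacuum_ms:.0f} ms, round trip: {2 * vacuum_ms:.0f} ms")
print(f"fiber one-way:  {fiber_ms:.0f} ms")
print(f"geostationary bounce (minimum): {sat_ms:.0f} ms")
```

So even the best case, straight-line vacuum, is around 67 ms one-way to the far side of the planet, and a real fiber path is worse.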
So, until someone discovers hyperspace, superluminal travel, wormholes, subspace radio, or something... we'll NEVER be able to jam properly with a friend on the other side of the planet. Sad but true.
And yes, you can jam a bar apart and sync it that way, but... what if you want to change to another chord? There's no way we can both change at the same time and both hear each other change (unless we're a whole repeating-phrase length apart... but then what if we want to change to a different pattern, or a pattern of a different number of bars? It just won't work.)
So you can jam on one chord, and hear the delayed version of what your friend played during the previous bar a few seconds ago. But that's about it.
They did this on TV a while back, for the Olympics I think? But only the audience saw it in sync. The separate performances in different locations around the world happened synced to click tracks. Or possibly it could work in a cascade where the first location hears nobody but click track, the 2nd hears the 1st, the 3rd location hears the 1st and 2nd re-synced, and so on... This way you could have a live audience at the last location see their orchestra perform seemingly in sync with all the other incoming "live" feeds. But only the last location in the chain would experience it that way.
This is what some guy told me anyway.
kennerb wrote:
But what if you're only making 8-bit dub?
Well..... it might be OK for that. But we'll NEVER be able to do any hippy-dippy-ass jam band type stuff that way.
Yeah... it seems like just a novelty, and I don't really see the point, other than the quick exchange of ideas if you're writing a song with someone far away. I guess that's really the intended market.
kramer wrote:
Well..... it might be OK for that. But we'll NEVER be able to do any hippy-dippy-ass jam band type stuff that way.
But it was when I was at the country fair listening to a hippy jam band that I discovered superluminal travel, and instantly thought the internet would be the prime place to bring all said things together. My crest falls at the thought of not being able to make this a reality.
kramer wrote:
Real-time jamming over the net will probably not be possible within our lifetime.
I also heard we'll never possibly need more than 64 KB of memory for our computers...
btw, didn't Monolake and someone else (I forget who) already do a live performance over the internet from two locations at the same time?
kennerb wrote:
But it was when I was at the country fair listening to a hippy jam band that I discovered superluminal travel and instantly thought that the internet would be the prime place to bring all said things together. My crest falls at the thought of not being able to make this a reality.
Don't take the brown acid man.............that's alls ise cans tells ya'
I don't know anything about this. In fact I don't know shit about shit. But here's an idea. I need critical thinkers to pick holes in this for me as it's late and I'm about to fall off my chair (in real-time...)
Two remote computers connected to the net. A software host (yet to be devised...) running on both machines is connected to an internet clock, so they have a common benchmark to sync to. Both machines can start and stop together, using MIDI time code to control tempo and start/stop positions. Sync can be achieved by dividing the work on both machines into two types: identical local files on each machine output sequenced MIDI (say drums, bass, percussion), which gets a delay compensation equal to the period required to reach the other computer; and live input, which gets a shorter delay compensation equal to that delay minus the soundcard latency. The whole setup on both machines is controlled by delaying the start of audible playback until after the streamed signal has reached the other computer. So, it goes a little something like this...
1 The internet clock goes "bong!". This starts a "pre-roll" on both machines. The tempo for each musician to play to is set by a metronome, which becomes audible once the internet clock signals the software to initiate the set delay time.
2 Both machines have a precalculated delay. Their connection is a direct stream through a minimum of servers. Both machines ping each other regularly to find out whether the delay time needs adjusting.
3 Streaming of live audio (say a guitar on track 1, machine A, piano on track 2, machine B) begins simultaneously from both machines in both directions, whether the instruments are playing or not. Delay equals soundcard latency + "travel period" for the signal to reach the remote machine.
4 Synchronisation of sequenced audio begins simultaneously in both directions, with a delay equal to the "travel period" for the signal to reach the remote machine.
At this point no audio is being heard by either musician, as the signal is still on its way from each remote computer.
5 Playback of live audio and sequenced audio begins on each machine and can be heard by both musicians. They should be in sync (ha!) because the delay time has compensated for the late arrival of the remote signal, and both musicians heard the audio start "late", so the streams had already begun reaching the other machine. The whole shebang depends on being able to start playing your instrument right on the money through a direct monitor (like the US-122 has: direct playthrough, no latency), while playing along with the delayed, synced signal. And vice versa for your colleague.
If anybody knows what I mean by this could they decipher it into English (or any other language)...
Fire at will !
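If it helps, the delay bookkeeping in steps 1-5 boils down to two numbers per machine. A minimal Python sketch of that idea, where `playback_delays` and all the millisecond figures are hypothetical, not from any real implementation:

```python
# Sketch of the delay-compensation scheme above: both machines delay
# audible playback so local and remote material line up on arrival.

def playback_delays(travel_ms: float, soundcard_latency_ms: float):
    """Return (sequenced_delay_ms, live_delay_ms).

    Sequenced audio (identical local files on both machines) waits the
    full one-way travel time. Live input waits the travel time minus the
    soundcard latency, since the soundcard already adds that much delay
    on the way in.
    """
    sequenced_delay_ms = travel_ms
    live_delay_ms = travel_ms - soundcard_latency_ms
    return sequenced_delay_ms, live_delay_ms

# Example: 80 ms one-way network travel, 6 ms soundcard latency.
seq_ms, live_ms = playback_delays(travel_ms=80.0, soundcard_latency_ms=6.0)
print(seq_ms, live_ms)  # 80.0 74.0
```

Step 2's regular pings would just update the travel time and recompute both delays.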
MacBook Pro Retina, Live 9.5, Reason, UC33, KRK RP5s, Teenage Engineering OP1, Korg ESX2, Korg Prophecy, Clavia Nord Lead, Bass, Guitars.
http://soundcloud.com/motorradkinophone
telekom wrote:
I don't know anything about this. In fact I don't know shit about shit. But here's an idea. I need critical thinkers to pick holes in this for me...
this is possible... but definitely far from realtime jamming ...
sweetjesus wrote:
this is possible... but definitely far from realtime jamming ...
absolutely... it's realtime + delay jamming...
The nifty software sync host which one of you bright sparks should invent could be called RealDelay...
...although I bet most British transport companies, and M-Audio, are already trying to market that name...
Machinate wrote:
uhm, telekom, so what you're saying is that it's a process whereby the signal from one player is delayed, so that it is in sync with the other player upon arrival?
Both machines play any backing tracks, such as drums, from identical or similar prepared files on each machine. An internet clock controls the start point for playback. The sync signal is sent from each machine at the same time, and the delay required for internet sync elapses before audible playback begins. So both machines start late, as it were, but synchronised. Live audio is played back "here" through direct monitoring, in time with the "late" sync from the remote machine. Actual audio playback doesn't begin until the sync signal has arrived from afar. This also allows time for the live streaming audio from "here" to reach "there", so that your pal can noodle with you.
So really the process is... delay actual audible playback until a remote sync signal arrives, by which time your streaming audio has already been going for a bit and is arriving at your pal's machine.
It's late. I might not be making sense. And stuff like this was always hard for me to understand. So I might shut up now.
telekom wrote:
Both machines play any backing tracks, such as drums, from identical or similar prepared files on each machine...
this could be done with NINJAM as it stands, except for the MIDI side...
you could have all your channels routed to two channels: one used for the NINJAM loopback, the other for your own monitoring through a buffer-delay plugin (e.g. Live's delay plugin with feedback set to 0 and dry/wet set to 100), with the latency adjusted in milliseconds..
what would suck is that this way both people can't hear what they're doing until a bar or two later...
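For what it's worth, that feedback-0 / 100%-wet delay trick is just a pure delay line: the output is the input, N samples later. A toy Python sketch of the idea (the 3-sample delay and sample values are made up for illustration):

```python
from collections import deque

# A pure delay line: a delay plugin with feedback at 0 and dry/wet at
# 100% simply outputs each input sample delay_samples later.
class DelayLine:
    def __init__(self, delay_samples: int):
        # Pre-fill with silence so output starts delay_samples late.
        self.buf = deque([0.0] * delay_samples)

    def process(self, sample: float) -> float:
        self.buf.append(sample)
        return self.buf.popleft()

# One bar of 4/4 at 120 BPM is 2 seconds, i.e. 88_200 samples at 44.1 kHz;
# a tiny 3-sample delay keeps this demo readable.
d = DelayLine(delay_samples=3)
print([d.process(x) for x in [1.0, 2.0, 3.0, 4.0, 5.0]])
# → [0.0, 0.0, 0.0, 1.0, 2.0]
```

Delaying your own monitor feed by the NINJAM interval length is exactly why you only hear yourself a bar or two late.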