Audio/visual any one?
-
- Posts: 98
- Joined: Wed Sep 27, 2006 11:01 am
- Location: UK brighton
Audio/visual any one?
Hi all, thought it was about time I gave you a sneak peek of the audio/visual extension I've built for Ableton Live, entitled JAM (Just Add Music).
I believe the fundamental flaw with using Ableton when performing is that the general public rarely know what effects and trickery you use; they just think all your hard work is part of the track you're playing. A visual indication is needed. As for producers, imagine having your audio effects perfectly in sync with visual effects.
I've uploaded a set I quickly made to YouTube; all the links can be found at:
http://www.youtube.com/itsthejayj
Please note that I have edited the music videos outside of JAM, looping certain parts of the videos so they last the same length as a promo track.
Some screengrabs here
http://www.youcanbespecialtoo.com/JAMscreengrabs
I'd like to ask you all for your opinions. Although all core functionality is complete, it's still a little while off commercial release. However, I would still like to know which areas you would like developed.
JAM is controlled by Ableton. This means you use Ableton exactly the same way you normally would; it affects the visuals/video automatically in real time, knitting the audio and visuals together. Whether you're in Session View or Arrangement View, the only thing you have to set up is each video's path in JAM.
My setup consists of one PowerBook running Ableton Live and one MacBook Pro running JAM, all controlled using a Korg microcontroller.
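Since JAM follows Live's transport rather than running its own clock, the sync presumably rides on MIDI clock, which ticks 24 times per quarter note. As a hedged sketch (none of this code is from JAM; the function names are invented for illustration), tempo can be recovered from the interval between ticks like this:

```python
# Sketch: deriving tempo from MIDI clock ticks. The MIDI spec defines
# 24 clock pulses per quarter note, so the tick spacing fixes the BPM.

PPQN = 24  # MIDI clock pulses per quarter note

def bpm_from_tick_interval(seconds_per_tick: float) -> float:
    """Convert the time between two MIDI clock ticks into a tempo in BPM."""
    return 60.0 / (PPQN * seconds_per_tick)

def smoothed_bpm(tick_intervals: list) -> float:
    """Average several intervals to damp jitter from the MIDI transport."""
    avg = sum(tick_intervals) / len(tick_intervals)
    return bpm_from_tick_interval(avg)
```

At 120 BPM a quarter note lasts 0.5 s, so ticks arrive every 0.5/24 s; averaging a handful of intervals is a common way to keep effect speeds stable against transport jitter.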
Jay j
Brief functionality list
4 Channel video mixing
Video Looping
Auto / Quantized loading of clips / global effects
Auto stopping of clips
Possibility of 49 clips on each channel (this hasn't really got a limit; I could add as many as you lot want)
16 Global function effects: (global functions are loaded in sync with the Quantization menu setting)
4 channels Slit screen
Blend mode Screen
Pop
Multi-screen
Wobble
Psychedelic
Disperse
Rainbow
Broken TV
Poppers
The big picture
Fade switch
Horizontal Split screen
Circular warp distortion
Kaleidoscope
Effects are dependent on the MIDI clock speed
32 different blend modes
Full MIDI clock tempo sync
Reverse switch (uses the Reversinator VST)
EQ three visual effects
Low, mid, high kill switches
Filter sweep visual sweep
Fade to grey visual smear
Flange function visual blur effect
Beat repeat visual loop system
Cut-o-matic blank flash visual effects
Phaser visual sepia effect
Auto pan
3D A/V Master pan
Record / playback ability in Ableton's Arrangement View
Video Camcorder support
Nintendo Wii support
14 Audio monitor channels
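The "Auto / Quantized loading" items above imply that clip and effect triggers are deferred to the next quantization boundary, the way Live's own launch quantization works. A minimal sketch of that scheduling rule (my illustration, not JAM's actual code) might look like:

```python
# Sketch: quantized clip/effect launching. A trigger that arrives mid-bar
# is held until the next boundary set by the quantization menu.
import math

def next_quantized_beat(current_beat: float, quantize_beats: float) -> float:
    """Return the beat position at which a pending clip/effect should fire.

    quantize_beats of 4.0 means "next bar" in 4/4; 0 or less means
    "no quantization", i.e. fire immediately.
    """
    if quantize_beats <= 0:
        return current_beat
    return math.ceil(current_beat / quantize_beats) * quantize_beats
```

For example, a trigger at beat 5.3 with bar quantization (4 beats) would be held until beat 8, keeping video cuts locked to the music.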
Last edited by itsthejayj on Fri Mar 16, 2007 12:27 am, edited 1 time in total.
-
- Posts: 38
- Joined: Tue Oct 25, 2005 4:06 am
-
- Posts: 1098
- Joined: Fri Feb 09, 2007 12:05 pm
- Location: UK
-
- Posts: 828
- Joined: Thu Jul 22, 2004 2:37 pm
- Location: kyoto, japan
- Contact:
Can we, say, drop a video clip or clips into the program and have Live automatically do the jump-step kind of stuff that you have going on?
I know for me at least, I can VJ and I can perform my music, but as far as the live feel goes, I'd kill for an app where I could drop in some QT files and have the MIDI (or whatever) from Ableton spit out interesting patterns, or at least timing, to an evolving visual piece that could be linked out to S-Video.
Looks great, sleek, simple; let us know when we can try it.
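The "drop some QT files and let Live's MIDI drive them" idea boils down to routing note-ons to clip triggers. As a hedged sketch (the file names and the play_clip callback are invented; no real video API is used), the simplest possible bridge is a note-to-clip lookup:

```python
# Sketch: mapping MIDI notes coming out of Live to video clip triggers.
# CLIP_MAP paths and the play_clip callback are hypothetical placeholders.

CLIP_MAP = {
    60: "loops/drums.mov",
    61: "loops/bass.mov",
    62: "loops/lead.mov",
}

def handle_note_on(note: int, velocity: int, play_clip) -> bool:
    """Trigger the mapped clip; velocity could scale opacity or intensity."""
    path = CLIP_MAP.get(note)
    if path is None or velocity == 0:  # velocity 0 is a note-off in disguise
        return False
    play_clip(path, opacity=velocity / 127.0)
    return True
```

With this shape, any MIDI clip already sequenced in Live (drums, chords, whatever) doubles as a timing source for the visuals, which is essentially what the poster is asking for.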
-
- Posts: 1743
- Joined: Sun Jul 11, 2004 5:07 am
- Location: Melbourne AU
- Contact:
The idea of providing the audience with a 1:1 correlation between audio and video is a good one. We attempt to do this in my band too, by manipulating OpenGL structures in real time, triggering shape-generating envelopes and such.
There's also a Max-based trio(?) out there that has a virtual dancer controlled by musical parameters. It's definitely a field worth investigating.
mbp 2.66, osx 10.6.8, 8GB ram.
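A "shape-generating envelope" of the kind described above is typically just a ramp restarted by a note trigger and sampled once per frame to drive an OpenGL parameter. As an illustrative sketch (not the poster's actual patch):

```python
# Sketch: a linear attack/decay envelope. A trigger resets t to 0; each
# video frame samples the envelope to drive scale, rotation, brightness...

def ad_envelope(t: float, attack: float, decay: float) -> float:
    """Envelope value in [0, 1] at time t seconds after the trigger."""
    if t < 0:
        return 0.0
    if t < attack:
        return t / attack                      # rising edge
    if t < attack + decay:
        return 1.0 - (t - attack) / decay      # falling edge
    return 0.0                                 # envelope finished
```

Tying attack/decay times to the beat length (see the MIDI clock maths earlier in the thread) is what makes the shapes feel musically locked rather than merely reactive.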
-
- Posts: 1067
- Joined: Sat Feb 21, 2004 4:32 pm
- Location: london
This all looks very cool, but I'm not quite sure how different it is from already-established VJ apps such as Modul8, ArKaos, Resolume, etc. Have you got any screenshots of the app itself when used in conjunction with Live? I've been using Modul8 (experimentally) and it's pretty fantastic, albeit demanding on all but the latest systems.
http://www.myspace.com/wardclerk
http://www.myspace.com/bighairufreqs
LIVE 8.21/ Reaktor 5.51/VDMX/Quartz Composer
Yeah, Modul8 uses a ton of OpenGL as far as I know, so you generally need a decent graphics processor; think MBP or a tower with a good card in it.
When I started this project the first thing I did was to get a GeForce 8800 GTS. It'll take anything you throw at it. The real bottleneck is running multiple (3-4+) 800x600 videos simultaneously; this is usually more than hard drives can take.
mbp 2.66, osx 10.6.8, 8GB ram.
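The hard-drive bottleneck mentioned above is easy to check with back-of-envelope arithmetic, assuming worst-case uncompressed 24-bit frames at 30 fps (real codecs need far less disk bandwidth, but shift the cost onto decode):

```python
# Sketch: raw bandwidth needed to stream uncompressed video from disk.

def stream_bandwidth_mb_s(width: int, height: int, fps: int,
                          bytes_per_pixel: int = 3) -> float:
    """Uncompressed bandwidth of one video stream in MB/s (decimal MB)."""
    return width * height * bytes_per_pixel * fps / 1e6

per_stream = stream_bandwidth_mb_s(800, 600, 30)  # one 800x600 stream
four_streams = 4 * per_stream                     # four simultaneous layers
```

One stream comes to about 43 MB/s and four to roughly 173 MB/s, comfortably beyond the sustained throughput of a 2007-era single hard drive, which matches the poster's experience.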
Interesting stuff. In some places the video isn't very clear about which effects/jumps were added, apart from the ray blur, black screen, and colourisation effects I noticed.
i'd like to see more interface shots too!
corygilbert wrote: I'd kill for an app where I could drop some qt files and have the midi or whatever of ableton spit interesting patterns/ or at least timing to an evolving visual piece that could be linked out to svideo.
An interesting job can be done with the last beta of visualJockey (PC).
It runs on the same PC in dual-screen mode, uses OpenGL, and has no limits on effects or layers; it's fully customizable, and not as hard as Max/MSP!
visualjockey can be linked to Live via MIDI in, MIDI out, and real-time sound analysis. It can generate sin, cos, tri, saw, and random waves and boolean signals based on Live's BPM to create crazy tricks like stretching/cutting video clips in sync, or to control anything. You can automate/sequence them and then send those variables/booleans back to Live as MIDI signals to control it. It's hard to explain in a few words; the possibilities are huge.
You might want to check this about looping tricks: http://visualjockey.stalker.nl/viewtopic.php?t=3535
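The BPM-driven wave generators described above can be sketched in a few lines. This is my illustration of the idea, not visualjockey's implementation; the convention of one cycle per beat with output normalised to [0, 1] is an assumption:

```python
# Sketch: tempo-synced LFO shapes (sin/cos/tri/saw/rnd) driven by the
# song position in beats, which Live's BPM and transport supply.
import math
import random

def bpm_lfo(shape: str, beat: float) -> float:
    """One cycle per beat; output normalised to [0, 1]."""
    phase = beat % 1.0
    if shape == "sin":
        return 0.5 + 0.5 * math.sin(2 * math.pi * phase)
    if shape == "cos":
        return 0.5 + 0.5 * math.cos(2 * math.pi * phase)
    if shape == "saw":
        return phase
    if shape == "tri":
        return 1.0 - abs(2.0 * phase - 1.0)
    if shape == "rnd":
        return random.random()  # sample-and-hold noise would quantise this
    raise ValueError(f"unknown shape: {shape}")
```

Feeding such a value into a clip's playback position or a cut threshold is what produces the beat-locked stretching and cutting the poster describes.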

Interesting!
I do the same, but with lighting as well as video, and all within Live, ReWired to ArKaos VJ and through a MIDI-to-DMX converter. Combining lasers, washes, scanners, blinders, and strobes with the video makes the audio impact even more powerful. It's the synergistic effect of audio, video, and lighting all choreographed and working together.
The same CCs that filter sweep, EQ, etc. can control video and lighting parameters. Plogue Bidule can generate CCs from audio for beat detection, visual EQs, etc. The CC scratching audio, for example, can scrub video in real time and pan a laser tunnel too. One drum pad hit at the end of a phrase can trigger audio crash cymbal and thunder samples, send a nuclear blast video clip to the projectors, and flash the blinders and strobes. The same CC that does an audio EQ kill can hue shift the live video feed, color the washes the same color, and pan/tilt/zoom the cameras. I'm working on sequencing entire tracks so that the audio, video, and lighting are all contained in one Live clip that I can drag from the browser.
Follow Actions, MIDI effects, dummy clips, and virtual MIDI buses can translate beat repeat, for example, into stuttering/scratching/juggling laser sweeps, strobing patterns, as well as dynamic Arkaos video parameters that follow the audio.
Keeping it all organized is the biggest nightmare! I found that running three instances of Live (audio, lighting, video) connected with the IAC bus rewired to Arkaos VJ makes it a lot easier to organize, only one computer is needed, and it can all be performed single-handed. And using Core Image and Arkaos hardware effects for video uses up almost zero CPU.
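The "one CC drives audio, video, and lighting" pattern above amounts to rescaling a single 0-127 controller value into each destination's own range. A hedged sketch (the destination names and ranges are invented for illustration):

```python
# Sketch: fanning one MIDI CC out to several destinations, each with its
# own parameter range. DESTINATIONS is a hypothetical mapping table.

DESTINATIONS = {
    "filter_cutoff_hz": (200.0, 18000.0),  # audio filter sweep
    "video_hue_deg":    (0.0, 360.0),      # hue shift on the video feed
    "laser_pan_deg":    (-90.0, 90.0),     # laser tunnel pan
}

def fan_out(cc_value: int) -> dict:
    """Map one incoming CC value (0-127) onto every destination range."""
    norm = cc_value / 127.0
    return {name: lo + norm * (hi - lo)
            for name, (lo, hi) in DESTINATIONS.items()}
```

Because every destination reads the same normalised value, a single fader move stays perfectly choreographed across sound, picture, and lights, which is the synergy the poster describes.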