Audio/visual anyone?

Discuss music production with Ableton Live.
itsthejayj
Posts: 98
Joined: Wed Sep 27, 2006 11:01 am
Location: UK brighton

Audio/visual anyone?

Post by itsthejayj » Thu Mar 15, 2007 11:35 pm

Hi all, I thought it was about time I gave you a sneak peek of the audio/visual extension I've built for Ableton Live, entitled JAM (Just Add Music).

I believe the fundamental flaw with using Ableton (when performing) is that the general public rarely know what effects and trickery you use; they just think all your hard work is part of the music track you're playing. A visual indication is needed… As for producers, imagine having your audio effects perfectly in sync with visual effects.

I've uploaded a set I quickly made to YouTube; all the links can be found at:
http://www.youtube.com/itsthejayj

**Please note that I have edited the music videos outside of JAM, looping certain parts of the videos so they last the same length as a promo track.

Some screengrabs here
http://www.youcanbespecialtoo.com/JAMscreengrabs

I'd like to ask you all for your opinions. Although all core functionality is complete, it's still a little while off commercial release. However, I would still like to know which areas you would like developed.

JAM is controlled by Ableton. This means you use Ableton exactly the same way you would normally; it affects the visuals/video automatically in real time, knitting the audio and visuals together. Whether you're in Session View or Arrangement View, the only thing you have to set up is the video's path in JAM.
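
To give a rough idea of how the clock side of this kind of sync can work, here's a little Python sketch (purely illustrative, not JAM's actual code; the port name is made up):

Code:

# Follow Live's MIDI clock from the second machine and derive a beat phase
# that a video effect can read every frame. Illustrative only.
import time
import mido

CLOCKS_PER_BEAT = 24            # MIDI spec: 24 clock ticks per quarter note
tick_count = 0
last_tick = None
bpm = 120.0                     # assumed default until clock arrives

with mido.open_input('Live Clock In') as port:   # hypothetical port name
    for msg in port:
        if msg.type == 'start':
            tick_count = 0      # resync the phase when Live's transport starts
        elif msg.type == 'clock':
            now = time.time()
            if last_tick is not None:
                # one tick = 1/24 beat, so tempo = 60 / (24 * tick interval)
                bpm = 60.0 / (CLOCKS_PER_BEAT * (now - last_tick))
            last_tick = now
            tick_count += 1
            beat_phase = (tick_count % CLOCKS_PER_BEAT) / CLOCKS_PER_BEAT
            # beat_phase (0..1) is what a wobble/strobe/fade effect would
            # sample each frame to stay locked to the music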

+ My setup consists of one PowerBook running Ableton Live and one MacBook Pro running JAM, all controlled using a Korg microcontroller.

Jay j

Brief functionality list

4 Channel video mixing
Video Looping
Auto / Quantized loading of clips / global effects
Auto stopping of clips
Possibility of 49 clips on each channel (this hasn't really got a limit; I could put in as many as you lot want)
16 Global function effects: (global functions are loaded in sync with the Quantization menu setting)

4-channel split screen
Blend mode Screen
Pop
Multi-screen
Wobble
Psychedelic
Disperse
Rainbow
Broken TV
Poppers
The big picture
Fade switch
Horizontal Split screen
Circular warp distortion
Kaleidoscope

Effects are dependent on the MIDI clock speed
32 different blend modes
Full MIDI clock tempo sync
Reverse switch (uses Reversinator VST)
EQ Three visual effects
Low, mid, high kill switches
Filter sweep visual sweep
Fade to grey visual smear
Flange function visual blur effect
Beat repeat visual loop system
Cut-o-matic blank flash visual effects
Phaser visual sepia effect
Auto pan
3D A/V Master pan
Record / playback ability in Ableton's Arrangement View
Video Camcorder support
Nintendo Wii support
14 Audio monitor channels
Last edited by itsthejayj on Fri Mar 16, 2007 12:27 am, edited 1 time in total.

ConsciousPilot
Posts: 38
Joined: Tue Oct 25, 2005 4:06 am

Post by ConsciousPilot » Thu Mar 15, 2007 11:51 pm

This looks interesting. I currently use Arkaos rewired with Ableton for my AV work.

Is this a standalone program, or what? I watched the video but still have no idea what JAM really is. Can you do multiple layer blending, controlled with MIDI controllers?

Tone Deft
Posts: 24152
Joined: Mon Oct 02, 2006 5:19 pm

Post by Tone Deft » Thu Mar 15, 2007 11:55 pm

Is it trance(tm) compatible?
In my life
Why do I smile
At people who I'd much rather kick in the eye?
-Moz

NorthernMonkey
Posts: 1098
Joined: Fri Feb 09, 2007 12:05 pm
Location: UK

Post by NorthernMonkey » Thu Mar 15, 2007 11:58 pm

The screenshots take forever to load. Any chance of a faster server so I can see what's going on? The video doesn't tell me jack.
..?

corygilbert
Posts: 828
Joined: Thu Jul 22, 2004 2:37 pm
Location: kyoto, japan
Contact:

Post by corygilbert » Fri Mar 16, 2007 12:05 am

Can we, say, drop a video clip or clips into the program and have Live automatically do the jump-step kinda stuff that you have going on?
I know for me at least, I can VJ and I can perform my music, but as far as the live feel goes I'd kill for an app where I could drop in some QT files and have the MIDI or whatever from Ableton spit out interesting patterns, or at least timing, to an evolving visual piece that could be linked out to S-Video.
Looks great, sleek, simple. Let us know when we can try it.

Clearscreen
Posts: 1743
Joined: Sun Jul 11, 2004 5:07 am
Location: Melbourne AU
Contact:

Post by Clearscreen » Fri Mar 16, 2007 12:59 am

Looks very interesting! How much CPU do you need to run this? Is it cross-platform? Is it Flash-based (seeing as your other work on your site is mostly Flash)? Do you have to have videos trimmed ready to loop?
Hp Elitebook 2.8Ghz. Live 7.0.14 & Live 8.1.5, XP Pro. and stuff...

Machinate
Posts: 11648
Joined: Thu Jun 24, 2004 2:15 pm
Location: Denmark

Post by Machinate » Fri Mar 16, 2007 1:12 am

The idea of providing the audience with a 1:1 correlation between audio and video is good. We attempt to do this in my band too, by manipulating OpenGL structures in real time, triggering shape-generating envelopes and such.

There's also a Max-based trio(?) out there that has a virtual dancer, controlled by musical parameters. It's definitely a field worth investigating.
mbp 2.66, osx 10.6.8, 8GB ram.

nobbystylus
Posts: 1067
Joined: Sat Feb 21, 2004 4:32 pm
Location: london

Post by nobbystylus » Fri Mar 16, 2007 1:24 am

This all looks very cool... but I'm not quite sure how different it is from already established VJ apps such as modul8, Arkaos, Resolume etc. Have you got any screenshots of the app itself when used in conjunction with Live? I've been using modul8 (experimentally) and it's pretty fantastic... albeit demanding on all but the latest systems.
http://www.myspace.com/wardclerk
http://www.myspace.com/bighairufreqs
LIVE 8.21/ Reaktor 5.51/VDMX/Quartz Composer

Machinate
Posts: 11648
Joined: Thu Jun 24, 2004 2:15 pm
Location: Denmark

Post by Machinate » Fri Mar 16, 2007 1:52 am

Yeah, modul8 uses a ton of OpenGL as far as I know, so you generally need a decent graphics processor; think MBP or a tower with a good card in it.

When I started this project the first thing I did was get a GeForce 8800 GTS. It'll take anything you throw at it. The real bottleneck is running multiple (3-4+) 800x600 videos simultaneously. This is usually more than hard drives can take.
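Rough back-of-envelope (illustrative numbers only): four uncompressed 800x600 streams at 25 fps and 3 bytes per pixel come to about 800 x 600 x 3 x 25 x 4 ≈ 144 MB/s sustained, which is far beyond what a single drive can keep up with, so in practice you lean on compressed codecs and the CPU/GPU to do the decoding.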
mbp 2.66, osx 10.6.8, 8GB ram.

zappen
Posts: 345
Joined: Sat Jul 16, 2005 1:42 am

Post by zappen » Fri Mar 16, 2007 3:34 am

Interesting stuff. In some points the video is not very clear about which are the added effects/jumps, apart from the ray blur, black screen, and colourization effects I noticed.
I'd like to see more interface shots too!
corygilbert wrote: I'd kill for an app where I could drop in some QT files and have the MIDI or whatever from Ableton spit out interesting patterns, or at least timing, to an evolving visual piece that could be linked out to S-Video.
An interesting job can be done with the latest beta of visualjockey (PC).
It runs on the same PC in dual-screen mode, uses OpenGL, has no limits on effects or layers, is fully customizable, and isn't as hard as Max/MSP! visualjockey can be linked to Live with MIDI in, MIDI out, and real-time sound analysis. It can generate sin, cos, tri, saw, and random waves and boolean signals based on Live's BPM to create crazy tricks for stretching/cutting video clips in sync, or to control anything; automate/sequence them and then send those variables/booleans back to Live as MIDI signals to control it. It's hard to explain in a few words... the possibilities are huge.
You might want to check this thread about looping tricks: http://visualjockey.stalker.nl/viewtopic.php?t=3535

8)
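
To illustrate the BPM-synced waves idea in a few lines (just a Python sketch, not visualjockey itself; the cycle length and scaling are arbitrary):

Code:

# Control waves derived from the host tempo; the results could be sent back
# as MIDI CCs or used to drive clip position / effect parameters.
import math
import random

def control_waves(bpm, t, cycle_beats=4.0):
    """Return sin/cos/tri/saw/rnd values in the 0..1 range for time t (seconds)."""
    beats = t * bpm / 60.0                  # elapsed beats at this tempo
    phase = (beats / cycle_beats) % 1.0     # position inside one LFO cycle
    return {
        'sin': 0.5 + 0.5 * math.sin(2 * math.pi * phase),
        'cos': 0.5 + 0.5 * math.cos(2 * math.pi * phase),
        'tri': 1.0 - abs(2.0 * phase - 1.0),
        'saw': phase,
        'rnd': random.random(),             # fresh random value each call
    }

# e.g. scale the saw wave to a 0-127 CC so a clip scrubs once per bar at 128 BPM
cc_value = int(control_waves(bpm=128, t=2.5)['saw'] * 127)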

hambone1
Posts: 5346
Joined: Fri Feb 04, 2005 8:31 pm
Location: Abu Dhabi

Post by hambone1 » Fri Mar 16, 2007 10:20 am

Interesting!

I do the same, but with lighting as well as video, and all within Live Rewired to Arkaos VJ and through a MIDI>DMX converter. Combining lasers, washes, scanners, blinders, and strobes with the video makes the audio impact even more powerful. It's the synergistic effect of audio, video, and lighting all choreographed and working together.

The same CCs that filter sweep, EQ, etc can control video and lighting parameters. Plogue Bidule can generate CCs from audio for beat detection, visual EQs, etc. The CC scratching audio, for example, can scrub video real-time, and pan a laser tunnel, too. One drum pad hit at the end of a phrase can trigger audio crash cymbal and thunder samples, send a nuclear blast video clip to the projectors, and flash the blinders and strobes. The same CC that does an audio EQ kill can hue shift the live video feed, color the washes the same color, and pan/tilt/zoom the cameras. I'm working on sequencing entire tracks so that the audio, video, and lighting are all contained in one Live clip that I can drag from the browser.
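
As a very rough sketch of the "one CC drives everything" idea (placeholder port names and CC number, not my actual routing):

Code:

# Fan one incoming controller value out to video and lighting on separate
# MIDI ports, each with its own scaling. Illustrative Python/mido only.
import mido

SCRUB_CC = 1    # assumed CC number for the scratch/scrub control

with mido.open_input('Controller In') as ctrl, \
     mido.open_output('To Video') as video, \
     mido.open_output('To Lighting') as lights:
    for msg in ctrl:
        if msg.type == 'control_change' and msg.control == SCRUB_CC:
            # video gets the raw value (scrub position)...
            video.send(msg.copy(channel=0))
            # ...lighting gets an inverted copy so the laser pans the other way
            lights.send(mido.Message('control_change', channel=1,
                                     control=SCRUB_CC, value=127 - msg.value))

The same pattern extends to beat repeat, EQ kills, and so on: one physical gesture, several translated messages.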

Follow Actions, MIDI effects, dummy clips, and virtual MIDI buses can translate beat repeat, for example, into stuttering/scratching/juggling laser sweeps, strobing patterns, as well as dynamic Arkaos video parameters that follow the audio.

Keeping it all organized is the biggest nightmare! I found that running three instances of Live (audio, lighting, video) connected with the IAC bus rewired to Arkaos VJ makes it a lot easier to organize, only one computer is needed, and it can all be performed single-handed. And using Core Image and Arkaos hardware effects for video uses up almost zero CPU.
