Project thread: Building an ultra-portable performance set, and my experience with Remote Scripting in Live 10

wayfinder
Posts: 176
Joined: Tue Jul 28, 2009 4:01 pm

Project thread: Building an ultra-portable performance set, and my experience with Remote Scripting in Live 10

Post by wayfinder » Sat Feb 27, 2021 11:24 am

Hello!

I decided to finally use Live for the purpose for which I originally bought it (back in ... uh... 2009): Playing live! I have an A&H Xone:K1 and a small MIDI keyboard, and those two and a laptop are everything I want to use - I would like my setup to be super portable, and these three items fit in my backpack with all necessary cables and my sound card and headphones, which is perfect: I can set up anywhere and all I need is a wall outlet and a line in. So I started to plan how I was going to set up the performance set and the controls, and I quickly reached the point where it became apparent that pure in-app MIDI mapping would not be enough for what I wanted. So I am now delving into Remote Scripts!

Unfortunately, there is no official support for writing these, but a few people have made astonishing efforts to collect all kinds of info. There's also Remotify, an app that can generate scripts from a nice friendly GUI. I tried working with that at first, but I realized the control scheme I envisioned was probably not going to be straightforward to implement in their editor (never mind that it would be somewhat costly to unlock the necessary capabilities). So I decided to forge ahead on my own for now.

I hope that writing down my experiences with the system, and the challenges I'm facing, will not just keep me motivated to go through with this, and eventually have a great way to play live, but may also help others who have advanced needs in this area.

Resources

My first research efforts brought up the decompiled remote scripts that came with Live, and reference docs for a crucial part, the _Framework (both provided by Julien Bayle @ https://github.com/gluon/AbletonLive10. ... oteScripts & https://structure-void.com/AbletonLiveR ... Framework/), the reference for Live's API from nsuspray @ https://nsuspray.github.io/Live_API_Doc/10.1.0.xml, and Hanz Petrov's intro to _Framework @ http://remotescripts.blogspot.com/2010/ ... asses.html. I also found two remote scripts for the Xone:K2, which has an essentially identical layout to my target controller, and while they didn't work the way I wanted my script to work, the methods used in them have already been extremely valuable to learn from (https://github.com/macfergus/live-xonek2 by macfergus and https://maps.djtechtools.com/mappings/8209 by Nicola de Bello)

The first hurdle was that much of the information on how to write my own script was not just fragmented and spread over dozens of different forums, repositories and websites, but much of it was outdated, pertaining to versions of Live or Python that had in the meantime changed. This continues to be a source of churn, and it's part of why I'm writing this thread.

Development Environment

I use VSCode to write the scripts, on a Windows machine. The script files are kept in Live's Resources/MIDI Remote Scripts directory, which is important - it's also possible to get them to work by placing them in the User Library, but that has one incredible drawback: No hot reload! The only way to see changes in scripts in the User Library is to close and re-open Live, which can take 5 minutes on my machine, and that is a major drag for when all that's changed is a typo somewhere. But scripts in Resources can be unassigned and re-assigned in the Live Preferences > Link/MIDI dialog, and will happily reload without having to quit the app. I keep Live's Log.txt open in a window to use for inline debugging.

My First Script

I am not a (good) programmer. I can do a bit of scripting, and I have dabbled in gamedev and shader coding, but I've never written Python before, and I am unfamiliar with many of even the most basic programming concepts. I am coming to this as a beginner, and probably doing a lot of things wrong initially. Please, if you have superior knowledge, correct me!

It took me hours to just get a "Hello World" to work, but now it does.

For this to work, there need to be two files: __init__.py, which lets Live use my code, and the file containing that code (I called mine wfK1.py). Forgive me for going into excruciating detail here, but as an uninitiated user it helps me to understand exactly what is going on.

The init file looks like this ('#' prefaces comments):

Code: Select all

# this tells Live the filename in which to look for my code

import wfK1


# this is the standardized function with which Live loads
# any script. c_instance is the Control Surface slot in Live's
# prefs, as far as I can tell

def create_instance(c_instance):


    # this is what it should load
    # (the thing inside my file that's called 'K1')
    
    return wfK1.K1(c_instance)
And then my code looks like this:

Code: Select all

# this lets me use Live's own generic Control Surface code
# to handle a bunch of background tasks in interfacing
# with the app, so I will only have to customize the behavior
# I want and not re-do all the plumbing

from _Framework.ControlSurface import ControlSurface


# this is the thing that'll be loaded in the __init__.py file.
# it's going to be based on the generic Control Surface, and it
# receives the slot that was called c_instance in __init__.py

class K1(ControlSurface):


    # this defines the function that constructs my object ('self',
    # i.e. this very thing I wrote below) when Live assigns it
    # to the slot ('instance')
    
    def __init__(self, instance):
    
    
        # this runs the generic Control Surface's own initial setup
        # before any of my code. 'super' means we're executing the
        # parent class's version of this function (the one in
        # ControlSurface) instead of code in here.
        
        super(K1, self).__init__(instance, False)
        
        
        # this is, as far as I can tell, a protection against crashes:
        # everything I do will be encapsulated in this guard. I
        # found a bunch of sources and scripts that recommended
        # importing the 'with' statement first (it wasn't available
        # in older Python versions), but that turned out to not be
        # necessary in Live 10
        
        with self.component_guard():
        
        
            # now we can do things! The first line shows a message
            # in Live's own status bar, the second adds a line to Log.txt
            
            self.show_message("Hello World")
            self.log_message("wayfinder K1 remote script loaded")
Aaaaannd.... success!

Image
(Screenshot from Log.txt)

Next step: Getting something on the controller to actually control something in Live!
Last edited by wayfinder on Sat Feb 27, 2021 9:02 pm, edited 2 times in total.

The Rabbits
Posts: 132
Joined: Tue Apr 07, 2020 2:23 pm

Re: Project thread: Building an ultra-portable performance set, and my experience with Remote Scripting in Live 10

Post by The Rabbits » Sat Feb 27, 2021 12:46 pm

Not sure if you found this or not, but there are a few hints to be gleaned from these examples. All current scripts, decompiled.

github/gluon/AbletonLive11_MIDIRemoteScripts

Good luck. I'm on a similar path trying to get more useful functionality for my Launchkey. The built in features are great but there's plenty of things I want to do.

Collapse and expand group tracks and move to the next and previous groups. Navigate through device chains with expansion of racks and moving through their chains. Enable and disable devices. Control all sends on tracks and mute/solo returns. Do all this in Arrangement view.......

Almost forgot. Add the following line to your options.txt file. It's fairly obvious what it does.

-AutoShowPythonShellOnError

wayfinder
Posts: 176
Joined: Tue Jul 28, 2009 4:01 pm

Re: Project thread: Building an ultra-portable performance set, and my experience with Remote Scripting in Live 10

Post by wayfinder » Sat Feb 27, 2021 12:50 pm

Thanks, and good luck to you too! It sounds like a lot of (rewarding) work to be sure :)

The Rabbits
Posts: 132
Joined: Tue Apr 07, 2020 2:23 pm

Re: Project thread: Building an ultra-portable performance set, and my experience with Remote Scripting in Live 10

Post by The Rabbits » Sat Feb 27, 2021 1:00 pm

I wrote software for accountants for 30 years. This is way more rewarding!!

wayfinder
Posts: 176
Joined: Tue Jul 28, 2009 4:01 pm

Re: Project thread: Building an ultra-portable performance set, and my experience with Remote Scripting in Live 10

Post by wayfinder » Sat Feb 27, 2021 2:39 pm

Image

Adding the first control functionality

To test whether adding controls to the script works, I wanted to control a Mute Track button with one of the buttons on my Controller. I found out that the mute button lives in the Live channel strip, which is called a 'Mixer' internally. So it is necessary to import the MixerComponent in addition to the ButtonElement in order to script that functionality. I guess in Ableton's nomenclature, "Elements" are things on the controller, and "Components" are things in Live. This is how it looks at the top of the file:

Code: Select all

from _Framework.ButtonElement import ButtonElement
from _Framework.MixerComponent import MixerComponent
Then I can replace my messaging code from the Hello World script with the button setup:

Code: Select all

            # this creates the means to control 1 track's channel strip
            
            self.mixer = MixerComponent(1)
            
            
            # in the strip (number 0), I'm setting the mute button 
            # to be controlled by a ButtonElement on my controller. The
            # 'True' means it will be an on/off toggle instead of just muting
            # while the button is held. The 0 in the ButtonElement is the type
            # of data the button produces - a MIDI note, in this case. 13 is
            # the MIDI channel my controller is using, and 36 is the note value
            # the button I want to use sends (that's a C2)
            
            self.mixer.channel_strip(0).set_mute_button(ButtonElement(True, 0, 13, 36))
            
            
            # since the "mute" button is actually an "is channel on" button, the
            # button would light up when the visual indicator in Live was showing
            # the "off" status, and the other way around, so I invert that behavior
            
            self.mixer.channel_strip(0).set_invert_mute_feedback(True)
Now I can hit a button to turn mute on and off on the first track, and the button will even light up! Next up: Figuring out how to make it light up in the color I want (the controller can show red, yellow, or green, depending on the note that is sent to it. The ButtonElement that automatically lights up the button has only one note though, and it will therefore only use one color, so I'm going to have to think of something).
Last edited by wayfinder on Tue Mar 02, 2021 12:29 pm, edited 2 times in total.

jbone1313
Posts: 578
Joined: Wed Dec 12, 2007 2:44 am

Re: Project thread: Building an ultra-portable performance set, and my experience with Remote Scripting in Live 10

Post by jbone1313 » Sat Feb 27, 2021 6:41 pm

Awesome! Love this. I have thought about going down this path, but I have not had the will to do the all the research you did. This will be enormously helpful for me.

I will be following this thread closely!

wayfinder
Posts: 176
Joined: Tue Jul 28, 2009 4:01 pm

Re: Project thread: Building an ultra-portable performance set, and my experience with Remote Scripting in Live 10

Post by wayfinder » Sat Feb 27, 2021 7:33 pm

Cheers, I'm having a lot of fun, and also a bunch of frustration to make me appreciate the wins so much more :D This next bit took me a long while to figure out, and in the end it turns out I almost had it right at the beginning but my solution didn't work because I indented it wrong (Python, you little rascal!), and I spent another few hours on trying different approaches of escalating complexity before taking another look at my earlier attempt and going, wait, WHY didn't this work? Anyway:

Image

Adding Different Colors To The LEDs

I found a few references to another mechanic which could conceivably lead to the same outcome, using a messaging forwarding callback, but this works now and that's good enough for me, and I'll definitely never ever eat these words :P The basic concept I chose was to extend the ButtonElement class to add colors to it. This is what the code for that looks like

Code: Select all

# these are just fancy names so that the code can be more easily read.
# the reason for these particular values is that the notes that turn the
# LEDs different colors are three octaves apart - 36 notes.

RED = 0
AMBER = 36
GREEN = 72


# i'm creating a new element, based on an existing one (ButtonElement)

class K1ButtonElement(ButtonElement):


    # these are the input parameters my new button takes. they are the
    # same ones the regular button uses, with color added at the end.
    # this function will construct the button object from them

    def __init__(self, is_momentary, msg_type, channel, identifier, color):

	
        # first i'm creating a regular button

        ButtonElement.__init__(self, is_momentary, msg_type, channel, identifier)
        
        
        # then i add the color property
        
        self.color = color
        
        # the construction ends here, but i'll have to override the original
        # button's function to turn on the LED to take color into account
        
    # this is handled by a function called turn_on
         
    def turn_on(self):
    
    
        # first I need to replicate the original behavior to make sure that
        # the correct MIDI message is sent to Live.
    
        self.send_value(ON_VALUE)
        
        
        # now I can send the command to turn the LED to the correct
        # color. MIDI messages are made up of up to three bytes; the first
        # here is 0x9n in hexadecimal, or 144 + n in decimal, where n is
        # the midi channel, that's a note on event. Then two data bytes,
        # which are the note (original note + color modifier), and the
        # velocity (which for this to work needs to be at least 1).
        # the send_midi command takes them as a single tuple, that's
        # why there are double parentheses around the MIDI message
        
        self.send_midi((144 + self._original_channel, self._original_identifier + self.color, 127))
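To double-check that byte math outside Live, the same arithmetic can be tried in a plain Python shell. The little helper below is just for illustration (it is not part of the script itself), using the channel 13 / note 36 mute button from earlier:

```python
# Standalone check of the note-on byte math described above.
# 0x90 is 144 in decimal; channel 13 and note 36 match the mute button example.
RED = 0
AMBER = 36
GREEN = 72

def color_note_on(channel, note, color, velocity=127):
    # status byte: 0x90 + channel (note on); data bytes: note + color offset, velocity
    return (0x90 + channel, note + color, velocity)

print(color_note_on(13, 36, GREEN))  # (157, 108, 127)
```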
Now I have the means to make any button light up in any of the three colors! All I need to do is change this:

Code: Select all

self.mixer.channel_strip(0).set_mute_button(ButtonElement(True, 0, 13, 36))
Into this:

Code: Select all

self.mixer.channel_strip(0).set_mute_button(K1ButtonElement(True, 0, 13, 36, GREEN)) # or RED, or AMBER
Next step: setting up the basic track controls!
Last edited by wayfinder on Tue Mar 02, 2021 12:29 pm, edited 1 time in total.

jbone1313
Posts: 578
Joined: Wed Dec 12, 2007 2:44 am

Re: Project thread: Building an ultra-portable performance set, and my experience with Remote Scripting in Live 10

Post by jbone1313 » Sat Feb 27, 2021 11:24 pm

I promise to not derail this thread with any sidebars, but I did want to share some thoughts (and get any feedback on the same) on the hour or so I have spent on this thread and the linked documentation.

It seems like the _Framework scripts are a wrapper of a subset of the underlying API, and it seems they are mostly limited to control surface stuff.

The first thing I started digging into is whether and how I could create a script that would allow me to change device parameter values based on MIDI notes coming from a MIDI port. From what I can tell, that is not really possible using _Framework.

(Side note: you might be wondering why I do not just use MIDI mapping. If so, the reason is I want to do something where I change the parameter values for ALL devices matching a specific name. More specifically, I want to have Beat Repeat devices on all my tracks, which can all be changed by the same MIDI note. You might be wondering why I need them on all my tracks. The reason is the dreaded PDC issue with tempo-based Live devices; for those devices to play in time, they must precede any latency-inducing devices. Hence, placing them on a bus is no good.)

Then, I started digging into the API documentation. It seems pretty straightforward; however, I started bumping into my lack of Python knowledge pretty quickly. The list of methods and properties is all there. But, without decent examples, it seems like a much bigger time investment to learn the basics of Python, even with my experience with C#.

Given what wayfinder said about being "unfamiliar with many of even the most basic programming concepts," it is pretty impressive how he or she has made it this far. Wayfinder, with that attitude, you would be an asset to any dev team lol.

The Rabbits
Posts: 132
Joined: Tue Apr 07, 2020 2:23 pm

Re: Project thread: Building an ultra-portable performance set, and my experience with Remote Scripting in Live 10

Post by The Rabbits » Sun Feb 28, 2021 1:19 am

What you're talking about is definitely possible, but it's something of a puzzle! I'm a beginner with Python as well and the initial learning curve is pretty steep.

I learnt/am learning a lot by looking at a script called Selected Track Control. It's open source, so there's no problem using it as a learning tool. It already does most of the things I listed in my post but not quite the way I want.

jbone1313
Posts: 578
Joined: Wed Dec 12, 2007 2:44 am

Re: Project thread: Building an ultra-portable performance set, and my experience with Remote Scripting in Live 10

Post by jbone1313 » Sun Feb 28, 2021 2:23 am

The Rabbits wrote:
Sun Feb 28, 2021 1:19 am
I learnt/am learning a lot by looking at a script called Selected Track Control. It's open source, so there's no problem using it as a learning tool. It already does most of the things I listed in my post but not quite the way I want.
Whoah! That is really helpful!

I just had a look at those scripts. Seeing those is like “cooking with gas” as they say.

Here is a link for the convenience of anyone else reading.

http://stc.wiffbi.com/download/

wayfinder
Posts: 176
Joined: Tue Jul 28, 2009 4:01 pm

Re: Project thread: Building an ultra-portable performance set, and my experience with Remote Scripting in Live 10

Post by wayfinder » Sun Feb 28, 2021 6:53 am

jbone1313 wrote:
Sat Feb 27, 2021 11:24 pm
It seems like the _Framework scripts are a wrapper of a subset of the underlying API, and it seems they are mostly limited to control surface stuff.
I think it goes beyond that. I mean, yes, it is a subset, and much of it is about control surfaces, but it contains a lot of ways to inspect and control Live devices. Check out the API docs: https://structure-void.com/AbletonLiveR ... Framework/ and look for anything with "Component" in the name - you'll see a bunch of properties you can use to query things like whether the play status of a Clip has changed. Also, there are some utilities for working with Live's data structures more easily.
jbone1313 wrote:
Sat Feb 27, 2021 11:24 pm
The first thing I started digging into is whether and how I could create a script that would allow me to change device parameter values based on MIDI notes coming from a MIDI port. From what I can tell, that is not really possible using _Framework.

(Side note: you might be wondering why I do not just use MIDI mapping. If so, the reason is I want to do something where I change the parameter values for ALL devices matching a specific name. More specifically, I want to have Beat Repeat devices on all my tracks, which can all be changed by the same MIDI note. You might be wondering why I need them on all my tracks. The reason is the dreaded PDC issue with tempo-based Live devices; for those devices to play in time, they must precede any latency-inducing devices. Hence, placing them on a bus is no good.)

Then, I started digging into the API documentation. It seems pretty straightforward; however, I started bumping into my lack of Python knowledge pretty quickly. The list of list of methods and properties is all there. But, without decent examples, it seems like a much bigger time investment to learn the basics of Python, even with my experience with C#.
That absolutely sounds like something that should be possible with these scripts. Specifically, I think you could write something to iterate over an enumeration of all devices and, when their names match your criteria, change something in them, using DeviceComponent perhaps? Another free script to learn from is ClyphX: https://github.com/ldrolez/clyphx-live10. Perhaps that has something that'll be useful to you? I just browsed the manual though, and it looked like it does a LOT, like, A LOT of things, so it might be difficult to find one thing specifically.
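To make that concrete, here's a rough sketch of the enumeration. In a real script the tracks would come from self.song().tracks in the Live API, and the devices and parameters would be Live objects; the dummy classes below are just stand-ins so the walking logic can be shown (and tried) on its own:

```python
# Stand-ins for Live API objects (Track.devices, Device.name,
# Device.parameters, DeviceParameter.value) - illustration only.
class _Param:
    def __init__(self, name, value=0.0):
        self.name, self.value = name, value

class _Device:
    def __init__(self, name, parameters):
        self.name, self.parameters = name, parameters

class _Track:
    def __init__(self, devices):
        self.devices = devices

def set_matching_device_param(tracks, device_name, param_name, new_value):
    """Walk every device on every track; on each device whose name matches,
    set the named parameter. Returns how many parameters were changed."""
    changed = 0
    for track in tracks:
        for device in track.devices:
            if device.name != device_name:
                continue
            for param in device.parameters:
                if param.name == param_name:
                    param.value = new_value
                    changed += 1
    return changed

# two tracks, each with a Beat Repeat, one with another device as well
tracks = [
    _Track([_Device("Beat Repeat", [_Param("Grid")]),
            _Device("EQ Eight", [_Param("Gain")])]),
    _Track([_Device("Beat Repeat", [_Param("Grid")])]),
]
print(set_matching_device_param(tracks, "Beat Repeat", "Grid", 0.5))  # 2
```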
jbone1313 wrote:
Sat Feb 27, 2021 11:24 pm
Given what wayfinder said about being "unfamiliar with many of even the most basic programming concepts," it is pretty impressive how he or she has made it this far. Wayfinder, with that attitude, you would be an asset to any dev team lol.
Thank you :)

jbone1313
Posts: 578
Joined: Wed Dec 12, 2007 2:44 am

Re: Project thread: Building an ultra-portable performance set, and my experience with Remote Scripting in Live 10

Post by jbone1313 » Sun Feb 28, 2021 3:22 pm

Thanks wayfinder!

I forgot the OG ClyphX is open. I just glanced at it and it is gold! I probably will not be able to resist trying this.

One thing that stood out to me is that ClyphX seems to use basic loops to iterate through and set parameters. Yesterday, I was thinking some special parallel handling would be needed to set parameters of multiple devices at the same time (given it is Beat Repeat and timing matters), but now I am kind of thinking basic loops will do.

Yeah, the first thing I did was look closely at DeviceComponent in _Framework, and I did not see much about dealing with specific parameters and devices. But, that is ok. The underlying API is not too bad.

The Hello World and other examples you provided have been really helpful.

Thanks again!

wayfinder
Posts: 176
Joined: Tue Jul 28, 2009 4:01 pm

Re: Project thread: Building an ultra-portable performance set, and my experience with Remote Scripting in Live 10

Post by wayfinder » Sun Feb 28, 2021 9:34 pm

Good luck mate, glad to be of help!

Image

Here's what's been happening on my end:

The Basic Live Project Setup

I created four groups of three tracks each, to later contain audio clips. My plan is to use group 1 for drums, group 2 for bass & background sounds, group 3 for hooks, and group 4 for fx and atmospheres. I wanted to have some options for blending between multiple clips so three tracks per group seemed good, and it fit with the number of controls. Next I assigned the four faders on my controller to the four group volumes, and the three pots per lane to the volumes for the three tracks in each group. So far, so good.

Image
Image

(imagine I drew lines between the things if you like)

It turned out that with all the mapping going on, it made sense to set up some variables to help with keeping things more readable, so I put down some names for the channel (which also makes it easier to change that, should the need arise) and the controller definitions. I also imported a few variable names from Live:

Code: Select all

# this file contains a few standard variable names like MIDI_NOTE_TYPE

from _Framework.InputControlElement import *


# these are my own variables

CHANNEL = 13
POTS1_CCS = [4, 5, 6, 7]
POTS2_CCS = [8, 9, 10, 11]
POTS3_CCS = [12, 13, 14, 15]
FADER_CCS = [16, 17, 18, 19]
ENC_CCS = [0, 1, 2, 3]
ENCPUSH0 = [52, 53, 54, 55]
SWITCHES1 = [48, 49, 50, 51]
SWITCHES2 = [44, 45, 46, 47]
SWITCHES3 = [40, 41, 42, 43]
SWITCHESA = [36, 37, 38, 39]
SWITCHESE = [32, 33, 34, 35]
SWITCHESI = [28, 29, 30, 31]
SWITCHESM = [24, 25, 26, 27]
LEFTSHIFT = 12
RIGHTSHIFT = 15
ENCLEFT_CC = 20
ENCRIGHT_CC = 21
ENCLEFT_PUSH = 13
ENCRIGHT_PUSH = 14
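With this many hand-typed numbers, a cheap sanity check that nothing is assigned twice seemed worth it. This is just a standalone snity snippet using the values above (the CC/note split is my own reading of the layout: pots, faders and encoders send CCs, switches and pushes send notes):

```python
# CC numbers from POTS1-3, FADER, ENC, and the two encoder turn CCs
CCS = ([4, 5, 6, 7] + [8, 9, 10, 11] + [12, 13, 14, 15] +
       [16, 17, 18, 19] + [0, 1, 2, 3] + [20, 21])

# note numbers: all switch rows plus encoder pushes (24..55),
# then the shift buttons and the left/right encoder pushes
NOTES = list(range(24, 56)) + [12, 15, 13, 14]

assert len(CCS) == len(set(CCS)), "duplicate CC number"
assert len(NOTES) == len(set(NOTES)), "duplicate note number"
print("no duplicates")
```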
Since I was going to use faders and rotary encoders, I needed to import the code for those:

Code: Select all

from _Framework.SliderElement import SliderElement
from _Framework.EncoderElement import EncoderElement
And here's how I created the mappings:

Code: Select all

            # had to expand the mixer to handle 16 tracks
            
            self.mixer = MixerComponent(16)
            
            
            # this sets up a loop where everything in it is run four times,
            # with the value of the variable i being 0 in the first run, 1 in
            # the second, and so on
            
            for i in range(4):
            
                # this assigns the faders to the group volumes (tracks 0, 4, 8 and 12)
                # the CC values are defined as entries 0-3 in an array up in the variables
                # section so I can do all four of them with this one command
                
                self.mixer.channel_strip(i*4).set_volume_control(SliderElement(MIDI_CC_TYPE, CHANNEL, FADER_CCS[i]))


                # the pots to control the audio tracks are set up in the same way
               
                self.mixer.channel_strip(i*4+1).set_volume_control(EncoderElement(MIDI_CC_TYPE, CHANNEL, POTS3_CCS[i], Live.MidiMap.MapMode.absolute))

                self.mixer.channel_strip(i*4+2).set_volume_control(EncoderElement(MIDI_CC_TYPE, CHANNEL, POTS2_CCS[i], Live.MidiMap.MapMode.absolute))

                self.mixer.channel_strip(i*4+3).set_volume_control(EncoderElement(MIDI_CC_TYPE, CHANNEL, POTS1_CCS[i], Live.MidiMap.MapMode.absolute))
That works beautifully, and I can start mixing clips in realtime with my controller. But how the heck do I control their playback from it? Ahh, good question. It's ...

The Session and the Button Matrix

Enter the red box! This is a section of the Session View that can be moved around, and the controls I assign to it will affect the clip slots inside it, wherever it is. Its internal name in Live is a "session". It will be controlled by an array of buttons called a Button Matrix (which I can choose freely - I'm not restricted to buttons in an actual matrix formation on the controller, or any special kind of button; it's just regular buttons).

It's surprisingly straightforward to set up! First, more imports:

Code: Select all

from _Framework.SessionComponent import SessionComponent
from _Framework.ButtonMatrixElement import ButtonMatrixElement
I want a section that uses the full width of my 16 tracks and is just one clip in height.

Code: Select all

            self.session = SessionComponent(16, 1)
            
            
            # this next line assigns an object that will be responsible for actually
            # drawing the red box. from how everybody else does it, it seems
            # appropriate for this to be the session itself, and I did not test anything else
            
            self.set_highlighting_session_component(self.session)
            
            
            # this next line is necessary because the session usually only gets drawn
            # when anything in it changes, and on startup, nothing has so far. so
            # I force it
            
            self.session.update()
And voilà:

Image

The buttons that are going to launch clips inside the red box are going to be the ones under the knobs and faders that control the track's volume. I'm setting up a new variable with the correct Button identifiers in the correct order, and also an array for their colors - you may have noticed that the first track in every group has a red title, the second is yellow, and the third green. I want to mirror this on the controller to make it easier to see what's where. Going from horizontal to vertical is otherwise probably a bit distracting, but I hope this will help. Here's how the two arrays look:

Code: Select all

SWITCHMATRIX = [36, 40, 44, 48, 37, 41, 45, 49, 38, 42, 46, 50, 39, 43, 47, 51]
SWITCHCOLOR = [RED, RED, AMBER, GREEN]
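As it happens, the SWITCHMATRIX numbers follow a regular pattern (36, plus the controller column, plus four times the vertical position within the group), so the table could also be generated instead of typed out. A quick check that the formula matches the hand-typed list:

```python
# hand-typed table from above
SWITCHMATRIX = [36, 40, 44, 48, 37, 41, 45, 49, 38, 42, 46, 50, 39, 43, 47, 51]

# note = 36 + controller column (col // 4) + 4 * row within the group (col % 4)
generated = [36 + (col // 4) + 4 * (col % 4) for col in range(16)]

print(generated == SWITCHMATRIX)  # True
```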
Now I'm ready to set up the ButtonMatrix and connect it to the session clip slots!

Code: Select all

            # creating the object to contain the Matrix
            
            self.matrix = ButtonMatrixElement()
            
            
            # The underlying structure of the matrix is a 2-dimensional array,
            # with rows containing columns. I will have just the one row, starting
            # with an empty container
            
            button_row = []
            
            for column in range(16):
                
                
                # these button definitions work just like the test button I had to mute
                # a track. for the identifier, I pick the number from the array I set up above,
                # and the color takes the same 4 values in sequence over and over
                
                mbutton = K1ButtonElement(False, MIDI_NOTE_TYPE, CHANNEL, SWITCHMATRIX[column], SWITCHCOLOR[column % 4])
                
                
                # each of these buttons is then added to the row
                
                button_row.append(mbutton)
                
                
                # next, I choose the clip slot that the current button will control. it's going
                # to be in the first (and only) row, number 0, and its horizontal position
                # is the current column
                
                slot = self.session.scene(0).clip_slot(column)
                
                
                # then I assign the button to launch the clip in that slot
                
                slot.set_launch_button(mbutton)
                
            
            # when all the buttons are added to the row, the row can be added
            # to the matrix. a tuple is essentially a list whose contents can't
            # be changed after creation; add_row expects one, so the list of
            # buttons gets converted on the way in.
            
            self.matrix.add_row(tuple(button_row))
Alright! Now I have a red box and I can launch clips inside it. But how do I move the box? It was very easy to set up buttons to move the box, and I implemented that to test the functionality. But I wanted one of the endless rotary encoders to scroll the box, because having two separate buttons to move it was clumsy and would not have worked well with the rest of the control scheme. This needed to be snappy! And so I began a long, loong, loooong search for how to do this thing, on this controller. I tried a whole lot of different things that just would. not. work. The encoders send CC messages, but the whole button-based infrastructure expects note values! And I could not figure out how to change that. Then I tried, out of desperation, to declare the encoder as a button, whaaat?? That actually worked, but I couldn't get one encoder to control both up and down movement. I did some more research, and found some forum threads with people who had, maddeningly, solved the problem but not provided the solution. But their remarks set me on the right path eventually (in what must have been the 5th different approach):

Value Listeners

It's possible to set up a function that will fire whenever a certain thing happens. In my case, I needed to delete all the button-based controls for moving the red box, and instead listen for changes in the rotary encoder to then shift the box myself. I used a midi monitor to figure out what the encoder actually sends when it's being moved (a factory Kontakt script did the trick for me without having to leave Live to free the device for a standalone or web-based monitor), and then translated it to my desired behavior. Here's how I eventually made it work (and that was a rush of nice brain chemicals to see it in action after a day of bashing my head against it):

Code: Select all

            # this is the control I want to use
            
            encr = EncoderElement(MIDI_CC_TYPE, CHANNEL, ENCRIGHT_CC, Live.MidiMap.MapMode.absolute)


            # here I'm telling Live to call a certain function when the
            # value of the control changes
            
            encr.add_value_listener(self.encr_moved)
            
    
    # this function is part of my overarching K1 object, I had
    # to be careful with the indentation. it receives the new value
    # as a parameter
    
    def encr_moved(self, value):
        
        
        # turning the knob to the right produces a 1, to the left a 127,
        # so i mapped the two to paging down and up. this works for
        # my case, because I have just the one row - paging moves as
        # many rows up or down as the height of the box is. 
        
        if value == 1:  # "==" rather than "is": identity comparison on ints is unreliable
            self.session._scroll_page_down()
        else:
            self.session._scroll_page_up()
And it works! I did an actual real-life fist pump and then scrolled the red box up and down like a madman for a few minutes. It's so quick now, I love it.
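As an aside: my K1's encoders only ever seem to send 1 or 127, but many relative encoders encode fast twists too, as a signed 7-bit value (1..63 for clockwise ticks, 127..65 for -1..-63). A more general decode would look like this; decode_relative is my own helper name, not anything from _Framework, and I haven't verified that the K1 ever sends magnitudes above 1:

Code: Select all

```python
def decode_relative(value):
    """Interpret a 7-bit relative CC value as a signed step count.

    Two's-complement-style convention: 1..63 mean that many clockwise
    ticks, 127..65 mean -1..-63 counterclockwise ticks.
    """
    return value - 128 if value >= 64 else value
```

With that, the listener could compute step = decode_relative(value) once and then page down for positive steps and up for negative ones, as many times as the magnitude says.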

However... there's always more. Turns out I am not happy with the ButtonMatrix yet. For one thing, the buttons didn't take their colors! Or rather, they did (I checked that the function that sets the correct one is actually being called, and it is!), but then they all immediately turned red, too fast for the right color even to flash. So that's something I'll have to address. Also, and this might be the more consequential issue, currently the buttons are lit when a clip is playing in that slot and unlit when there isn't, which lumps together two very different cases: no clip at all, and a stopped clip. I would very much like the buttons to be off only when the slot is empty, lit when there is a stopped clip, and then ・゚: *✧・゚:* blinking with the beat *:・゚✧*:・゚ when the clip is playing. This is one thing I'm absolutely not sure I can pull off. Time Will Tell (if you wanna know how fast someone can shoot an apple off their kid's head)
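For the record, the state logic itself is the easy part; it's wiring it into the LED updates that I'm unsure about. A sketch as a pure function (LED_OFF/LED_ON/LED_BLINK are made-up placeholder values, and the actual blinking would still need a timer task or the beat clock on top of this):

Code: Select all

```python
# Hypothetical LED states - the real values depend on the controller.
LED_OFF, LED_ON, LED_BLINK = 0, 1, 2

def led_state_for_slot(has_clip, is_playing):
    """Map a clip slot's status to the behavior I'm after:
    empty slot -> off, stopped clip -> lit, playing clip -> blinking."""
    if not has_clip:
        return LED_OFF
    return LED_BLINK if is_playing else LED_ON
```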
Last edited by wayfinder on Tue Mar 02, 2021 12:30 pm, edited 1 time in total.

The Rabbits
Posts: 132
Joined: Tue Apr 07, 2020 2:23 pm

Re: Project thread: Building an ultra-portable performance set, and my experience with Remote Scripting in Live 10

Post by The Rabbits » Mon Mar 01, 2021 2:45 am

Great work. I hadn't got into the framework that far. Lots to learn.

wayfinder
Posts: 176
Joined: Tue Jul 28, 2009 4:01 pm

Re: Project thread: Building an ultra-portable performance set, and my experience with Remote Scripting in Live 10

Post by wayfinder » Mon Mar 01, 2021 10:49 am

The Rabbits wrote:
Mon Mar 01, 2021 2:45 am
Great work. I hadn't got into the framework that far. Lots to learn.
Thank you! I hope you can also have some fun learning new stuff, I know I do.

Now, remember when I said I would definitely never ever eat those words earlier? Turns out...


Fixing the LED colors

That's right! When I first set up the colors, I did it in a way that the red-box functionality couldn't work with. So I spent some more time digging through the code and found out what had gone wrong. I had to update my K1ButtonElement and override a different function. The change is even smaller than the earlier override, and it feels more elegant, too. Instead of sending the wrong value and then the right one in the turn_on wrapper function, I ended up amending the function that actually assembles the MIDI message so that it takes the color into account, like so (the arrow marks the only place it differs from the original):

Code: Select all

class K1ButtonElement(ButtonElement):
    def __init__(self, is_momentary, msg_type, channel, identifier, color = RED):
        ButtonElement.__init__(self, is_momentary, msg_type, channel, identifier)
        self.color = color
    def _do_send_value(self, value, channel = None):
        data_byte1 = self._original_identifier + self.color # <---------------
        data_byte2 = value
        status_byte = self._status_byte(self._original_channel if channel is None else channel)
        if self._msg_type == MIDI_PB_TYPE:
            data_byte1 = value & 127
            data_byte2 = value >> 7 & 127
        if self.send_midi((status_byte, data_byte1, data_byte2)):
            self._last_sent_message = (value, channel)
            if self._report_output:
                is_input = True
                self._report_value(value, not is_input) 
Now, while the functionality remains to be adapted to my wishes, at least the LEDs all have the correct colors.
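In case the offset trick is unclear: the tri-color pads apparently listen on several note numbers, one per color, so "adding the color" really means shifting the note. The offsets below are what I've seen in K2 scripts and are an assumption for the K1, so check against your own unit; led_note_for is just an illustrative helper:

Code: Select all

```python
# Assumed color offsets (borrowed from K2 remote scripts; unverified for the K1).
RED, AMBER, GREEN = 0, 36, 72

def led_note_for(base_note, color):
    """Note number to send so the pad at base_note lights in the given color."""
    return base_note + color
```

Under those assumptions, a pad whose base note is 36 would get note 108 to light up green.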

Better Debugging Tools

Two things helped me figure out a bunch of stuff much faster: First, I upgraded my Log.txt window to a real log viewer (an extension for VSCode), which means I don't have to reload and scroll down every time I want to check for errors or debug messages in there anymore. The log viewer keeps updating the log in real time and always shows the bottom of the file, as long as I don't scroll up on purpose, so I get to see everything as it happens.
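For anyone without VSCode: the core of such a viewer is just polling the file for bytes past the last known offset. A minimal sketch of that one polling step (the path in the usage comment is a placeholder; Live writes Log.txt into its Preferences folder):

Code: Select all

```python
def read_new(path, offset):
    """Return (text appended since offset, new offset) - one polling
    step of a tail -f style log follower."""
    with open(path, "r") as f:
        f.seek(offset)
        text = f.read()
        return text, f.tell()

# Usage sketch: poll in a loop, printing whatever is new.
#   offset = 0
#   while True:
#       text, offset = read_new("Log.txt", offset)
#       if text:
#           print(text, end="")
#       time.sleep(0.5)
```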

The other thing was that I set up a global logging function in the script, which let me sidestep Scope Hell, where everything becomes a matter of figuring out how to access the object that owns the log_message function from inside some button or slider. I got the method from macfergus' K2 script, and I thought it was very clever. It works by first declaring an empty variable that will later point to the actual logging function. It can't be pointed at the function directly, because that doesn't yet exist when the globals are set up! Then he defines a wrapper function right there at module level (a plain top-level function, which, like a global, is accessible from everywhere in the script) that takes the message he wants to log and passes it on to the actual logging function (which, remember, we still don't have at this point, so there needs to be a check that only forwards the message once the function is actually available). This code goes right under the imports:

Code: Select all

logger = None

def log(msg):
    global logger          # not strictly required just to read a global, but it makes the intent explicit
    if logger is not None:
        logger(msg)
And then, later, when we have created the object that actually owns the log_message function (which is our main object, K1), it's time to fill the global variable with a reference to it, like so (this is just the very start of the function):

Code: Select all

class K1(ControlSurface):
    def __init__(self, instance):
        super(K1, self).__init__(instance, False)
        with self.component_guard():
            global logger
            logger = self.log_message # note: no parentheses, because we're not calling the function, just storing a reference to it
With this setup, I can now, from anywhere in the script, regardless of where in the scope I am, go "log(message)" and it will do that without complaint. Really useful tool to have.
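Stripped of the Live specifics, the whole pattern is small enough to play with on its own; here I late-bind the global to a list's append instead of to log_message, just to show the mechanics:

Code: Select all

```python
logger = None   # will later point at the real logging function

def log(msg):
    # reading a global needs no "global" statement; only assignment does
    if logger is not None:
        logger(msg)

messages = []
log("too early")          # silently dropped: nothing bound yet
logger = messages.append  # late-bind, analogous to logger = self.log_message
log("hello")              # now forwarded to messages
```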
Last edited by wayfinder on Tue Mar 02, 2021 12:30 pm, edited 1 time in total.
