I work on the BoseAR WebSDK, and I was recently able to access the sensor/gesture data in Max using Node for Max over WebSockets.
From there I was able to access the BoseAR data in Ableton Live 10 using Max for Live, where I have a number box for each sensor stream (acceleration XYZ, gyroscope XYZ, rotation yaw/pitch/roll, etc.) and bangs for the basic gestures (double tap, head nod, head shake).
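For anyone curious how that bridge works, here's a minimal sketch of the Node for Max side. This is an illustrative assumption, not my exact patch: it assumes the WebSDK sends JSON frames shaped like `{"sensor":"rotation","yaw":...,"pitch":...,"roll":...}`, and it uses the `ws` npm package plus the `max-api` module that `node.script` provides.

```javascript
// Hypothetical relay: Bose AR frames in over a WebSocket, Max lists out.
// Assumed frame format: {"sensor":"rotation","yaw":0.1,"pitch":0.2,"roll":0.3}

// Flatten one JSON frame into a Max-friendly list: [sensorName, v1, v2, ...].
function frameToList(frame) {
  const { sensor, ...values } = JSON.parse(frame);
  return [sensor, ...Object.values(values)];
}

// Inside Node for Max, the wiring might look like this
// (requires `npm install ws`; max-api is built into node.script):
function startRelay() {
  const Max = require('max-api');
  const WebSocket = require('ws');
  const wss = new WebSocket.Server({ port: 8080 });
  wss.on('connection', (socket) => {
    socket.on('message', (msg) => {
      // e.g. sends "rotation 0.1 0.2 0.3" out the node.script outlet,
      // ready to route to number boxes with [route rotation accel ...]
      Max.outlet(...frameToList(msg.toString()));
    });
  });
}

module.exports = { frameToList, startRelay };
```

On the Max side you can then split the stream with a `route` object keyed on the sensor name.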
I can record automation from the various sensor data streams, but can I use that automation to drive other parameters (e.g. map the yaw to track panning) without manually building a dedicated Max for Live "pan effect" patch? I'm new to Ableton Live, so I was curious whether this is possible.
You can check out a video demonstration here.