Tom Swirly
12-14-2015, 10:06 AM
Greetings, fellow Reapees!
This post will be fairly long, as I have a substantial task at hand - I want both to get your advice and to "talk out" my problem.
I finally realized that Ableton was never going to fix the bugs that I and others have been complaining about for years, so I decided to jump ship to Reaper.
I have a pretty large library of Javascript for Max/Max For Live that I wrote for my solo music show, where I play my songs and a few covers with DMX lighting controlled algorithmically from an OS X machine - it's here: https://github.com/rec/swirly
I started using Javascript to do this because programming with boxes and wires in Max makes building large, stable programs very difficult. However, I see that Reaper preferentially supports C++ and Python, languages I know intimately and prefer (JS is a pretty neat language, mind you).
I got to a 1.0 state in the middle of this year where I could do shows, and did a bunch of them. Then I started a rewrite to handle longer and more complex shows - but version 2.0 got only half done before I remembered how frustrating Ableton is - see here:
https://docs.google.com/document/d/1sh7fG1cVtjVb5Ml5nYXeew9ckZYqRXApdLqSoU8oaSw/edit
So it seems to me I have the following possibilities:
1. Finish version 2.0 in Javascript in Max, and communicate with Reaper using OSC.
2. Port the Javascript to run with Reaper.
3. Port to Reaper as Python.
4. Port to Reaper as C++.
5. Port to Reaper as C++ and Python (either separate modules or using Cython).
---
Now (thanks for listening) let me tell you more about what I'm doing.
I have a repertory of songs (spacey weird pop songs, generally cheerful?) which I play solo. The computer sits where I cannot see it, and a show is arranged completely in advance from my collection of songs (actually, due to the limitations of Ableton, I generally ended up playing exactly the same show every time).
I own a bunch of controllers but at this time I use exactly one - a Yamaha WX-7 electronic wind controller which has a dedicated hardware sound generator and also sends MIDI to the computer.
I also have a microphone and I sing.
I have a small number of DMX lights and lasers that are synchronized to the music, but also triggered by my playing. The WX-7 sends MIDI notes, breath control, pitch bend (effectively only 6-bit precision, what can you do?) and program changes 1 through 5. A typical setting might be "note controls color, breath controls brightness".
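To make that concrete, the mappings are about this complicated - a minimal Python sketch, where the function and the hue scheme are made up for illustration, not taken from my codebase:
[CODE]
# Hypothetical sketch: map a MIDI note and breath value (CC2)
# to an RGB triple for a DMX fixture.
import colorsys

def note_and_breath_to_rgb(note, breath):
    hue = (note % 12) / 12.0      # note controls color
    brightness = breath / 127.0   # breath controls brightness
    r, g, b = colorsys.hsv_to_rgb(hue, 1.0, brightness)
    return [int(c * 255) for c in (r, g, b)]  # 8-bit DMX channel values
[/CODE]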
While you don't want the lights to lag _too_ far behind the audio, synchronization is really not an issue. I've done experiments where I deliberately introduced a lot of lag and jitter into the system and it was hard to notice unless I really turned it up. I've never had an issue with that.
The actual Javascript algorithms that map the MIDI to the DMX are really pretty primitive. More of the smarts go into making the documents describing the scenes easy to read, and I have a mechanism for describing lighting instruments with documents like this one: https://github.com/rec/swirly/blob/master/data/lights/laser/definition.json (yes, I know it's broken at the end - you can see the spot where I stopped and said, "I can't work with Ableton any more...")
And the key part is of course sequencing this to the music. I currently have little JSON documents describing the state at each scene, and then a complex Rube Goldberg mechanism of sends to and from multiple other Max For Live objects and the IAC bus to get around various issues described in that document. I could keep the JSON and reimplement everything around it - perhaps with a lot less work.
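If I did keep the JSON, the replay side could be very small in Python. A sketch with a made-up schema, just to show the shape (my real documents look different):
[CODE]
# Hypothetical sketch: load scene documents and pick the active one
# as playback time advances. Assumes scenes are sorted by start time.
import json

def load_scenes(filename):
    # e.g. [{"time": 0.0, "channels": {"1": 255, "2": 128}}, ...]
    with open(filename) as f:
        return json.load(f)

def scene_at(scenes, t):
    """Return the last scene whose start time is <= t."""
    current = scenes[0]
    for scene in scenes:
        if scene["time"] > t:
            break
        current = scene
    return current
[/CODE]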
---
So back to options 1 through 5. If I were advising someone else, I'd say, "Keep all the code you already have in Javascript".
But, well, I like to code. A lot of that Javascript was itself ported from my own real-time Python libraries (https://github.com/rec/echomesh), particularly the color code. I have a hand-rolled unit test system in Javascript, but it's only sort of effective for what I'm doing. I'd love to throw that away.
Most of the Javascript is there to get around the deficiencies of the Ableton/Max For Live ecosystem (the Love Canal of the DAW world). I'd have to throw that away anyway.
The one real issue is that I only have a Max driver for my DMX interface. Now, this is a DMX USB Pro compatible unit, so someone might very well have already written something I can use. Or I could have a tiny standalone Max patch that simply receives OSC and sends it to DMX. It's one more moving part, but sometimes you can't avoid moving parts.
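Actually, since the widget protocol for these units is published, talking to the interface directly from Python might only take a handful of lines. A sketch, assuming pyserial, with the device path as a placeholder:
[CODE]
# Hypothetical sketch: send one DMX frame to a DMX USB Pro compatible
# widget over its serial protocol (0x7E header, label, length, 0xE7).
import serial

SEND_DMX = 6  # widget label for "Output Only Send DMX Packet"

def send_dmx(port, channels):
    # channels: up to 512 ints in 0..255; the leading 0 is the DMX start code
    data = bytes([0]) + bytes(channels)
    n = len(data)
    port.write(bytes([0x7E, SEND_DMX, n & 0xFF, n >> 8]) + data + bytes([0xE7]))

port = serial.Serial('/dev/tty.usbserial-XXXX')  # device path is a guess
send_dmx(port, [255, 128, 0])
[/CODE]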
---
So _if_ I can do this entirely in Python, then I think that's the way to go, unless there is some overwhelming reason that either C++ or Javascript is a more effective solution. Don't get me wrong - I love C++, it's what I do for a living - but I can develop significantly faster in Python than in C++ (though C++11 has definitely evened the score somewhat).
So is this attainable in Python - reading a score and scenes from JSON files, and then sending out OSC as Reaper plays? Or is there some reason that doing it in C++ or JS would be better?
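For what it's worth, the OSC output itself shouldn't be the hard part - here's a dependency-free sketch, with the address and port as placeholders for whatever ends up listening:
[CODE]
# Hypothetical sketch: build and send a minimal OSC message carrying
# one float argument over UDP, using only the standard library.
import socket, struct

def osc_message(address, value):
    def pad(b):  # OSC strings are null-terminated and padded to 4 bytes
        return b + b'\x00' * (4 - len(b) % 4)
    return pad(address.encode()) + pad(b',f') + struct.pack('>f', value)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(osc_message('/light/brightness', 0.75), ('127.0.0.1', 9000))
[/CODE]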
Thanks again for listening to me work this out!