12-14-2015, 10:06 AM | #1 |
Human being with feelings
Join Date: Dec 2015
Posts: 6
|
Refugee from Ableton/Max-land wanting to port large JS codebase.
Greetings, fellow Reapees!
This post will be fairly long, as I have a fairly substantial task at hand - I both want to get your advice and "talk out" my problem. I finally realized that Ableton was never going to fix the bugs that I and others have been complaining about for years, so I decided to jump ship to Reaper.

I have a pretty large library of Javascript for Max/Max For Live that I wrote for my solo music show, where I play my songs and a few covers with DMX lighting controlled algorithmically from an OS X machine - it's here: https://github.com/rec/swirly

I started using Javascript to do this because programming with boxes and wires in Max makes generating large, stable programs very difficult - however, I see that Reaper preferentially supports C++ and Python, languages I know intimately and prefer (JS is a pretty neat language, mind you).

I got to a 1.0 state where I could do shows in the middle of this year, and did a bunch. Then I started to rewrite it to be able to do longer and more complex shows - but version 2.0 got only half-done before I remembered how frustrating Ableton is - see here: https://docs.google.com/document/d/1...qSoU8oaSw/edit

So it seems to me I have the following possibilities:

1. Finish version 2.0 in Javascript in Max, and communicate with Reaper using OSC.
2. Port the Javascript to run with Reaper.
3. Port to Reaper as Python.
4. Port to Reaper as C++.
5. Port to Reaper as C++ and Python (either separate modules or using Cython).

---

Now (thanks for listening) let me tell you more about what I'm doing. I have a repertory of songs (spacey weird pop songs, generally cheerful?) which I play solo. I have the computer where I cannot see it, and a show is pre-arranged completely in advance from my collection of songs (actually, due to the limitations of Ableton, I generally was playing exactly the same show every time).
I own a bunch of controllers, but at this time I use exactly one - a Yamaha WX-7 electronic wind controller, which has a dedicated hardware sound generator and also sends MIDI to the computer. I also have a microphone, and I sing.

I have a small number of DMX lights and lasers that are synchronized to the music, but also triggered by my playing. The WX-7 sends MIDI notes, breath control, pitch bend (effectively only 6-bit precision, what can you do?) and program changes 1 through 5. A typical setting might be "note controls color, breath controls brightness".

While you don't want the lights to lag _too_ far behind the audio, synchronization is really not an issue. I've done experiments where I deliberately introduced a lot of lag and jitter into the system, and it was hard to notice unless I really turned it up. I've never had an issue with that.

The actual Javascript algorithms to map the MIDI to the DMX are really pretty primitive. More smarts goes into making the documents describing the scenes easy to read, and I have a mechanism to describe lighting instruments with documents like this one: https://github.com/rec/swirly/blob/m...efinition.json (yes, I know it's broken at the end - you can see the spot where I stopped and said, "I can't work with Ableton any more...")

And the key part is of course sequencing this to the music. I currently have little JSON documents describing the state at each scene, and then a complex Rube Goldberg mechanism of sends to and from multiple other Max For Live objects and the IAC bus, to get around various issues described in that document. I could keep the JSON and reimplement the rest entirely - perhaps with a lot less work.

---

So back to options 1 through 5. If I were advising someone else, I'd say, "Keep all the code you already have in Javascript". But, well, I like to code. A lot of that Javascript was already ported from my own real-time Python libraries - https://github.com/rec/echomesh - particularly the color code.
I have a hand-rolled unit test system in Javascript, but it's only sort of effective for what I'm doing. I'd love to throw that away. Most of the Javascript is getting around the deficiencies of the Ableton/Max For Live ecosystem (the Love Canal of the DAW world); I'd have to throw that away anyway.

The one issue is that I only have a Max driver for my DMX interface. Now, this is a DMX USB Pro compatible unit, so someone might very well have already written something I can use. Or I could have a tiny standalone Max patch that simply receives OSC and sends it to DMX. It's one more moving part, but sometimes you can't avoid moving parts.

So _if_ I can do this entirely in Python, then I think that's the way to go - unless there is some overwhelming reason that either C++ or Javascript is a more effective solution. Don't get me wrong - I love C++, it's what I do for a living, but I can develop significantly faster in Python than in C++ (though C++11 has definitely evened the score somewhat).

So is this attainable in Python? Reading a score and scenes from JSON files and then sending out OSC as Reaper plays? Or is there some reason that doing it in C++ or JS would be better?

Thanks again for listening to me work this out!
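For concreteness, here is a minimal Python sketch of the kind of mapping described above ("note controls color, breath controls brightness"). The scene-JSON layout and all field names below are invented for illustration - they do not match the actual swirly documents:

```python
import json

# Hypothetical scene document; field names are invented for illustration
# and do not reflect the real swirly JSON schema.
SCENE = json.loads("""
{
  "name": "opener",
  "note_to_color": {"60": [255, 0, 0], "62": [0, 255, 0], "64": [0, 0, 255]},
  "default_color": [255, 255, 255]
}
""")

def midi_to_dmx(scene, note, breath):
    """Map a MIDI note plus a breath CC value (0-127) onto three
    DMX color channels (0-255 each)."""
    color = scene["note_to_color"].get(str(note), scene["default_color"])
    # Scale the 7-bit breath value to a 0.0-1.0 brightness factor.
    brightness = breath / 127.0
    return [int(round(c * brightness)) for c in color]

# Full-breath note 60 gives full-brightness red; an unmapped note
# with zero breath gives black.
red = midi_to_dmx(SCENE, 60, 127)
dark = midi_to_dmx(SCENE, 99, 0)
```

A scene list would then just be an array of such documents, selected by the sequencer as the show advances.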
12-14-2015, 10:21 AM | #2 |
Human being with feelings
Join Date: Feb 2007
Location: Oulu, Finland
Posts: 8,062
|
Your post is kind of long and would need to be addressed in detail... However, one thing to take note of immediately: Reaper's "JS" is not "JavaScript" but "JesuSonic", and they are completely different languages. Reaper doesn't have any built-in support for JavaScript.
__________________
I am no longer part of the REAPER community. Please don't contact me with any REAPER-related issues. |
12-14-2015, 12:51 PM | #3 |
Human being with feelings
Join Date: Dec 2015
Posts: 6
|
Excellent then. So much the better, it eliminates Javascript entirely! Thank you.
That sort of thing is really just what I'm looking for rather than an in-detail analysis, I know it's too long. :-) Any reason why Python might be problematic - features you can only get in C++, severe performance implications? |
12-14-2015, 01:00 PM | #4 | |
Human being with feelings
Join Date: Feb 2007
Location: Oulu, Finland
Posts: 8,062
|
Quote:
With C++ you can implement a subclass of Reaper's PCM_source that can result in more predictable and accurate timing, but even with that things might turn out tricky. (I've done such a subclass myself that was able to send OSC messages but I didn't develop that thing very far, it was just for some preliminary testing purposes.)
12-14-2015, 02:56 PM | #5 |
Human being with feelings
Join Date: Jun 2013
Location: Krefeld, Germany
Posts: 14,690
|
There are several points in your post I'd like to comment on.
1) Javascript language: Reaper features a built-in scripting language called "EEL" that is different from JavaScript but similar in concept (both are derived from C, but designed for a less "strict" programming style). So coming from JS, I suppose you will be up to speed with EEL in no time. Reaper uses EEL both for VST-like "JesuSonic" plugins (living in the tracks' effect chains alongside VSTs and other plugins, handling MIDI and audio in and out) and for "scripts" that influence the behavior of Reaper itself through an appropriate API. I did a number of EEL plugins for customizing the "live" use of Reaper, for myself (playing two master keyboards and a TEC "BBC" breath controller - I buried my old WX7 when I got <the brand new pre-release version of> the BBC) and for a friend who plays an EWI wind controller. I wrote a description of the set of EEL plugins I did for him -> http://www.bschnell.de/patch.pdf

2) Using Reaper "live": Here the free SWS "LiveConfigs" extension already provides great functionality without you needing to do any programming. Here is a revised manual for it -> http://www.bschnell.de/LiveConfigs_1.pdf. I mainly use Reaper for live playing with VST instruments and effects. Works great, awesome stability (Windows 7)!

3) DMX: Another friend of mine is about to release (sell) a little device called "ADMX/2" that (among other things) allows for attaching Reaper to DMX lighting equipment via EEL plugins that send out MIDI messages. The EEL plugins take (reprogrammed) Reaper "envelopes" or sound output from other tracks as input. (Of course MIDI output from tracks could be used as well.)

Please let me know if you have any specific questions on these issues.

-Michael

Last edited by mschnell; 12-14-2015 at 11:32 PM.
12-14-2015, 03:57 PM | #6 | |
Human being with feelings
Join Date: Jun 2013
Location: Krefeld, Germany
Posts: 14,690
|
Quote:
-Michael |
12-14-2015, 04:20 PM | #7 | |
Human being with feelings
Join Date: Feb 2007
Location: Oulu, Finland
Posts: 8,062
|
Quote:
12-14-2015, 05:44 PM | #8 |
Human being with feelings
Join Date: Apr 2009
Location: GWB
Posts: 76
|
Hey Swirly!
Good to hear from you here. You've gotten some good advice above. What I might add is to think about Lua, which duplicates JS's data-as-code aspect, enabling you to create a bunch of Lua data files that can be loaded and run - perhaps a simpler, better choice than Python. Look for the "ReaScript documentation" in the Help menu.

Investigate JesuSonic and/or EEL as M4L substitutes to do the MIDI fiddling. It may make sense to jettison the OSC for straight MIDI. (I'm not at all familiar with Reaper's OSC capabilities.)

Port, port, port!

'Appy 'Olidays!

Tad
12-14-2015, 11:19 PM | #9 | |
Human being with feelings
Join Date: Jun 2013
Location: Krefeld, Germany
Posts: 14,690
|
Quote:
But I suppose he should re-think, as for live performance JesuSonic plugins make a lot more sense - especially as he wants to control his setup with one or more MIDI controllers (WX7, ...). I do this all the time. In fact, I never tried to do a "Reaper Script".

I am really happy with what SWS LiveConfigs provides to control Reaper internals (muting/unmuting tracks, pushing parameters onto plugins, executing Reaper actions (and with this also ReaScript - I did not try this), ...). All of this is now controlled by MIDI CC messages.

Once you find out that you can generate/modify these MIDI messages in any track's effect chain and direct them towards LiveConfigs via the MidiToReaControl plugin, there is no need to bother with OSC or ReaScript (and the timing issues they raise) in a live-performance situation playing with MIDI controllers.

-Michael

Last edited by mschnell; 12-14-2015 at 11:34 PM.
12-15-2015, 02:44 AM | #10 |
Human being with feelings
Join Date: Jun 2013
Location: Krefeld, Germany
Posts: 14,690
|
BTW.:
The latest addition to my live setup (though not too specific to exactly this setup) is a small pure-MIDI JSFX plugin that, in a comfortable way, allows for transposing and adding a sub-octave, purely controlled by the MIDI input (via definable CCs from master keyboard switches). A rather simple thingy, but it might serve as a "getting started" example for realtime MIDI handling via JSFX for live usage - including somebody to ask questions about it. Of course there are lots of MIDI and audio JSFX examples to look at, coming with Reaper out of the box.

-Michael
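The plugin itself is written in JSFX/EEL; purely for illustration, the core transpose-plus-sub-octave note logic might be sketched like this in Python (the function name and signature are invented, not taken from the plugin):

```python
def process_note(note, transpose=0, add_sub_octave=False):
    """Return the list of MIDI note numbers to emit for one incoming
    note-on, applying a semitone transpose and optionally doubling
    the note one octave below. Notes outside MIDI's 0-127 range
    are dropped rather than wrapped."""
    out = []
    transposed = note + transpose
    if 0 <= transposed <= 127:
        out.append(transposed)
    if add_sub_octave and 0 <= transposed - 12 <= 127:
        out.append(transposed - 12)
    return out

# Middle C transposed up a whole tone, with sub-octave enabled,
# emits two notes; a note pushed below 0 emits nothing.
notes = process_note(60, transpose=2, add_sub_octave=True)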
12-15-2015, 03:18 AM | #11 |
Human being with feelings
Join Date: Mar 2009
Location: Dartmouth, Nova Scotia
Posts: 11,184
|
Some great advice here, I would add only one thing (especially since you say you like to code).
Whether to port or not: from a "production coding" viewpoint, "keep the code you already have" is certainly almost always true. But through more "R&D" glasses, and depending on overall code-base size, a port is also:

- A nice chance to redesign/refactor that section that has been driving you nuts, but that you just haven't found the time to fix.
- A chance to experiment with different ways (MIDI vs OSC, etc.) of implementing the previous design, with an eye toward better performance/stability/future upgrading - and, something that can matter in real time, harmonization with Reaper's soul, if you will. Reaper is extremely stable; however, some approaches to interacting with the outside world may work better than others.

Well, you know where the rest of this goes.
__________________
To install you need the CSI Software and Support Files For installation instructions and documentation see the Wiki Donate -- via PayPal to waddingtongeoff@gmail.com |
12-17-2015, 02:49 PM | #12 |
Human being with feelings
Join Date: Dec 2015
Posts: 6
|
Thanks for all the good ideas!
I still need to digest all this information.
I'm not entirely opposed to learning a new language, but, for example, I have already written my color manipulation routines in three languages - C++, Python and Javascript - and I'd love to avoid doing a fourth. I do appreciate making the code better, but the reason I am here is that I was spending too much time programming and not enough making music! :-)

Timing isn't TOO critical, as I said; I'm relying on the sequencer for the timing-critical stuff. "Score" was probably a bad choice of name - it's more like a list of "scenes", each of which remaps MIDI onto lighting. But I would like to be doing more stuff with LFOs and lighting.

There are two reasons for using OSC. One is just that there's a reasonable amount going on, and having human-readable messages is better - BUT I really haven't had a good system that used OSC before, and I managed to encode everything through MIDI. The other is more technical: MIDI is 7-bit and DMX is 8-bit. I've run into problems with this before, and it isn't just that you lose a bit of precision - it's that sometimes, if you have a MIDI->DMX converter, there are specific patches, settings or values that you need to hit exactly and cannot reach (though no instrument I own today has that issue).

So I have a plan - and my plan is (for a change) not to have a specific plan! I'm not going to try to port my existing show for a while. Instead, I'm going to start writing new material and then try to hook in the lighting gear according to my existing plans, coming fresh to Reaper with no preconceptions. This means I can experiment, fail several times (something I have no real option to do if I'm porting an existing codebase), and try things that I might not even want to use. At some point, I'll have an idea of "best practices" in Reaper, and can move forward.
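The 7-bit/8-bit mismatch is easy to demonstrate. If a converter expands MIDI CC values (0-127) to DMX (0-255) by simple scaling - one common choice, not any particular converter's actual rule - then exactly half of the DMX values can never be produced:

```python
def cc_to_dmx(cc):
    # One common 7-to-8-bit expansion: scale so that CC 127 -> DMX 255.
    return round(cc * 255 / 127)

# Each CC step advances DMX by roughly 2, so 128 inputs can only
# reach 128 of the 256 possible DMX values.
reachable = {cc_to_dmx(v) for v in range(128)}
missing = sorted(set(range(256)) - reachable)
```

So if a fixture needs, say, an exact channel value that falls in the `missing` set, no 7-bit CC can select it through this scaling - which is exactly the "cannot reach" problem described above.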
My theory is that I'll still end up with Max hanging around for quite a long time, if only because the super-standard, very solid driver for my specific DMX interface is in Max - and once that's there, I might as well use my existing work there. It might well be that much of the logic stays in Max, then. My synchronization wants are primitive: if Max simply got a message every beat from Reaper and used free-running LFOs, your eyes would never be able to detect a few ms of latency or jitter in there.

Thanks again - I'll keep you posted over the next few months. All of this will be open-source, so you can point and laugh and hopefully avoid my mistakes.

(* - for the programming geeks: I'm in a pretty happy state right now, because I'm using the same sort of functional-programming-with-side-effects style in all three of my languages - Python, C++ and Javascript - thanks to the miracle of C++11 and `std::function`...)
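The beat-message-plus-free-running-LFO idea can be sketched in a few lines of Python. This is a hypothetical illustration, not code from the project: the LFO re-anchors its phase each time a beat message arrives, so modest latency or jitter in the messages never accumulates:

```python
import math

class BeatSyncedLFO:
    """Free-running LFO whose phase is re-anchored by incoming beat
    messages (e.g. from Reaper via OSC), so timing error never drifts."""

    def __init__(self, cycles_per_beat=0.25):
        self.cycles_per_beat = cycles_per_beat
        self.beat_time = 0.0          # wall-clock time of the last beat message
        self.beat_number = 0
        self.seconds_per_beat = 0.5   # assume 120 BPM until told otherwise

    def on_beat(self, now, beat_number):
        """Called whenever a beat message arrives."""
        if beat_number > self.beat_number:
            elapsed = now - self.beat_time
            self.seconds_per_beat = elapsed / (beat_number - self.beat_number)
        self.beat_time = now
        self.beat_number = beat_number

    def value(self, now):
        """Sine LFO output in 0..1 at wall-clock time `now`."""
        beats = self.beat_number + (now - self.beat_time) / self.seconds_per_beat
        phase = (beats * self.cycles_per_beat) % 1.0
        return 0.5 * (1.0 + math.sin(2 * math.pi * phase))
```

Between beats the LFO runs free on the wall clock; a late or jittery beat message only shifts the phase by the same few milliseconds the eye can't see anyway.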
12-17-2015, 11:14 PM | #13 | |
Human being with feelings
Join Date: Jun 2013
Location: Krefeld, Germany
Posts: 14,690
|
Quote:
BTW: there is an EEL-to-C converter (creating VSTs), but not vice versa.

The Reaper-to-DMX system my friend built (the one I mentioned) sends MIDI sysex messages via a standard MIDI channel over USB to the box that creates the DMX output. The sysex messages hold 8-bit (or more) values, plus information such as scenes and file names. BTW, he found that to do a really decent show, the DMX information needs to be created faster than just synchronized to audio sample blocks.

-Michael
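That box's sysex format isn't public, but one common way to carry 8-bit values inside sysex (whose data bytes must stay 7-bit) is to prefix each group of seven data bytes with a byte collecting their high bits. A hypothetical Python sketch of that packing scheme:

```python
def pack_8to7(data):
    """Pack 8-bit bytes into a 7-bit-safe sysex payload: each group of
    up to seven data bytes is preceded by one byte whose bit N holds
    the high bit of the N-th byte in the group."""
    out = []
    for i in range(0, len(data), 7):
        chunk = data[i:i + 7]
        high_bits = 0
        for j, b in enumerate(chunk):
            high_bits |= ((b >> 7) & 1) << j
        out.append(high_bits)
        out.extend(b & 0x7F for b in chunk)
    return out

# Wrap the payload in a sysex frame. 0x7D is the MIDI manufacturer ID
# reserved for non-commercial/educational use; a real device would use
# its own ID and framing.
message = [0xF0, 0x7D] + pack_8to7([255, 0, 128]) + [0xF7]
```

This costs one extra byte per seven, but every DMX value 0-255 survives the trip intact - no unreachable values, unlike plain 7-bit CC scaling.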