Reaper deeply embedded
We are doing / planning / evaluating / considering several projects that use Reaper as a deeply embedded tool. I.e., Reaper would not be visible to the end user (who, of course, needs to pay for a "commercial" license included in the project price); Reaper just works in the background, remote-controlled by our software.
We already did two such projects; right now they are not sold to anybody, but just used by ourselves for evaluation and non-commercial activities.
One is a "headless" hardware box (similar to the Muse Receptor) that I use for live playing myself, controlled on stage purely via MIDI.
Another is a "song-contest engine" running on a laptop connected to a multi-channel A/D converter and hardware buttons. Pushing dedicated buttons starts and stops the recording, triggers a mixdown, and then publishes the resulting MP3 by uploading it to a web page that the system creates in real time on a public web server. The application software is written in Python, and Reaper is completely invisibly remote-controlled via "Beyond Python/Reaper".
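For anyone wanting to try something similar: Reaper also ships a built-in OSC control surface, so a Python program can trigger transport actions over UDP with nothing but the standard library. The sketch below hand-encodes a minimal argument-less OSC message; the listen port (8000) and the example command IDs are assumptions that depend on your Reaper OSC configuration and action list, so check them before relying on this.

```python
import socket
import struct

def osc_message(address: str) -> bytes:
    """Encode a minimal argument-less OSC message."""
    def pad(b: bytes) -> bytes:
        # OSC strings are null-terminated and padded to a 4-byte boundary
        return b + b"\x00" * (4 - len(b) % 4)
    return pad(address.encode("ascii")) + pad(b",")  # "," = empty type-tag string

def trigger_reaper_action(command_id: int,
                          host: str = "127.0.0.1", port: int = 8000) -> None:
    """Fire a Reaper action by command ID via its OSC control surface.

    Assumes Reaper has an OSC control surface configured to listen on
    `port`; the "/action/<id>" address pattern is taken from Reaper's
    default OSC pattern config.
    """
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(osc_message(f"/action/{command_id}"), (host, port))

# Example (verify the IDs against your Reaper action list):
# trigger_reaper_action(1013)  # Transport: Record
# trigger_reaper_action(1016)  # Transport: Stop
```

This avoids any dependency on a specific remote-control library, which matters when the controlling software has to run unattended on an embedded box.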
Other planned projects include using Reaper as a versatile audio engine for both real-time and offline tasks, including the use of EEL scripts to create certain "queer" audio features.
In one application we are considering, Reaper would sit in a dedicated "box" (an embedded Windows or, if applicable, Linux PC) that is connected via TCP/IP to a "system" (a Windows PC) running our software. Roughly 100 telephone-grade (8 ksamples/s) audio channels might be sent to Reaper, and JSFX instances would work on those streams, doing FFT and then some "queer" operations.
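As a sanity check on whether 100 such channels are even a load worth worrying about on the network side, here is a back-of-the-envelope data-rate estimate (assuming 16-bit linear PCM; adjust the constants for your actual sample format):

```python
# Rough aggregate data rate for the proposed setup (assumed figures):
CHANNELS = 100
SAMPLE_RATE = 8000       # telephone grade, samples per second
BYTES_PER_SAMPLE = 2     # 16-bit linear PCM

bytes_per_second = CHANNELS * SAMPLE_RATE * BYTES_PER_SAMPLE
megabits_per_second = bytes_per_second * 8 / 1e6

print(bytes_per_second)      # 1600000 B/s raw payload
print(megabits_per_second)   # 12.8 Mbit/s before framing overhead
```

At 12.8 Mbit/s raw payload, even 100 Mbit Ethernet has comfortable headroom; the harder question is the per-channel CPU cost of the FFT processing, not the transport.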
(In another project test, we already successfully used the "Wormhole2" plugin in Reaper to receive unsynchronized audio streams via TCP/IP, generated by software of our own running on multiple small Linux boxes.)
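Wormhole2's wire protocol is its own, but for a home-grown sender like the Linux boxes mentioned above, a simple length-prefixed framing over TCP is enough to keep sender and receiver in sync. The sketch below is a generic illustration, not Wormhole2's actual format; the frame size and function names are hypothetical choices.

```python
import socket
import struct

FRAME_SAMPLES = 160  # 20 ms per frame at 8 kHz (hypothetical framing choice)

def send_frame(sock: socket.socket, samples: list) -> None:
    """Send one frame of 16-bit PCM, prefixed with its byte length so
    the receiver can re-find frame boundaries in the byte stream."""
    payload = struct.pack(f">{len(samples)}h", *samples)
    sock.sendall(struct.pack(">I", len(payload)) + payload)

def _recv_exact(sock: socket.socket, n: int) -> bytes:
    """Read exactly n bytes (TCP recv may return short reads)."""
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("audio stream closed")
        buf += chunk
    return buf

def recv_frame(sock: socket.socket) -> list:
    """Read one length-prefixed frame back into a list of samples."""
    (length,) = struct.unpack(">I", _recv_exact(sock, 4))
    payload = _recv_exact(sock, length)
    return list(struct.unpack(f">{length // 2}h", payload))
```

A quick loopback check with `socket.socketpair()` confirms that a frame survives the round trip unchanged; in a real deployment the sender and receiver would of course live on different machines.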
Do you think this is doable?
Any tips and tricks ?
Additional thoughts ?
-Michael
Last edited by mschnell; 10-31-2016 at 02:44 AM.