11-17-2018, 02:27 AM | #1 |
Human being with feelings
Join Date: Jun 2013
Location: Krefeld, Germany
Posts: 14,776
Reaper for Live, on stage, and embedded use.
Introduction

Due to its stability, efficiency and versatility, Reaper is exceptionally well suited as the basis for an "embedded" application. This means that while at work, the Reaper GUI is not of any concern and may not even be visible at all; instead, other control elements, provided by some hardware or shown on a computer screen, govern the proceedings that are executed by the Reaper project. Several classes of such projects can be realized in that way, e.g.:
With "Live instrument" setups, the most common applications are:
"DJ" or "Live Looping" seems to subsume a wide range of applications, each requiring an individually tuned workflow. To allow for "background tracks", this sometimes might be combined with an "instrument setup".

With "Live Mixing", Reaper replaces the audio processing of a digital mixer. This applies to stage performances as well as to studio situations, reusing the hardware and software already available for media production. Here it usually makes sense to install remote A/D-D/A converters connected via digital cables such as AES50 or Dante. An advantage over using a hardware mixer is that a huge number of audio plugins are available for sound processing, and new ones can even be created as Reaper JSFX plugins.

Regarding "deeply embedded" use, a huge range of applications is conceivable. Some examples known to be working might be mentioned here:
Prerequisites

As most of the applications mentioned here need decently low latency, this is more demanding than typical "DAW" use of Reaper regarding the required hardware and software setup. So some comments on this issue.

Latency (the delay between some input to the system and the resulting output) is not automatically introduced by insufficient power of the computer; it is set by the configuration of the project. If you set the latency too small, the result is audio dropouts and crackles. To prevent this, you need to increase the processing latency (usually defined by the count and size of the blocks the audio driver uses), reduce the CPU demand by engaging fewer or more efficient plugins, or use a more powerful computer.

For "Live Instrument" or "Live Mixing" projects, the latency needs to be low enough to be automatically compensated by the brain. A good analogy is the time a sound wave needs to travel from the loudspeaker to the ear (at roughly 340 m/s). This acoustic delay always adds to the latency introduced by the computer system.

Besides the latency that needs to be deliberately configured to let the system run in a perfectly stable way, some minimum latency is introduced by the audio A/D-D/A hardware and its drivers. To build a live system, audio equipment needs to be used that is specified for this purpose and features an appropriately low inherent latency. Unfortunately, many manufacturers don't bother to publish such specifications. Hence it might be useful to do a "round-trip delay" measurement before any purchase decision.

Usually the average CPU workload is not much of a problem; crackles and dropouts are the result of peak demand occurring only now and then. Not only the demand of the audio system itself needs to be considered: other tasks the computer might be busy with eat CPU cycles as well. That is why, when building a live system, any available "realtime tweaks" for the OS should be applied, e.g.
preventing the start of any unnecessary services that are enabled by default.

Happily, the CPU demand of Reaper itself is known to be especially low, making Reaper a good choice for live applications. Reaper assigns an OS thread to each track, hence multi-core CPUs are exploited by projects that feature multiple tracks.

Reaper reads the individual "PDC" latency of all instantiated plugins and uses these values to determine the resulting latency of the complete project (detecting the longest path and compensating the others). Here you might want to prefer plugins with lower latency over others with similar functionality. With some plugins, the PDC can be selected by parameters; usually a lower PDC for the same functionality comes at the cost of a higher CPU demand. A good example is ReaVerb: a smaller FFT window setting results in lower PDC but higher CPU demand, and activating "ZL" (zero latency) gives PDC=0 at the highest CPU demand. Hence for live usage of ReaVerb, "ZL" should be activated for zero PDC, and "LL" should be activated as well, as this allows the plugin to use multiple CPU threads and thereby distribute the load over multiple cores.

As WiFi is technically well suited neither for realtime purposes nor for streaming data, and the stability of a WiFi link can be affected by other WiFi devices in the surroundings, it is not recommended to use WiFi for streaming audio or Midi in an "on stage" setup.

Mac, PC or Linux?

Regarding the software infrastructure of the live setup, it does not seem to make much difference whether it is based on OSX or on Windows, as long as it features enough CPU, RAM and disk resources. A decent-quality PC system might be slightly less expensive, while a Mac might feel a bit safer. The best price/performance ratio and the best stability are achievable with a Linux box, but setting up such a system might be much more demanding (see the "Linux" subforum for details).

... continued in next post ...
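To put numbers on the latency discussion above, the two main contributions can be estimated with simple arithmetic. This is a rough sketch assuming a double-buffered driver model; real converters and drivers add some fixed extra latency on top, which is why a round-trip measurement before purchase is still worthwhile:

```python
def buffer_latency_ms(block_size, sample_rate, block_count=2):
    """Latency (ms) contributed by the audio driver's buffering,
    assuming block_count buffers of block_size samples each."""
    return 1000.0 * block_size * block_count / sample_rate

def acoustic_latency_ms(distance_m, speed_of_sound=340.0):
    """Time (ms) a sound wave needs from loudspeaker to ear."""
    return 1000.0 * distance_m / speed_of_sound

# 128-sample blocks at 48 kHz: about 5.3 ms of buffering latency
print(round(buffer_latency_ms(128, 48000), 1))   # 5.3
# standing 3.4 m from the monitor adds about 10 ms on top
print(round(acoustic_latency_ms(3.4), 1))        # 10.0
```

This also shows why halving the block size halves the buffering latency, at the price of doubling the risk of dropouts under peak CPU demand.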
Please answer in a new thread, as I might need to add more pages to this thread. Last edited by mschnell; 05-28-2020 at 04:41 AM. |
11-20-2018, 11:09 AM | #2 |
... continued from previous post ...
Selecting Patches

With instrument setups, a main task for the live system is switching sound flavors ("programs", "patches") on the fly in response to requests of the player, by commands from
In Reaper, there are lots of ways the plugin configuration can be modified to create a different sound. E.g.:
The "Live" tools to be instantiated in Reaper for this purpose need to obey these user commands, if possible show an updated status display somewhere, and modify the plugin parameters and Reaper configuration according to the pre-programmed patch makeup.

LiveConfigs

The most commonly used tool for this purpose is "LiveConfigs" (updated manual -> www.bschnell.de/LiveConfigs_1.pdf ), which comes with the "SWS" Reaper extension. LiveConfigs provides an easy-to-use GUI (laid out as a spreadsheet) that directly offers the options noted above as (5.), (6.), (7.), (8.), and (10.). Using additional tools, other options can be accomplished: (2.) by the "Slider To Midi PS" JSFX (available via "ReaPack"); for (3.), in many cases the "ReaControlMidi" VST (comes with Reaper) is helpful; more sophisticated tricks will need dedicated JSFX or ReaScript programming.

A problem with LiveConfigs that arises in many setups is that LiveConfigs "learns" Midi CC messages to switch patches. This means it needs to see CC messages in the Reaper control path, while Midi equipment often sends live playing Midi messages to the tracks holding the VSTs and VSTis. If the equipment does send CC messages for patch changing (e.g. by a knob), the "control" option in the Midi device setup in the Reaper preferences can be activated to route these messages additionally to the control path. If the Midi equipment sends different messages (such as Program-Change or Note-On), the setup needs to be done in a way that they are preprocessed. Here the said "control" option should be off, and the Midi stream should be routed to a track. In the effect chain, a JSFX plugin (e.g. "Midi Convert To CC") can modify such messages to be CCs, and then the "MidiToReaperControlPath" VST (by Jeffos, -> https://forum.cockos.com/showthread.php?t=43741 ) needs to route the messages to the control path, where LiveConfigs can see them.
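To illustrate the kind of preprocessing just described (sketched here in Python rather than JSFX; the function name and the choice of CC number 80 are arbitrary examples for this sketch, not an actual plugin API):

```python
def to_patch_cc(msg, cc_number=80):
    """Translate a Midi message (status, data1, data2) into a CC message
    that a CC-learning tool such as LiveConfigs could see:
    Note-On -> CC carrying the note number,
    Program-Change -> CC carrying the program number (data2 unused);
    everything else passes through unchanged."""
    status, data1, data2 = msg
    channel = status & 0x0F
    if status & 0xF0 == 0x90 and data2 > 0:      # Note-On with velocity > 0
        return (0xB0 | channel, cc_number, data1)
    if status & 0xF0 == 0xC0:                    # Program-Change
        return (0xB0 | channel, cc_number, data1)
    return msg

# Note-On C4 (note 60) on channel 1 becomes CC 80 with value 60
print(to_patch_cc((0x90, 60, 100)))   # (176, 80, 60)
```

A JSFX doing the same would sit in the track's effect chain before "MidiToReaperControlPath", so that only the converted CC messages reach the control path.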
Midi CC table

LiveConfigs is designed around the idea of switching patches in the most effective way, using as little CPU power as possible. To allow for this, it provides the option to mute all tracks managed by LiveConfigs and unmute only the one (per "page") that is actually playing. By default, "Do not process muted tracks" is enabled in the Reaper preferences, and hence the patches not selected don't consume CPU power. (These tracks must not be record-armed, so you need to do appropriate Midi/audio routing for their inputs.)

Due to this design, alternative patches can never be heard at the same time: LiveConfigs always fades out the previous patch and then fades in the next one, creating a small gap in between. This prevents cracks and glitches, but also rules out any sound crossfade or a "spill over" of a reverb tail. In usage cases that require creative use of patch changing, this might be a problem.

An alternative to LiveConfigs, offering similar results but not providing an easy-to-use GUI for configuration, is a set of multiple JSFX plugins and Reaper scripts. With this system - which resides in the realm of the tracks rather than in the realm of the control path, as LiveConfigs does - the reaction to incoming CC messages for patch changing is defined in a file created with a standard text editor, and the audio switching is done by fader plugins that also can control track muting/unmuting via actions that trigger a Reaper script. While being perfectly versatile, setting up this system is of course not trivial, especially if a large number of different patches is needed. The main components (provided via ReaPack) are:
LBX Stripper

Might be a very versatile tool for several kinds of live applications, but I did not find any straightforward instructions, in plain English, on what it can do for "Live" users and how to use it. But see:
-> https://forum.cockos.com/showthread.php?t=182233
-> https://www.youtube.com/playlist?lis...iKXcTIfeLcEjEd
-> https://pipelineaudio.net/guitarjack...-lbx-stripper/
(In this instruction, I suppose the parts about "learning" an action to the faderbox and other plugins should be replaced by simply assigning the Midi message via [Param] -> FX Parameter List -> "Parameter Modulation / Midi Link".)

Song or Song-Position stepping

On top of a patch switching system, for live performances it might be viable to use software that allows stepping through a list of patches predefined according to a sequence of songs, and/or according to the sequence of sounds needed for sections of songs. A description of such software, done as a set of JSFX plugins, can be found here -> http://www.bschnell.de/patch.pdf .

... continued in next post ...

Last edited by mschnell; 08-04-2019 at 04:48 AM. |
11-20-2018, 10:53 PM | #3 |
... continued from previous post ...
Deeply Embedded Applications

If a software system - done for whatever purpose - needs some audio functionality such as
The main software and Reaper can reside on the same PC or on different PCs connected via a network, and they will communicate via
Regarding OSC, Reaper provides a large set of OSC commands and status messages, which can be configured by providing a configuration file. Moreover, Reaper scripts can send OSC information and can be triggered by incoming OSC messages.

If the main software is done as a "Reaper Extension", it can access the complete "Reaper API" (see the "developers" subforum for details) and control any aspect of Reaper. But in most cases it's not a good idea to do the main functionality of a project as an "extension" plugin. The most versatile and nonetheless handy way to access the Reaper API from an external program, on the same box or via a network, is using "Beyond Python". When this Python script is instantiated in Reaper, (most of) the Reaper API is exported via OSC and hence via TCP/IP, using a protocol format compatible with Python classes. Hence an external Python program will (via the "Beyond" library) easily be able to access the Reaper API (and functions that additionally might be implemented side-by-side with "Beyond"). If the external program is not done in Python, it could directly use the OSC data stream "Beyond" provides, simulating Python classes.

If project-specific audio and/or Midi processing is necessary, such functionality can be implemented by creating appropriate "JSFX" plugins (see the "developers" subforum for details), which can do audio processing in realtime with exceptionally low CPU overhead, and can be instantiated in Reaper tracks like any other audio/Midi plugins. The JSFX framework provides an IDE for creating and debugging JSFX plugins. The JSFX language has access to library functions for Midi and audio I/O and processing (including FFT), a standard user interface for the plugin, basic support for creating a graphical user interface, configuration support, file I/O, ...
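As an illustration of how small the external side of an OSC-based remote control can be, the following hand-encodes a minimal OSC message in plain Python, without any OSC library. The `/track/1/volume` address follows the track-volume pattern found in Reaper's default OSC pattern config; the host and port in the comment are examples and depend on your OSC device settings in the Reaper preferences:

```python
import struct

def osc_message(address, value):
    """Encode an OSC message with a single float argument.
    OSC strings are null-terminated and padded to 4-byte multiples;
    floats are 32-bit big-endian."""
    def pad(b):
        return b + b"\x00" * (4 - len(b) % 4)
    return pad(address.encode("ascii")) + pad(b",f") + struct.pack(">f", value)

packet = osc_message("/track/1/volume", 0.75)
print(len(packet))   # 24

# Send via UDP (host/port are examples, see Preferences -> Control/OSC/web):
# import socket
# socket.socket(socket.AF_INET, socket.SOCK_DGRAM).sendto(packet, ("127.0.0.1", 8000))
```

For anything beyond a quick experiment, a proper OSC library (or "Beyond Python" as described above) saves you from re-implementing the protocol details.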
A huge number of JSFX plugins and libraries are available in source code, from Cockos and from the Reaper user community, covering all aspects of JSFX programming and usable as examples for new creations (see "ReaPack" and the "developers" subforum).

Alternative stand-alone software for live playing with plugins

There are several commercial offers for such software, mostly advertised as easily configurable, which might be an advantage over a Reaper-based solution. OTOH, a Reaper-based setup will be a lot more versatile (especially because most of these don't feature scripts for handling "unusual" demands), and of course you can additionally use Reaper as a normal DAW, not needing to buy another software and go through another learning curve. Here is a list of some of these products:
(Maybe those who have experience with one of these programs might be inclined to share it in a forum thread here...)

Regarding the financial cost, a Reaper system is hard to beat, considering the very moderate price of Reaper itself, the fact that all add-ons described in this paper are free, and of course the fact that close to any musician needs a full-featured DAW anyway, and Reaper can be used for media production as well as for live playing. Hence the bill for using Reaper "Live" can be considered zero.

"DJing", "Live Looping", "Ableton Session View"-type and "Backing Track" applications

Some options are:
Using multiple kinds of live performance tools at the same time

You might want to use multiple project tabs in Reaper for different purposes. Here you might need to install audio routing between these tabs. This post explains how that can be done: -> https://forum.cockos.com/showpost.ph...8&postcount=10

Using Reaper Live together with additional software
Last edited by mschnell; 02-18-2024 at 02:57 PM. |
12-13-2018, 07:11 AM | #4 |
Tweaking the system for performance
general
Reaper settings:
Some of these options are not really compatible with song editing. You can start REAPER with dedicated live settings (i.e. a "live edition" of the REAPER.ini file) by using a command line like: "reaper -cfgfile ReaperLive.ini myLiveConfigs.rpp"

Plugin settings
PC tweaking
Laptops

Laptop hardware (electronics and fans) usually is optimized for battery life instead of performance. Hence using a desktop box or a dedicated "embedded" box is a better choice for live usage. If you want to buy a laptop nonetheless, you should look for a "gaming" laptop, as these are a lot more optimized for performance. They also usually feature dedicated video RAM, so that the main RAM bus is not affected by video activity and screen refresh.

MacOS
Linux tweaking
Corrections and suggestions for enhancement are very welcome! Please answer in a new thread, as I might need to add more pages to this thread. Questions in the forum are very welcome, as well. -Michael Last edited by mschnell; 08-31-2022 at 10:31 PM. |
12-13-2018, 07:12 AM | #5 |
Example for "embedded" Live ("on stage") hardware using Reaper
The upper keyboard is a Seaboard Rise 49 "fretless" keyboard. The middle keyboard is a DX7, just used as a master keyboard, mostly together with a TEC BBCv2 breath controller, which is also nice with Hammond sounds. The lower keyboard is a Kawai VPC 1, best for E-Piano and grand piano sounds. The surface device is an X-Touch Compact, used to control both the keyboard sound patches and the XAir 18. Sound output goes through an NI Audio 6 low-latency USB adapter (in the trunk together with the embedded PC and a DI box, which is below the metal plate), then via an XAir 18 (in an additional rack) to two "dB FM10" floor monitors plus a subbass, and optionally a PA. The XAir also mixes the live input signals coming from the other members of the band. (The closed laptop is not part of the live setup but is used for multitrack recording in the "studio" and for playing back 3rd-party stuff during rehearsal sessions.) -Michael Last edited by mschnell; 12-09-2019 at 07:00 AM. Reason: actual message |
12-13-2018, 07:13 AM | #6 |
Reserved Message slot
-tbd- 3
09-22-2019, 12:46 AM | #7 | |
Human being with feelings
Join Date: Nov 2008
Location: Somewhere Between 120 and 150 BPM
Posts: 7,968
|
Quote:
I'm using a single master MIDI controller for these local gigs, but when I do bigger gigs my frame fits into the ATA shock rack and I bring along a Solaris and a Code 8. For local gigs I use SE-02, HX-3 Hammond module, PLAY, Kontakt, PTeq, ZebraHZ and Omni/Keyscape. All mixing in an XITE-1 DSP rack. Very small, fast, portable rig. Jazz gigs and dance gigs (Bruno, disco, etc., lots of horns and synths)