PDA

View Full Version : Developing for live performance


mschnell
01-17-2015, 12:02 AM
Being a (rather experienced) programmer and a (rather novice) musician, using Reaper (in addition to using its great DAW features for post-production) as a realtime VST host for live playing, I'd like to do some home-brew additions to the obvious "standard" features.

- I might want to modify MIDI messages sent from my master keyboard and breath controller on their way to the VSTs and VSTi's I placed in Reaper tracks.
- I might want to modify the MIDI and/or audio routing by remote-controlling it via programmable algorithms that in turn are triggered by MIDI messages.
- I might want to switch VSTs on and off to save the CPU power consumed by VSTs that are not routed to the audio output.
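
To make the first bullet concrete, here is a minimal C++ sketch of such a per-message transform. The MidiMsg type is made up for illustration; each plugin API has its own event structures:

```cpp
#include <cassert>
#include <cstdint>

// Hypothetical illustration of a per-message MIDI transform: route the
// breath controller (CC 2) to the mod wheel (CC 1) before it reaches a VSTi.
// MidiMsg is a made-up type; real plugin APIs each have their own event type.
struct MidiMsg { uint8_t status, data1, data2; };

MidiMsg remapBreathToModWheel(MidiMsg m) {
    bool isCC = (m.status & 0xF0) == 0xB0;    // control-change status nibble
    if (isCC && m.data1 == 2)
        m.data1 = 1;                          // CC 2 (breath) -> CC 1 (mod wheel)
    return m;                                 // everything else passes through
}
```

Every incoming message would be passed through such a function on its way to the instrument.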

In fact I feel that it should be rather easily possible to do a (set of) add-ons for Reaper to create functionality that otherwise could be obtained by a combination of tools (most of which I already tested), like Bome's "MIDI Translator", "RackPerformer", "Forte", the (really sadly) discontinued Native Instruments "Kore", Native Instruments' "KOMPLETE KONTROL" software (which seems to require a rather expensive €500 license for at least "Komplete"), and/or (maybe) the AKAI "VIP" software coming with their upcoming controller keyboards. "Muse Receptor" also does this, but there the (Linux-based) software is not available without the (rather expensive) computer rack.

Of course I will provide any results here for free.

I found that to allow this, Reaper provides several APIs.

- The standard "VST" API: I know that I can fetch an SDK from Steinberg (plus helpful stuff like "JUCE") and will be able to do my own VSTs in C++.
- The (seemingly standard, too) "DX" API: I have no knowledge about it yet.
- The (seemingly standard, too) "JSFX" API that seems to allow doing plugins as JavaScript applets.
- The "OSCII" interface providing its own "EEL2" scripting language.
- The "Reaper Extension SDK", allowing for native Reaper additions in C++.
- ReaScript (I did not take a look there yet).
- Did I find all the options?

There also is the (seemingly standard) "AAX" API: I don't know much about it yet. I don't suppose Reaper provides this API.


Is there a "compact" description comparing the different ways of "programming for Reaper"?

Please let me know if such a "live Reaper" extension already exists (commercial or free), or if a project aiming at this target is already in the works.

-Michael

Xenakios
01-17-2015, 04:16 AM
- The (seemingly standard, too) "JSFX" API that seems to allow doing plugins as JavaScript applets.

There also is the (seemingly standard) "AAX" API: I don't know much about it yet. I don't suppose Reaper provides this API.


Is there a "compact" description comparing the different ways of "programming for Reaper"?



JSFX are not JavaScript; the "JS" comes from JesuSonic, which is Cockos's own plugin language derived from their EEL language.

AAX plugins are not supported in Reaper, it's a ProTools-only thing.

I am not sure if a concise and accurate document exists on the possibilities for how Reaper can be programmed by 3rd parties. Writing that document would take a lot of work, since the options are all over the place... (with things sometimes overlapping, and sometimes obvious things having been left out).

moliere
01-17-2015, 04:25 AM
- I might want to modify MIDI messages sent from my master keyboard and breath controller on their way to the VSTs and VSTi's I placed in Reaper tracks.
- I might want to modify the MIDI and/or audio routing by remote-controlling it via programmable algorithms that in turn are triggered by MIDI messages.
- I might want to switch VSTs on and off to save the CPU power consumed by VSTs that are not routed to the audio output.


Check out http://www.thepiz.org/plugins/?p=pizmidi for a nice set of existing free plugs for ideas.

mschnell
01-17-2015, 03:29 PM
JSFX are not JavaScript, JS comes from JesuSonic, which is Cockos's language derived from their Eel language. ...

So "JSFX" plugins and "OSCII" are the same, and AAX is not an option. That makes the picture less complex :).

Thanks,
-Michael

mschnell
01-17-2015, 03:31 PM
Check out http://www.thepiz.org/plugins/?p=pizmidi for a nice set of existing free plugs for ideas.

Great !

So there are several real-life examples of VSTs with source code. A good starting point indeed.

-Michael

mschnell
01-17-2015, 03:42 PM
In the "general" forum, MINK99 mentioned the "MCU protocol", which is a set of well-defined MIDI messages that allows remote-controlling Reaper features (such as routing) and which could be sent e.g. from OSCII (aka EEL) plugins. This does sound like a nice option for the task that I have in mind. (Supposedly such MIDI messages could be sent from a VST plugin as well.)
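
For context, here is a sketch of what such MCU-style messages look like at the byte level. It assumes the commonly described convention that MCU fader moves are per-channel pitch-bend messages carrying a 14-bit position, so treat it as an illustration rather than a protocol reference:

```cpp
#include <array>
#include <cassert>
#include <cstdint>

// Sketch, assuming the commonly documented MCU convention that fader moves
// are transmitted as pitch-bend messages, one MIDI channel per fader, with
// a 14-bit position split into a 7-bit LSB and MSB. Check the protocol
// documentation before relying on this.
std::array<uint8_t, 3> mcuFaderMove(int fader /*0..7*/, int pos /*0..16383*/) {
    return { static_cast<uint8_t>(0xE0 | (fader & 0x0F)),  // pitch-bend status
             static_cast<uint8_t>(pos & 0x7F),             // LSB
             static_cast<uint8_t>((pos >> 7) & 0x7F) };    // MSB
}
```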

Moreover, OSCII plugins seem to be able to call "user definable functions", which might be another (maybe better) way to do "internal remote control" of Reaper.

Is there a description and/or an example available for this (remote-controlling Reaper features by means of OSCII or VST)?

-Michael

Xenakios
01-17-2015, 09:49 PM
You may be confused about OSCII(-bot)... There's a separate application from Cockos called "OSCII-bot" that can translate between MIDI and OSC messages. As far as Reaper is concerned, you can only control it from OSCII-bot within the possibilities that MIDI and OSC messages allow. Reaper has fairly comprehensive support for OSC messages, but for example the Reaper API functions can't be called with them. (MIDI messages are still more limited.)

mschnell
01-18-2015, 12:04 AM
You may be confused about OSCII(-bot)... There's a separate application from Cockos called "OSCII-bot" that can translate between MIDI and OSC messages. As far as Reaper is concerned, you can only control it from OSCII-bot within the possibilities that MIDI and OSC messages allow. Reaper has fairly comprehensive support for OSC messages, but for example the Reaper API functions can't be called with them. (MIDI messages are still more limited.)

OK. Now I am even more confused.

I did take a look at the "JSFX" Plugins. (I hope I am correct this is what I find when selecting a plugin from the "All Plugins -> JS" choice.)

I understand they are done in EEL language.

I understand that they (among other things) can take Midi input and can create midi output.

I understand that Reaper can route these midi streams.

I understand that with EEL I can call "user definable functions".

I guess that these allow for accessing Reaper functionality, that is not accessible directly by Midi, by accessing a dedicated API of Reaper's.

I did not yet find out in what language "user definable functions" can be done.

For info on Reaper extension plugins (including Xenakios' previous extensions):
Seemingly these use that API. Seemingly I need to do (not just use) something like such a "plugin" and have it fed with MIDI messages fetched from a track - maybe by means of a plugin selectable as "FX" via "All plugins -> xxxx", done in EEL...

-Michael

Xenakios
01-18-2015, 12:09 AM
OK. Now I am even more confused.

I did take a look at the "JSFX" Plugins. (I hope I am correct this is what I find when selecting a plugin from the "All Plugins -> JS" choice.)

I understand they are done in EEL language.

I understand that they (among other things) can take Midi input and can create midi output.

I understand that Reaper can route these midi streams.

I understand that with EEL I can call "user supplied functions".

I guess that these allow for accessing Reaper functionality, that is not accessible directly by Midi.

-Michael

No, in Jesusonic plugins you cannot call the Reaper API functions or any functions that are not implemented in the Jesusonic language itself. While Jesusonic is a version of Eel, it cannot do the same things you can do with ReaScript when using Eel as the language. Conversely, ReaScript and Eel cannot do the same things that Jesusonic plugins can do, like process audio or MIDI live.

It all IS confusing and inconsistent. :( That's why it can't be summed up briefly and doesn't seem to make much sense at first.

mschnell
01-18-2015, 12:26 AM
No, in Jesusonic plugins you can not call the Reaper API functions or any functions that are not implemented with the Jesusonic language itself. While Jesusonic is a version of Eel, it can not do the same things you can do with ReaScript when using Eel as the language. Conversely, ReaScript and Eel can not do the same things that Jesusonic plugins can do, like process audio or MIDI live.

What exactly is "ReaScript"? Is it the language used in "Jesusonic plugins", also called EEL?

I did find that the language in OSCII-bot is called "EEL2" (instead of "EEL").

OK I see that the "simple way" is impossible, as the "Reaper extension API" is only accessible from native code.

The other two options that seem to emerge are:

a) OSCII-bot: I understand this is a separate executable I need to start. It seemingly can indeed receive MIDI and access some API of Reaper's. So it might be possible - in Windows - to route the MIDI stream aimed at Reaper to (or through) OSCII-bot, and have the "EEL2" program access the Reaper extension API (directly or via user definable functions) when appropriate.

b) Doing a native-code VST: Is it possible to access the Reaper extension API from such a DLL?

Thanks for all your explanation !
-Michael

mschnell
01-18-2015, 12:34 AM
It all IS confusing and inconsistent. :( That's why it can't be summed up briefly and doesn't seem to make much sense at first.

:) :) :)

At least a comprehensive explanation of the Reaper-proprietary terms used in this discussion would be very helpful:

- OSCII-bot
- JesuSonic
- JSFX
- ReaScript
- EEL
- EEL2
- User definable functions (e.g. in OSCII-bot: in what language are these written?)
- Reaper Extension API (I don't even know if this is the correct name to denote this specification.)
- DX / DXi / VST / VSTi (as a reference: this of course is not Reaper-proprietary and explanations are easily available elsewhere)

Thanks for your patience !
-Michael

mschnell
01-18-2015, 12:43 AM
Conversely, ReaScript and Eel can not do the same things that Jesusonic plugins can do,

It would be a really worthwhile "to-do" for a future version of Reaper to extend each of the "JesuSonic" language, EEL, and the "OSCII-bot" language (aka "EEL2") so that they all can do the same things, as well as allowing for extended functionality and reducing confusion. (And maybe either killing the term "ReaScript" or using it instead of "EELx" everywhere.)

-Michael

Xenakios
01-18-2015, 12:49 AM
Just a short note about EEL, that's always EEL2 (EEL version 2) in Reaper, although it's commonly abbreviated to just EEL.

Edit: I am now trying to create a table on my blog that attempts to show a quick overview of what is possible with each programming language/"thing" in Reaper.

mschnell
01-18-2015, 12:56 AM
I am now trying to create a table on my blog that attempts to show a quick overview of what is possible with each programming language/"thing" in Reaper.

You are my hero ! :)
Let me know where to find it.
(I might dare to ask more questions after having read that paper, as my endeavor seems not easy, but it should create a Reaper extension helpful not only for me but for everyone who wants to use Reaper "live" as a VSTi plugin host, which seemingly (and astonishingly) is not available yet. Please re-read the first message in this thread.)

-Michael

Xenakios
01-18-2015, 01:11 AM
A very preliminary version of the blog post: the table isn't very readable and there are likely errors. (It's also still missing some of the Reaper capabilities, like raw OSC message handling, the EEL-based joystick support, and WALTER. If I am going to put those into the table too, I might need to create this webpage somewhere other than the Wordpress blog; the blog theme just doesn't work right for this kind of large table...)

https://xenakios.wordpress.com/2015/01/18/reaper-programming-matrix/

Xenakios
01-18-2015, 01:42 AM
Also, to clarify things: OSCII-bot doesn't bring any extra capabilities to Reaper. It's a separate application that can send OSC and MIDI messages, also to applications other than Reaper. You can send OSC and MIDI into Reaper from other applications too. It just happens that OSCII-bot is scriptable with the EEL language, which is also available inside Reaper.
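
For the curious: an OSC message is just a small, 4-byte-aligned byte packet. A minimal C++ sketch of encoding one float-argument message per the OSC 1.0 spec follows; the address shown below is only an example pattern, not necessarily one of Reaper's defaults:

```cpp
#include <cassert>
#include <cstdint>
#include <cstring>
#include <string>
#include <vector>

// Minimal OSC 1.0 message encoder (sketch): an address plus one float
// argument. Strings are NUL-terminated and padded to a 4-byte boundary;
// the float is big-endian, per the OSC spec.
static void padString(std::vector<uint8_t>& out, const std::string& s) {
    out.insert(out.end(), s.begin(), s.end());
    out.push_back(0);                      // terminating NUL
    while (out.size() % 4) out.push_back(0);  // pad to 4-byte boundary
}

std::vector<uint8_t> oscFloatMsg(const std::string& address, float value) {
    std::vector<uint8_t> out;
    padString(out, address);
    padString(out, ",f");                  // type tag: one float argument
    uint32_t bits;
    std::memcpy(&bits, &value, 4);         // reinterpret IEEE-754 bits
    out.push_back((bits >> 24) & 0xFF);    // big-endian byte order
    out.push_back((bits >> 16) & 0xFF);
    out.push_back((bits >> 8) & 0xFF);
    out.push_back(bits & 0xFF);
    return out;
}
```

Such a packet, sent over UDP to Reaper's configured OSC port, is all that "sending OSC into Reaper" amounts to on the wire.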

mschnell
01-18-2015, 01:42 AM
I printed the Table for reference.

This already is very helpful.

Please add a short list of explanations for the terms used:

- OSCII-bot
- OSC message
- JesuSonic
- JSFX
- ReaScript
- EEL
- EEL2
- User definable functions (e.g. in OSCII-bot: in what language are these written?)
- Reaper Extension API (I don't even know if this is the correct name to denote this specification.)
- DX / DXi / VST / VSTi (as a reference: this of course is not Reaper-proprietary and explanations are easily available elsewhere)
- C++, Python, LUA (as a reference)
- LICE (as a reference; never heard of it)

In fact I don't see yet where remote-controlling the features I need, such as routing, "track activating", and VST power on/off, is handled (maybe this is what "OSC messages" are for).

-Michael (just waiting for what you will come up with)

Xenakios
01-18-2015, 01:50 AM
In fact I don't see yet where remote-controlling the features I need, such as routing, "track activating", and VST power on/off, is handled (maybe this is what "OSC messages" are for).


Some of that stuff is possible to control with the C functions API(*) or with OSC messages. (Routing is tricky, there's no easy way to do that programmatically and it's not possible at all with OSC messages.)

(*) This for practical purposes is the same as the "ReaScript API".

Xenakios
01-18-2015, 01:55 AM
Relating directly to the topic of the thread... Not to discourage you too much, but the mess that Reaper's 3rd party programmability currently is has led me to explore other possibilities than Reaper for live performance purposes.

Currently I am looking again into learning Cycling74's Max/MSP... The 3rd party programming possibilities in Reaper seem to me to be more suited for studio production tasks, and Max seems to be a better fit for building live performance setups. I am not saying Reaper absolutely cannot be made to work for live performance work, but it seems quite complicated.

mschnell
01-18-2015, 02:54 AM
Regarding your "Feature List", some comments that you might or might not find useful:

To understand the meaning, I need to find detailed information about the "Reaper C API" and "OSC messages". An appropriate link would be useful.

I still don't understand the meaning of "ReaScript" (vs "JesuSonic Plugin"). An appropriate link would be useful.

When you say "Midi hardware input" and "Midi hardware output", you of course mean the MIDI API of the OS. This is not necessarily "hardware": programs like Bome's "MIDI Translator" and "MIDI-OX" create "virtual MIDI" devices that of course are accessible as well.

If a standard Python interpreter is used, it will be able to call user-provided C functions, and hence any "custom code" should be possible.

Thanks for listening,
-Michael

mschnell
01-18-2015, 03:06 AM
the mess that Reaper's 3rd party programmability currently is has led me to explore other possibilities than Reaper for live performance purposes.

Of course I did check several of those. One of the most versatile (and not too expensive) seems to be "RackPerformer". But I am not sure whether this project is out of the "work in progress" stage and whether it is here to stay. (In fact I was not even able to definitively find out if it's a decent Win 64-bit program running 64-bit VSTs, and I was not able to register with their rather sleepy forum. And I am not good at speaking French ;) )

The beauty of using Reaper for this purpose would be that this really excellent DAW, with its nice price/performance ratio, is useful for _each_and_every_ musician, and hence the training we have on it can be re-used for using it as a live VST player (and vice versa).

Hence IMHO it is an extremely worthwhile task to provide a (set of) plugins for Reaper that allows exactly this, and if some (supposedly very few) of the necessary API features are not available in Reaper, we should work with Cockos to make them happen. They in fact should be interested in us providing such a Reaper add-on (supposedly for free).

I do know several musicians who use Reaper for that purpose (e.g. with the great "SWAM engine" based instruments (flute, sax, clarinet, oboe, trumpet, and friends), played by means of wind controller hardware). SWAM instruments are available only as VSTs, not as standalone programs (unlike the appropriate version of NI Kontakt, which I currently use side by side with Reaper). Of course a decent VST host is a lot more versatile than trying to use a set of standalone programs and mess around with MIDI and audio routing in Windows. (On a Mac you have MainStage, which seems to do all this "for free out of the box".)

(In fact, same being a mess or not, Reaper is known to be the DAW that provides the best 3rd party programmability of all DAWs.)

From your List it's rather obvious that we (among other things) would need to do a VST plugin. I do know that endeavor is not meant for the faint of heart, but I understand that with the VST building environment provided by JUCE it should be doable (and fun).

(At first glance, Max/MSP does look nice as well, though. Please keep me informed on your findings. -- Off topic here, so use " ms ch ne ll -at- bs ch ne ll -dot- de " without the blanks. --)

-Michael

Xenakios
01-18-2015, 03:29 AM
To put things really simply, based on your requirements in your first message, a VST plugin written in C++ that also uses the Reaper API is your only feasible option at the moment. That will of course be very complicated work that you might spend months or even years doing, depending on your skill level with C++.

mschnell
01-18-2015, 05:49 AM
I agree that creating a VST in C language is the only reasonable option for the task in question.

In fact I do C for embedded projects every day in my job, for rather complicated projects, so I do think that I will be able to do this part without too much headache. I will do a test using the JUCE VST SDK -> http://www.redwoodaudio.net/Tutorials/juce_for_vst_development__intro2.html, ASAP. This "tutorial" suggests no great problems.

The part I am not sure about is the "Reaper C++ extension" API (its specification and the ways to access it). Maybe you might be able to provide some expertise on it. So this should be doable as well (provided this API offers the necessary functionality).

BTW: given a price of $400 (what is the price of an iBook that already includes Mainstage ;) ?), Max/MSP is not an option for myself and many more musicians who might be inclined to use VSTs live on a Windows PC and might even already be happy Reaper users. As I gather that you qualify for this group, maybe working together on a Reaper extension is a more interesting option than shelling out money for a ready-made solution.

BTW/2: now I do understand why the thingy I am looking for does not yet exist :) :) :)

-Michael

Banned
01-18-2015, 06:52 AM
(At first glance, Max/MSP does look nice as well, though.
Also check out its free 'cousin', Pure Data. It looks a bit more bland, but it is free/open source. Indeed, Max/MSP and Pure Data are *very* powerful apps for doing this sort of stuff. I suggest checking out the latest stable Pd-extended here: http://puredata.info/

If you'd choose to start hacking with OSC, I may be able to get you started using Max / Pd / OSCII-bot, as I've done some things with them already that are quite similar to your bucket-list.

Cheers!

ashcat_lt
01-18-2015, 09:48 AM
I'm so confused! I mean, we don't have a lot of specifics, but none of this looks particularly difficult. You don't need to delve into the API or even write full-on VSTs...
- I might want to modify Midi messages sent from my master-keyboard and breath controller on their way to the VSTs and VSTi's I placed in Reaper tracks.
JS plugins can do this. In fact, I'd be willing to bet that most of what you would want to do has already been written, or at least close enough that a couple quick modifications would get you where you're trying to go.
- I might want to modify the Midi and/or Audio Routing by remote controlling same via programmable algorithms that in turn are triggered by Midi messages.
This can probably also be done in JS. It sounds a bit more complex, and might require a little "outside the box" thinking...
- I might want to switch VSTs on and off to save CPU power consumed by VSTs that are not routed to the Audio output.
You can automate the bypass parameter of any FX, and map it to MIDI or OSC control already via a couple different methods.

So WTF am I missing?

Edit - Oh yeah! What we're missing is any specific information at all. Tell us exactly what you're trying to accomplish (maybe in the appropriate forum...) and you'll have all kinds of help in getting it done.

mschnell
01-18-2015, 03:40 PM
(modify MIDI messages sent from my master keyboard and breath controller on their way to the VSTs and VSTi's I placed in Reaper tracks)
JS plugins can do this. In fact, I'd be willing to bet that most of what you would want to do has already been written, or at least close enough that a couple quick modifications would get you where you're trying to go.
Xenakios' list states that a JesuSonic plugin can only modify messages within the single track that it is placed in. As the target is a central instance that controls multiple tracks, this supposedly is not possible with JSes.


(modify the MIDI and/or audio routing by remote controlling)
This can probably also be done in JS. It sounds a bit more complex, and might require a little "outside the box" thinking...
Xenakios' list states that JesuSonic plugins can't use any API of Reaper's in "working" mode.

You can automate the bypass parameter of any FX, and map it to MIDI or OSC control already via a couple different methods.
Maybe using OSCII-bot really might be helpful in that way. I did check the "OSCII-bot code reference", but I have not yet found a description of what exactly "OSC controls" are and how they are used in Reaper. (I already requested that Xenakios include OSCII-bot in his reference list.)


Tell us exactly what you're trying to accomplish (maybe in the appropriate forum...) and you'll have all kinds of help in getting it done.

In a first stage I'd like to install multiple chains of VSTi's and VSTs (supposedly necessarily in multiple tracks), each "set" creating a "sound" (this is called a "scene" in Forte). For this, of course, each VST needs to use the appropriate set of parameters, either by being fed with them in realtime or by using multiple pre-programmed instances that can be switched on and off. The switching of the sounds needs to be done by MIDI "program change" messages.

In a second stage, with any currently selected program, MIDI messages (e.g. CCs) should be "translated" in a program-specific (programmable) way before they reach the active VST(s).

(Once I am able to play with this thingy additional ideas might arise.)

At best I would like to help provide an easy-to-use "musicians'" tool that allows Reaper to succeed as a live VST player, usable instead of Mainstage, RackPerformer, Forte, or Max/MSP.
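
The first stage described above boils down to a dispatch from program-change number to a "scene" action. A hypothetical C++ sketch, with the activation callback standing in for whatever bypass/routing calls end up being used:

```cpp
#include <cassert>
#include <cstdint>
#include <functional>
#include <map>

// Hypothetical dispatch: map a MIDI program-change number to a "scene"
// callback that would (un)bypass the FX chains making up that sound.
// The callbacks are placeholders for whatever API calls end up being used.
class SceneSwitcher {
public:
    void defineScene(uint8_t program, std::function<void()> activate) {
        scenes_[program] = std::move(activate);
    }
    // Called for each incoming program-change message; returns whether a
    // scene was defined for that program number.
    bool onProgramChange(uint8_t program) {
        auto it = scenes_.find(program);
        if (it == scenes_.end()) return false;
        it->second();   // enable this scene's tracks/FX, bypass the rest
        return true;
    }
private:
    std::map<uint8_t, std::function<void()>> scenes_;
};
```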

-Michael

Banned
01-18-2015, 04:00 PM
You might also take a look at apps such as Bidule (and if you'd be on OS X, I'd also heavily recommend looking at Numerology), which excel at this sort of live performance oriented stuff, and can run standalone / as plug-in (while hosting other plug-ins itself) / ReWired. For Bidule, perhaps wait for a new version to appear, then demo it for free for 3 months.

For REAPER's OSC implementation, I'd suggest beginning by (carefully) reading the "Default.ReaperOSC" file which has been installed into the OSC subfolder. And then again. :)
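
To give a flavor: that file maps symbolic action names to OSC address patterns, where a leading character marks the value type (e.g. n for a normalized float, b for a binary toggle, t for a trigger). The lines below only illustrate the style from memory and are not copied from the file; check your own Default.ReaperOSC for the real patterns:

```
TRACK_VOLUME n/track/@/volume
TRACK_MUTE b/track/@/mute
```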

ashcat_lt
01-18-2015, 04:34 PM
Still all sounds like things that either already exist in Reaper or could be done via JS and thoughtful routing.

Every couple of days it seems somebody asks the forum how to change presets in a plugin via program change message. I guess I've never read one of those threads because I don't need it, but I'm sure it's been pretty well figured out by now. Even if it can't be done, it's still not a lot more than bypass and/or mute automation.

Realtime controller re-mapping is easy. Pretty sure there's a JS plug or three that come with Reaper to do it.

Remember that each "track" in Reaper has a whole lot of parallel audio and MIDI channels to work with, and routing between them is pretty easy. Then there's the things you can do with parameter modulation and linking... I really just think you're making this out to be harder than it is. I think it's important to understand the various aspects you've brought up in this thread, but I don't think you need to re-write Reaper to do what you want to do.

mschnell
01-18-2015, 10:10 PM
You might also take a look at apps such as Bidule ...
Some messages above, I listed quite a lot of such dedicated non-DAW VST-hosting programs. I knew about Bidule but did not yet take a look at it, as I found others (e.g. Forte) to be more recommended. I pointed out the reasons why I (and supposedly many other musicians) would prefer Reaper if it could do the task.

For REAPER's OSC implementation, I'd suggest beginning by (carefully) reading the "Default.ReaperOSC" file which has been installed into the OSC subfolder. And then again. :)
Thanks a lot for the pointers !

-Michael

mschnell
01-18-2015, 10:24 PM
Every couple of days it seems somebody asks the forum how to change presets in a plugin via program change message.
Before starting this discussion I asked exactly this in the General forum and expected just a simple pointer on how to do it. I was astonished that nobody came up with such a recommendation, so obviously there is no simple solution. Finally, here, Xenakios explained why there is none, and showed that it supposedly could (only) be done by creating an appropriate VST plugin (meaning it is not impossible). He did a "feature list" of possible tools, but OSCII-bot is missing there, so it might be an additional option.

Remember that each "track" in Reaper has a whole lot of parallel audio and MIDI channels to work with

What do you mean by "MIDI channels"? Really MIDI channels (as used for multi-timbral instruments), or MIDI streams (which are comparable to audio channels)?

and routing between them is pretty easy.
Statically: of course. The question is about doing this dynamically via MIDI program change messages. Do you have an example?

Then there's the things you can do with parameter modulation
We are not in playback mode.

but I don't think you need to re-write reaper to do what you want to do.
Nobody suggested any modification to Reaper itself. We are just discussing using its features (APIs) appropriately.

-Michael

Xenakios
01-19-2015, 01:21 AM
Realtime controller re-mapping is easy. Pretty sure there's a JS plug or three that come with Reaper to do it.


If you are sure, then point out the names of those JS plugins.

It's not easy, convenient, or flexible to do controller remappings in a JS plugin. So even if some JS plugins exist to do that, they'd be limited in what they can do.

My assumption here from the beginning has been that the original poster wants full control over everything without dirty hacks or "outside of the box" thinking. This is the Reaper developer forum after all...

mschnell
01-19-2015, 03:13 AM
http://puredata.info/
This does not seem to be a VST host...

-Michael

mschnell
01-19-2015, 03:49 AM
My assumption here from the beginning has been that the original poster wants full control over everything without dirty hacks or "outside of the box" thinking. This is the Reaper developer forum after all...
Yep, see the post from 01-18, 02:40 PM.

-Michael

mschnell
01-20-2015, 03:12 AM
Trying to gather the necessary information:

I assume the VST to be done needs to use either or both of the "Reaper C++ extension API" and "OSC messages".

I understand that SWS uses the "Reaper C++ extension API", and that OSCII-bot uses OSC messages.

How are the "Reaper C++ extension API" and OSC ( -> http://en.wikipedia.org/wiki/Open_Sound_Control ) related?

Where can I find the exact specifications describing these Reaper APIs?

(I already found "Default.ReaperOSC" and http://www.reaper.fm/sdk/plugin/plugin.php . This does help a lot...)

-Michael

mschnell
01-21-2015, 12:21 PM
http://www.reaper.fm/sdk/plugin/plugin.php

There they say:

"From REAPER, run this action:
[developer] Write C++ API functions header "

I fail to find "[developer]" in the Reaper GUI, and I can't find any more explicit explanation on the Internet.

What am I supposed to do ?


-Michael

Xenakios
01-21-2015, 12:44 PM
There they say:

"From REAPER, run this action:
[developer] Write C++ API functions header "

I fail to find "[developer]" in the Reaper GUI, and I can't find any more explicit explanation on the Internet.

What am I supposed to do ?


-Michael

Use the actions list window to find the action. (Actions in the main menu.) Reaper has tons of actions that don't appear by default in the menus or don't have keyboard shortcuts. The action list is the way to find and manage them.

That will give you the C functions API only, by the way. There's another header that has additional stuff, such as C++ base classes.

You can get both from this link :

http://ge.tt/2ERBxz52/v/0?c

But for the C API, the one generated by Reaper will probably be more up to date.

(As a side note, I am still in the process of making Max/MSP work for me, using the 30-day trial mode. By the way, they have a $10-a-month subscription option too, instead of having to pay the $400 to get the full license. If they didn't have the subscription available, I'd probably not be looking at using Max; the $400 is a bit too steep... Of course if I end up using Max for over 40 months, it would be a bit of a financial loss, I suppose...)

mschnell
01-21-2015, 12:50 PM
Just to keep you informed about what I found up to now:

The extension API is accessed by providing a DLL with a certain name; Reaper will load it when starting. ( -> http://wiki.cockos.com/wiki/index.php/Plug-In_Architecture )
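
For reference, the exported entry point of such an extension DLL looks roughly like the following. The info struct here is a simplified stand-in sketched from memory, and the looked-up function name is only an example; the authoritative definitions are in reaper_plugin.h from the SDK:

```cpp
#include <cassert>

// Simplified stand-in for the SDK's reaper_plugin_info_t; see reaper_plugin.h
// for the real definition (this sketch may differ in detail).
struct reaper_plugin_info_t {
    int caller_version;                   // API version REAPER was built with
    void* hwnd_main;                      // main window handle
    int (*Register)(const char* name, void* infostruct);
    void* (*GetFunc)(const char* name);   // resolve an API function by name
};

// REAPER calls the DLL's exported entry function at load time; returning
// nonzero keeps the extension loaded, and rec == NULL signals unloading.
extern "C" int ReaperPluginEntry(void* hInstance, reaper_plugin_info_t* rec) {
    (void)hInstance;
    if (!rec || !rec->GetFunc) return 0;
    // Example lookup (consult the generated header for real function names):
    void* fn = rec->GetFunc("GetNumTracks");
    return fn ? 1 : 0;
}

// Stand-in host callbacks so the sketch can be exercised outside REAPER:
inline void* fakeGetFunc(const char*) { static int dummy; return &dummy; }
inline int fakeRegister(const char*, void*) { return 0; }
```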

A VST of course is a DLL, as well.

Now I understand that I need to use a VST to be able to read and modify the MIDI stream that is routed within Reaper.

Additionally I need to use the extension API to be able to control things like switching the "bypass" for VST(i)s in realtime.

Hence the planned project needs to result in two DLLs that communicate via a mutual API (one DLL loading the other) or a data stream (e.g. Windows Messages, a named Pipe or TCP/IP).

As already said: not a task for the faint of heart, but rather certainly doable.

-Michael

Xenakios
01-21-2015, 12:58 PM
Hence the planned project needs to result in two DLLs

You can do just a single DLL that will be loaded as a VST plugin. That plugin can then, using "special means", also use the Reaper API functions. Those "special means" are so special though, that I can't at the moment remember how it exactly should be done. It does work though, and I've done some little tests with that a long time ago. The author of the "Playtime" VST plugin is also currently using that technique in his product.

http://www.helgoboss.org/projects/playtime/

mschnell
01-21-2015, 01:05 PM
The action list is the way to find and manage them.

Found it. Reaper is a lot greater than visible at first sight :)

But for the C API, the one generated by Reaper will probably be more up to date.

Yep. The one created by Reaper is more than double the size! (The 32- and 64-bit versions are identical, except for a single comment line.)


(As a side note, I am still in the process of making Max/MSP work for me, ...

( AFAIK, Max/MSP is the most expensive of a great number of available live VST host programs. Why did you choose that one? )

If we get this working, I am sure you and I will be even more happy with Reaper as a live VST host: zero $$$, the well-known Reaper user interface, and - now and later - we can implement any features we might find useful. Another obvious advantage is that you can use existing and easy-to-write "JesuSonic" EEL script plugins in your live setup.

(And maybe we might be able to collect some donations :) )

-Michael

mschnell
01-21-2015, 01:14 PM
You can do just a single DLL that will be loaded as a VST plugin. That plugin can then, using "special means", also use the Reaper API functions.

That of course would make things a lot easier.

In fact a DLL is just a set of named functions that can be called by the host. Hence "in principle" the same DLL file could be both a Reaper Extension and a VST, by providing both sets of named functions. (This might be such a "special means".)

Those "special means" are so special though, that I can't at the moment remember how it exactly should be done.

Of course finding the specs for how to do the "special means" will be the first task to be done, before planning the exact layout of the project.


Thanks for your interest. (Maybe you might want to send me an E-Mail, so that we don't flood this forum with the really project-specific communication.)

-Michael

mschnell
01-22-2015, 04:24 PM
Here: http://reaper.fm/sdk/vst/vst_ext.php#vst_host


They say
"Additional API functions are listed in the REAPER Extension Plug-in SDK. "

So it seems that a VST DLL - additionally to the named functions that are necessary to be a VST plug-in - can optionally provide the named functions necessary to be a REAPER Extension plug-in.

Hence Reaper's Extension API seems to attach not only to DLLs with the appropriate name and location (as described in the "REAPER Extension Plug-in SDK"), but also scans for the REAPER Extension Plug-in function names in any VST when it is loaded.

Seems great !

-Michael

Banned
01-22-2015, 06:20 PM
This does not seem to be a VST host...

-Michael
Well, there is the [vst~] object...

But Pd (or Max/MSP) doesn't need to be a VST host in order to do most things you described; and it seems that you want to use REAPER for its DAW/hosting/sequencer capabilities anyway, so you would only use it to add specific live performance oriented 'data flow' stuff to what REAPER already does. Pure data (or Max/MSP) would e.g. handle live input from MIDI hardware, apply whatever logic you need, and forward OSC to REAPER via localhost (and vice versa, if you require feedback). I have used such setups quite a lot myself, and they work pretty well.

mschnell
01-22-2015, 11:33 PM
I have used such setups quite a lot myself, and they work pretty well.
Yep.

My intention is to use / provide something that feels "monolithic" to the user and is "musician-friendly" to configure, though versatile, and works flawlessly.

Even "Forte", a nice and often-used dedicated live VST host, lacks several features (that Reaper already has), and in a discussion in their forum I recommended "Midi Translator" to add them. For me that is too many $$$ for a non-monolithic solution, so I kept on searching.

-Michael

mschnell
01-25-2015, 11:10 PM
Many thanks for the hint.

Of course I will check this out before starting a new development project.

Thanks again,
-Michael (very astonished that I only now find out about an already existing Reaper "live" add-on, even though I have been searching for such a thing for quite a long time)

mschnell
01-27-2015, 02:42 PM
Veto,

I installed the Extensions and read the "Live Configs" tutorial PDF.

It seems that the tutorial PDF is a bit outdated, as many things in "Live Configs" look different from what is described.

I now want to assign a midi signal (Program Change) to a "CC-Value" (1st Line) in "Config 1", where I already placed a "Track".

I hit "Learn" and can select "Apply..." or "Preload ...". I don't understand what this selection means.


No matter which I select, I then get a box similar to the one described in the PDF, but the "Midi CC" field is grayed out, and sending Midi messages from my keyboard does not have any effect.

Selecting an already working (with a Midi VSTi) track as "Input Track" does not help.

I suppose I somehow need to select which channel and Midi interface the Live Config is supposed to listen to.

Any Help ?
Did you get this working ?

-Michael

mschnell
01-28-2015, 10:38 AM
Sorry for pointing to the outdated manual.
No need to apologize. AFAIK, there is no more recent documentation. I did ask Jeffos for more information. If I indeed get it running, I'm willing to help with updating the "Live Configs" manual (additional to some donation :) ).

Concerning your problem, yes i and many others got it working, but using it with CC values not PC's.
I do not really see the functional difference between CC and PC, but I found a posting by Jeffos on how to convert PCs to CCs. In fact I did not yet understand the details, but it should work. However, I was not able to "Learn" a CC either :(


I dont know if it works with PC's for sure; what you probably would need is a PC-to-CC-value converter in your chain, easily feasible with a tiny JS plug or a VST (pizmidi?) together with the MidiToReaControlPath plugin.

I already found both "MidiToReaControlPath" and "PCtoCC", as Jeffos mentioned both in the said post. I feel that using all these tools on top of each other is rather clumsy. So maybe one day I might do a more dedicated VST that includes "MidiToReaControlPath", "PCtoCC", and "Live Configs" (with permission from Jeffos, as SWS is open source).

Anyway, I'll try "MidiToReaControlPath", "PCtoCC", and "Live Configs" inside of Reaper ASAP.

Or use some external software (oscIIbot script, Pure Data, Bome midi translator?, ...).
I already tested some of these. But piling such programs on top of each other seems even more clumsy.


Or since you are a C++ dev, become a member of the SWS team and extend Live Configs that way. Do you think they are interested in such a project? In fact, IMHO the first viable thing (for me) to do on that behalf would be to update the "Live Configs" manual.

...Thanks a lot for the pointers !

-Michael

mschnell
01-28-2015, 11:50 AM
in the first place, but it might be the case that you did not configure your MIDI device properly. To register MIDI to actions/parameters you need to tick "Enable input for control messages" in the Reaper prefs in MIDI device section (dbl-click your device).
If you figured that out already please ignore my post.

The midi device does work, as a VSTi in a track can nicely be played from the masterkeyboard. I'll check the "Enable input for control messages" option.

EDIT:
=====

After activating "Enable input for control messages" the Midi Learn seems to work (at least the window closes when I move a controller, and the next "Learn" request asks whether to remove the previous assignment).

AND IN FACT: This also happens with program change !!! (and with key-press)

Lets see where this gets me.....



EDIT:
=====

I seem to understand:
- A CC is assigned to a config table, and any value of the controller (0..127) will trigger the event that is described in the appropriate row of the table.
- A certain PC value is assigned to a complete "Config #" table and selects this table to be the active one (there seem to be four of these tables).
That is why I would need to convert PC to CC.
And to do that I need to feed the Midi output of the converter plugin back to the "Control Path" to make Live Configs see it. And this is only possible with a VST such as MidiToReaControlPath.
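The conversion described above can be sketched as a small JSFX (a hypothetical illustration, not the actual PCtoCC script): Program Change messages are swallowed and re-emitted as a CC on a dedicated channel, whose value is the program number, so that MidiToReaControlPath can route that channel to the control path:

```jsfx
desc:PC-to-CC sketch (hypothetical, not the actual PCtoCC script)

slider1:60<0,127,1>CC number to generate
slider2:16<1,16,1>MIDI channel for the generated CC

@block
while (midirecv(offset, msg1, msg2, msg3)) (
  (msg1 & 0xF0) == 0xC0 ? (
    // Program Change: the program number (msg2) becomes the CC value
    midisend(offset, 0xB0 | (slider2 - 1), slider1, msg2);
  ) : (
    midisend(offset, msg1, msg2, msg3); // pass everything else through
  );
);
```

A PC message carries only one data byte, so msg3 is unused for it; everything else is forwarded unchanged.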

BTW, when feeding the Midi events via MidiToReaControlPath, I supposedly don't need to activate "Enable input for control messages" for any hardware device. Correct?

Of course i cant speak for SWS but afaik they are always happy about new contributions, just as the reaper user base would be.
Not least taking into consideration that many users are shying away taking that step from Reascript to full extension coding, myself included :)

I found "Juce-VST", which claims to make it easy to create a VST. I am really interested in trying that... But doing something to be included in SWS seems a good idea as well.

-Michael

mschnell
01-28-2015, 01:00 PM
Update:

I was able to create a new track and add "MidiToReaControlPath" to it, and (after deactivating "Enable input for control messages") I could "Learn" CCs in "Live Configs".

So I do have a starting point to play with.

Just another question:

I do have PCtoCC. When downloading it I assumed it is a JS plugin, but it seems to be just a text file. How do I "install" it?

Thanks for your help,
-Michael

mschnell
01-28-2015, 02:47 PM
(Funny that JSFX files don't have a dedicated File extension....)

Works !!!!

After "Learning", the PC messages from my masterkeyboard are mapped to the controller (I used channel 16, CC 60), and the little dot moves correctly through the rows in the "Live Config" table.

Great !!!
Thanks !!!

BTW: If it is possible for me to become part of the SWS developer team, the really simple idea I would suggest is to add a configuration setting so that the PC messages (which already arrive at "Live Configs" and right now seem to switch the active table #) optionally can be used to fire the action in the row corresponding to the PC number sent with the message.

-Michael

mschnell
01-31-2015, 06:25 AM
As I did not get an answer from Jeffos, I started to rewrite the "Live Configs" manual on my own. I'd like to come back with a PDF ASAP, but I don't know how to handle copyright issues, as of course most of the text is taken from the outdated version :(

-Michael

mschnell
02-01-2015, 03:43 PM
Veto,

Now that I can route PCs as CCs to LiveConfig, changing rows as desired, I am trying to set up multiple tracks with VSTis.

The goal is to use the PC buttons as "sound change", hence the value is not a "program #" that any VST is supposed to digest, but just a "row #" for Live Config.

The idea is that Live Config will enable the appropriate track with the desired VSTi and send a "PC #" that is configured in that row and is of course independent of the row # itself.

Now in track 1, "MidiToReaControlPath" is configured to route channel 16 (which is created by PCtoCC) to ReaControl and other channels to the Midi path (of the track)

The VSTis live in several different tracks and I now would like to route the (remaining) midi stream from track 1 to those tracks.

By default seemingly every track gets the midi input directly from an external device.

How to route the midi stream from one track to another (similar to routing the audio stream) so that plugins in a central track can modify the midi signals before they reach any of the VSTIs ?

In fact I found that I seemingly can set the track to receive from a "Midi Bus" (e.g. B1), but I can't find out what exactly is meant by that or how I can send Midi data from an FX chain to the bus.

Thanks for any pointers
-Michael (trying to setup something decent from the beginning and not use multiple workarounds)

Klinke
02-02-2015, 06:09 AM
That of course would make things a lot easier.

In fact a DLL is just a set of named functions that can be called by the host. Hence "in principle" the same DLL file could be as well a Reaper Extension and a VST, by providing both sets of named functions. (This might be such a "special means".)


I think there is a better solution: it seems that a VST instance can access the Reaper API directly (without the need to write an extension and some communication between both worlds); see the section "Host" on the following page:

http://www.cockos.com/reaper/sdk/vst/vst_ext.php

But I haven't ever tried this myself. If this works, I think everything you need would be possible to write as a VST plugin.

mschnell
02-02-2015, 03:57 PM
it seems that a VST instance can access the Reaper API directly
Yep. I got this impression as well, and there obviously are several VSTs - including "MidiToReaControlPath" - that work that way.

I understand that in a VST, you just provide as well the VST-related named functions and the Reaper API related named functions and Reaper will attach to all of them.

-Michael

mschnell
02-02-2015, 04:01 PM
i did not understand what you're trying to set up...

This issue itself is not directly related to the "Program Change" stuff. I want to implement a central Midi modifier and use its output as input for all other tracks (rather than the midi-in devices Reaper attaches to).

In the Reaper docs I found a (very short) section about Midi routing between tracks using the buses "B1", "B2", ...; I did some testing but was not yet able to really understand how to make use of that. I'll do some more tests ...

Thanks,
-Michael

mschnell
02-05-2015, 03:52 PM
Hi Veto and Xenakios,

(Back from some experimenting...)

In fact I don't really know what the "input track" select does. I left it on "none" and this does not seem to matter.

In fact I was able to solve the Midi Bus routing issue (I will include this in the upgraded Live-Configs description I am doing right now).

Now, I can forward all midi events, but the program changes, to the multiple tracks that contain my audio VSTs.

But this leads to the next question.

With any row that gets selected in Live Config, I not only need to activate exactly one associated track (this does work now), but also send a dedicated Program Change message (plus in future several more midi messages) to the VSTs in that track, to have them switch to the desired sound.

I don't yet understand the meaning of any of the rightmost five fields in the Grid LiveConfig provides. Is it somehow possible to configure midi messages to be sent on activation ?

If not, I suppose I should modify the PCtoCC JSFX script to additionally send out translated program change messages. Or is it viable to use a kind of "configuration" (including all parameters) for each "sound" of each VST? Where and how are those stored?

If so: Is it possible / viable to have a JSFX read an (ASCII) file from disk that holds a list "PC number -> Midi messages to be sent"? Or will I need to put such a list right within the JSFX script? That seems rather inflexible when providing the stuff to others.
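JSFX does have a file API that could be used for this; as a heavily hedged sketch (the directory name "midi_maps" is made up, and the exact semantics of file_text/file_var in text mode should be checked against the JSFX reference), a plugin could offer a file-selector slider pointing into a subdirectory of REAPER's Data folder and read whitespace-separated number pairs from it:

```jsfx
desc:Load a PC-to-CC map from a text file (unverified sketch)

// file slider: lists files found in <resource path>/Data/midi_maps (assumed name)
slider1:/midi_maps:none:Map file

@init
map = 0;   // base index of the plugin's local memory, used as a 128-entry table

@slider
handle = file_open(slider1);
handle >= 0 ? (
  file_text(handle, 1);            // text mode: file_var() parses numbers
  while (file_avail(handle) > 0) (
    file_var(handle, pc);          // each line: "<PC number> <CC value>"
    file_var(handle, cc);
    pc >= 0 && pc < 128 ? map[pc] = cc;
  );
  file_close(handle);
);
```

If that works, the @block section could then look up map[program] when a PC arrives and emit whatever messages the table prescribes.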

What do you think ?
-Michael

mschnell
02-06-2015, 05:09 PM
I'm sorry i'm in a hurry right now so here's just the short version (edit: at least i thought so :)):
I'm very happy that you bother to write down all this !

The "input track" auto-routes incoming midi from your performing device (f.e. midi keyboard) to the track you configured a Live Config row to.
row 1 active: midi device --> input track --> track 454
row 2 active: midi device --> input track --> track 342

Does that mean that, using this, I should configure neither the input device nor a Midi Bus with the tracks themselves?

A problem would be that the program change messages must not be allowed to reach the tracks, as they are meant to control "Live Configs" and not the VSTs in the tracks (which would switch to the wrong programs).

Moreover, I have three Midi input devices that send signals to Reaper. Which would "input track" select as a source?

if you do not know yet what the last 5 columns do, then you probably missed about 80% about Live Configs awesome feature set :)
I am aware of this. In fact I am still not "using" it but trying to understand Live Config and Reaper itself on that behalf.

In fact I think I don't understand what the settings in the last 5 columns do in Reaper and how I create the settings, files and actions that are to be accessed by Live Config. (I never decently used Reaper as a DAW yet, my primary purpose with it right now is a live tool.)

So I have no idea yet what a "track template", an "fx chain" or an "fx preset" exactly is or what exactly an "action" is and what it can do. Of course I'll try to find the Reaper docs, but I hoped you could provide a hint, what of all these complex concepts might be best fitting for what I need firstly.

In fact, a Live Config row activation should unmute a track, which holds one or more VSTs. The VSTs usually would expect PC messages and with those switch to a certain "sound". So one way would be "activate track and send some Midi messages". This looks like the straightforward way for non-Reaper users accustomed to hardware boxes.

I feel that a more "Reaper-like" way might be to save the current "configuration" of the appropriate VSTs (or the track?), which will include all VST parameters including the current program/sound. I understand that a Live Config row can "upload" such a "configuration". But I still don't know which files (or whatever) those are stored in and how I create these files (or whatever).

Of course, being a programmer, "actions" that fire up a script look familiar to me.
But while I do understand that JSFX plugins are done in EEL and that they are "located" in a track's fx chain, I did not yet understand the concept of "Reaper scripts". Moreover I understand that additionally to this also Python user-scripts can be fired by Reaper.



2) save a reaper preset for every sound (done with the "+" button in the fx window)
Ahhhh, I see. You suggest it's better to use saved configs instead of generating midi messages to the VSTs.

I need to check what happens with the "+" stuff before going on...

You now need to make sure to setup the midi control path obviously, to change the rows by your midi controller....
That does work perfectly, including filtering out the incoming PC messages (which are converted and sent to Live Config as CC messages) from the Midi stream, and using a Midi Bus to route the filtered stream to the track and hence to the VSTs. (With "Input Track" in Live Config disabled.)

Thanks a million,
-Michael

mschnell
02-07-2015, 11:46 AM
I see, so you would like to perform on all 3 devices simultaneously while 1 row is activated.
That way you could use a reaper folder track as "input track",
kind of like that:

-input track (folder track)
----midi keyboard 1 (sending on MIDI channel x)
----midi keyboard 2 (sending on MIDI channel x+1)
----midi keyboard 3 (sending on MIDI channel x+2)


Great !
I'm going to experiment with that. In fact I already did some tests with folder tracks, but (apart from the nice graphical layout) I did not find out how the folder hierarchy influences the routing (midi and audio), even though I tried to understand the Reaper documentation on that behalf.


Is one of these also the control device (the PC messages sender)? Or is there a fourth device?

Yep. In the end it's even going to be more complicated (but I will first try to get a simpler version running):
- masterkeyboard 1: sending PCs (and keys and CCs) to one Midi interface on two different channels. Supposedly only the PCs should be distinguished regarding the channels; keys and CCs will simply be merged for both channels, resulting in the same actions.
- masterkeyboard 2: sending PCs (and keys and CCs) to another Midi interface on a single channel (it only sends channel "1").
- Wind controller: sending keys and CCs to a third Midi interface
- Breath controller: sending just CCs to a fourth Midi interface. This data stream needs to be merged with the data stream from keyboard 2 when an appropriate "row" is activated.

Sorry i'm not completely following (anybody?), a rough signal flow chart with
A) your devices you would like to perform on,
see above :) :) :)
B) your plugins you'd like to play
Right now I want to use:
- NI Kontakt VSTi with multiple sample instruments (right now I use it in a mode that selects instruments via PCs, but maybe this is not the best way within Reaper)
- SWAM "Flutes" VSTi by Sample Modeling
- DEXED (free DX7 emulation VSTi)
- effects (such as Convolution Reverb and Delay by Reaper)
C) your device you would like to trigger Live Config with would be an immense help trying to figure out the routing. That would be awesome!
see above :) :) :) (In fact I feel this is a typical keyboarder setup.)
Something to examine: I can imagine that the chance of getting hanging notes when switching Live Config rows would be much smaller (zero?) than when changing sounds directly per program change (depends on the VST of course).
Thanks a lot for the pointer. I'll be carefully following it.
Also you might then be missing other stuff, because I think that's the whole idea of "Live Config": configure your live set off stage and then play away with what you configured beforehand.

...
That is what I am trying to research right now...


... It can currently be written in Python and EEL with the same basic functionality, but each with pros and cons. Perl was also supported until not so long ago, and Lua will be in the near future.
Seems really versatile and complex.... Is it also able to modify midi streams? (Don't answer here. I opened an additional thread with this question.)


JSFX (written in EEL) is meant as a realtime effects processor (audio and MIDI).That is what I already did understand :)

-Michael (Thanks for the great discussion !!!)

mschnell
02-08-2015, 01:21 AM
- if you need to filter out PC messages (used for controlling Live Config) only for that keyboard you would need to set "Pass through source PC" to "No" in the PC-to-CC-value plugin and MidiToReaControlPath to only route CC's to control path on that channel. That way PC's are filtered out of the stream and the remaining MIDI can be used for performing.

I already do this.

I unsuccessfully tried to describe that in a previous message, where I asked why this did not work (using a "Midi Bus"). It now works fine. The bug I introduced was activating the "Midi bus transfer" from one track to the other and forgetting to deactivate the input from the midi device in the target track. Happily the "ReaControlMidi" log nicely showed the duplicate messages to me :)

I understand that this (routing a Midi Bus from the "midi input" track to the "audio VST" tracks) is a strict alternative to using the "input track" feature of "Live Config".

- also dont be fooled into thinking that in one Live Config row you can't set multiple presets for all FX in that track, like i was; it can :)
I did find that feature :).

Keeping this in mind, I will stop using a Kontakt "Multi" (which allows selecting a sound in Kontakt by sending a PC message) and instead save single Kontakt sounds with "+" -> Save Preset.

-Michael

mschnell
02-09-2015, 01:13 AM
Hi Veto:
Just to let you know.

With your help, I got a first installation up and running.

I did
- [1] a dedicated "Midi Input Track" that holds PCtoCC and MidiToReaControlPath
- [2] a track for standard sample sounds with NI Kontakt, not (!) using a Multi, but having Live Configs switch the sounds via "load FX config"
- [3] a track for DX7Piano with two instances of DEXED (slightly detuned) and one instance of ReaVerbate (also ReaControlMidi is necessary to map DEXED's Volume knob to the Volume CC)
- a set of tracks for Pipe Organ consisting of
- - [4] a folder track that holds ReaDelay with two "taps" creating different echoes left and right for an enhanced stereo room feeling, and one instance of ReaEQ
- - [5] and [6] two child tracks that each hold one instance of DEXED (with different pipe sounds, together creating a single sound consisting of a set of stops) and one instance of ReaVerb (each with a different impulse file loaded for the different position of the stops in the room) (plus ReaControlMidi)

With LiveConfig I use three "Config"-Pages, the first activating the tracks [2], [3] and [4] alternatively, and loading FX setups for the plugins in these tracks, and two additional "Configs" loading FX setups for the plugins in tracks [5] and [6].

I don't use the "Input Track" feature in Live Config, but do the midi Routing via a midi "Bus" from Track [1] to all others.

Happily without me tweaking anything, the child tracks of track [4] get deactivated (getting no midi input) as soon as the folder track is muted.

Great !!!

Right now only one controller keyboard and only one input midi channel is active. I am sure that I will succeed in adding the others once the current setup is completely done and verified.

BTW:
What files do I need to save for a complete backup of such a setup ?

BTW2:
I am going to continue working on the updated "Live Configs" documentation, and will come back with a PDF ASAP.

Thanks again !
-Michael

mschnell
02-09-2015, 04:32 AM
great, you got it working :)
It was astonishingly hard, even though the nice SWS plugin could be used.

It would be great if someone (me) could do a dedicated "Live Config" plugin that provides only this part of SWS, but enhanced by the stuff I found necessary to make it usable e.g. by keyboarders, and maybe with a GUI optionally restricted to the stuff needed in that environment, which supposedly is not much more than "select track" and "load FX config", but perhaps making the use of multiple config "pages" for child tracks easier.

(But I will not do a concept before I am done enhancing my setup for multiple controllers and multiple midi channels...)

I would probably just save the project, live configs should be saved to that too.

... including all VST ("FX") configurations ?!?!?

-Michael

mschnell
02-23-2015, 10:52 PM
Hi
Veto and Xenakios:

I did a draft for an update to the Live Config Documentation:

http://www.bschnell.de/LiveConfigs_1.pdf

I would appreciate if you could take a look at it and maybe detect errors, fill gaps or provide any additional comments.

Thanks,
-Michael

mschnell
02-25-2015, 10:41 AM
Veto,

I of course did start a private communication with Jeffos, and in a first message he did approve my intention to update the docs.

Later I sent him the link to the PDF with the same request for revising, but unfortunately he did not seem to have any spare time right now for this.

So I would be really happy if you could take a deeper look and tell me your thoughts.

-Michael

Jeffos
02-27-2015, 05:33 AM
Yes, sorry about that, kinda busy these days...
Anyway, THANKS A LOT for the update Michael! I'll have a look and report back as soon as I can!

mschnell
03-01-2015, 04:56 PM
Hi Jeffos and Veto,

Both for updating the Live Configs "manual" and for completing my own live setup, I urgently need answers to some questions regarding "Live Configs" that I can't figure out myself.

"CC-Delay"

In the "MIDI Controller & Learn" section there is a footnote talking about the delay "Live Config" imposes on reactions to a controller change, which is very useful with e.g. sliders; but in a setup using buttons for controlling the configs, the standard delay is not appropriate and should be reduced.

That is why the footnote introduces the "CC_DELAY" parameter. Unfortunately, it does not seem to exist in my S&M.ini file, and just inserting such a line there does not seem to have any effect. Moreover, I am not sure whether I am really working on the correct S&M.ini file, as the "Main Menu > Options > Open Resource Path" entry (described in the footnote) does not seem to exist either.

Maybe CC_DELAY is not viable any more, as Live Config now features the "Smoothing" and "Tiny Fades" wheels, the meaning of which I don't exactly understand, but which maybe replace "CC_DELAY" altogether.


Controlling FX parameters of multiple tracks simultaneously

For more complex sounds (such as a church organ) we need multiple tracks to work together to create a single "instrument". Here multiple tracks - each featuring a VST instrument - are active at the same time, the output of these tracks is mixed together in a master track that might hold additional effects, and the midi input of the VSTi tracks might need to be preprocessed in a track that holds pure Midi effects and distributes the Midi output.

Obviously it's not possible to use a single line of a Config page to set effect parameters of multiple tracks.

I tried to accomplish this by using multiple Config tables (aka "Configs" aka pages).

Astonishingly it does not seem to be possible to use the same CC to control multiple Config tables by having them "learn" the same CC (and channel). I don't understand why this is blocked. Did I do something wrong ?

I want to avoid forcing a potential reader of the "manual" to do any EEL scripts to accomplish such a rather "standard" - even if not simple - thingy. I think in the end I will provide some useful scripts, but this should not be necessary at this early stage.

BTW: The old manual somewhere states that there should be "up to 8 Configs", but in the new version I seem to be able to select only one of 4 config tables. For the said purpose it in fact might be appropriate to use more than four.

Any help ?

Thanks a lot,
-Michael

mschnell
03-05-2015, 10:21 PM
Anybody ?!?!?

-Michael

mschnell
03-09-2015, 11:19 PM
Sorry to be a PITA, but I really need help on this.

I found a video that might answer some of my questions, -> https://www.youtube.com/watch?v=rPmymNzA78I. But I don't understand enough French.

-Michael

mschnell
03-10-2015, 10:32 AM
CC_Delay == Smoothing-control in newer releases of live config afaik (for up to date specs of live config see also here http://sws.mj-s.com/download/featured/).

I did guess this from the French video...


To get to resource path alternatively run the action "show resource path" from action list.

OK...


If multiple tabs (max are 4) dont work for your usecase consider to make use of reapers routing- and/or grouping-features and live configs input track.

The problem with multiple tabs is that - for a reason that I don't understand - you seemingly can't control multiple tabs with the same CC controller. In fact it does not make much sense to create multiple tracks with the same instruments and just different FX parameter sets, so I need to use Live Configs to set these parameters. And if I run multiple VSTi's in parallel, they each need to be placed at the top of the FX chain in a track, and hence Live Configs needs to set the FX parameters for multiple tracks at the same time. Multiple tabs seem the correct choice here, but they would need to run in parallel.

As a workaround I of course could do a JSFX that sends multiple controller messages when I press a Program Change button. But before doing a workaround I'd like to make sure whether this is really necessary / recommended.
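Such a workaround could be sketched like this (hypothetical; the channel range 13..16 is an arbitrary example): one incoming Program Change is fanned out to the same CC on several MIDI channels, one channel per Live Configs tab, so that each tab can "learn" its own channel:

```jsfx
desc:PC fan-out sketch (hypothetical workaround)

slider1:60<0,127,1>CC number
slider2:2<1,4,1>Number of tabs (channels) to drive

@block
while (midirecv(offset, msg1, msg2, msg3)) (
  (msg1 & 0xF0) == 0xC0 ? (
    // send the same CC on channels 13, 14, ... - one per Live Configs tab
    i = 0;
    loop(slider2,
      midisend(offset, 0xB0 | (12 + i), slider1, msg2);
      i += 1;
    );
  ) : (
    midisend(offset, msg1, msg2, msg3); // pass everything else through
  );
);
```

Whether the tabs actually accept CCs that differ only in channel would still have to be verified against the Live Configs "learn" behavior.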

-Michael

mschnell
03-11-2015, 10:11 AM
Please keep in mind that Live Config is able to select presets of multiple Vsts located in *one* track, so packing everything into one track might be an option.
(that would require different Midi Channels for Vsts, or using "Midi Buses")

Up till now I have had the impression that I can't use multiple VSTis in a single track, as there is no track-internal routing for their audio output, but just a "chain" (that is why it is called that way). But maybe I am wrong.

Thanks a lot. I'll give that idea another try....

-Michael

Banned
03-11-2015, 10:34 AM
Up till now I have had the impression that I can't use multiple VSTis in a single track, as there is no track-internal routing for their audio output, but just a "chain" (that is why it is called that way). But maybe I am wrong.
Yes, fortunately for you, you were wrong there. ;)

You *can* use multiple plug-in instruments in a single track. For 'track-internal routing', you can use up to 64 channels per track, use the "plug-in pin connector" to place any plug-in's audio output on other channels than 1-2, and then route the output from those channels to other tracks using sends/receives.
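The per-track multichannel idea can also be seen from the plugin side: a JSFX declares its pins and can address track channels beyond 1-2 directly. As a minimal sketch (hypothetical, just to show the mechanism), this plugin copies channels 1/2 onto channels 3/4, which could then be routed elsewhere via sends:

```jsfx
desc:Copy track channels 1/2 to 3/4 (sketch)

in_pin:left in
in_pin:right in
out_pin:left out
out_pin:right out
out_pin:channel 3 out
out_pin:channel 4 out

@sample
spl2 = spl0;  // track channel 3 gets a copy of channel 1
spl3 = spl1;  // track channel 4 gets a copy of channel 2
```

The track's channel count must be raised (in the routing dialog) for channels 3/4 to exist, and the "plug-in pin connector" mentioned above controls how non-JSFX plug-ins map onto those channels.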

mschnell
03-11-2015, 02:26 PM
Yes, fortunately for you, you were wrong there.
Great !

I'll give that a try !

-Michael

mschnell
03-22-2015, 02:31 PM
Banned,

I finally was able to check both the "Smoothing" and the complex audio routing within a single track's FX chain.

Both work as you suggested.

I'm going to update the would-be Live Configs manual on that behalf and on several more issues, and will let you know when I have uploaded the next version.

-Michael

Banned
03-22-2015, 03:18 PM
Banned,

I finally was able to check both the "Smoothing" and the complex audio routing within a single track's FX chain.

Both work as you suggested.
Good to hear that it works for you. :)
I'm going to update the would-be LiveConfigs manual regarding this and several other issues, and will let you know when I have uploaded the next version.
Don't bother notifying me about LiveConfigs, as I have absolutely no interest in that feature. In fact, for various reasons I consider REAPER to be absolutely useless for live performances.

ELP
03-22-2015, 06:39 PM
Signed.
My opinion: any DAW and any VST host program which cannot sync externally to a variable musical timebase,
i.e. live tempo changes, is completely unsuitable for live playing/performances.
Unless you like it static, with fixed tempo maps and/or a fixed frame rate ;)
Fine as a better music player, as a good/super DAW mixing console, or as a recorder for live shows,
but for playing hard/soft synths (or whatever) live, you and I :D need the possibility to change the tempo (beats per minute) during playing.
Otherwise it's a time waster.

Today there are only a few "modern" DAWs,
like Ableton or Reason, that can sync as a slave to a musical timebase via MIDI Clock or follow tempo changes via a controller, and perhaps only one very good live VST host, named LiveProfessor, that can do the job for live use.... including external sync to a musical timebase.
REAPER is a super DAW, no question, but "at the moment, @Devs ;) :)" unfortunately totally unsuitable for such a thing. :(

mschnell
03-22-2015, 11:09 PM
My opinion: any DAW and any VST host program which cannot sync externally to a variable musical timebase, i.e. live tempo changes, is completely unsuitable for live playing/performances.
For me, live performance just requires that the delay between hitting a key on the master keyboard and hearing the sound is as low as possible, to allow for the best sync between the musicians. No electronic "sync" is required at all, as there is no pre-produced material in any form.

I know several musicians who use Reaper for that purpose.

Reaper is perfectly suited for this; only the configuration is not exactly easy ;) And I would be happy to help by improving the LiveConfigs docs according to the experience I gain with the kind support from this forum.

-Michael (hating loop-based and similar production which is "half-live" or short "Zombie Music" :) )

ELP
03-23-2015, 02:59 AM
"No electronic "sync" requested at all, as there is no pre-produced material in any form."

The problem begins already if, for example, you play soft/hard synths with arps
or anything else inside that acts with a tempo.... and that's a lot.

And yes, of course you can use REAPER for live performances, but then the playing/performance is very static, like from a tape or a playback performance ^^ .. you can't really vary the tempo, the feel etc. with all the other musicians on stage.


Nevertheless, I think it's obviously good if someone bothers and cares about this.
And who knows, maybe the impossibility of tempo sync as a slave, or of a simple live tempo change via controller, in REAPER will change one day....
I hope so.

mschnell
03-23-2015, 07:11 AM
you can't really vary the tempo, the feel etc. with all the other musicians on stage.
Obviously my meaning is not yet clear.

When playing "Live" (as I understand it), Reaper does not "run". It's just used in stop mode with tracks armed and "Monitor" on. So it does not impose or even know any "tempo". It's just like playing an acoustic piano.

Regarding machine-generated arpeggios, I in fact don't know how they could be synchronized with a drummer who plays an acoustic set and whose job it is to dictate the tempo.

-Michael

Banned
03-23-2015, 08:42 AM
When playing "Live" (as I understand it), Reaper does not "run". It's just used in stop mode with tracks armed and "Monitor on". So it does not impose or even know any "tempo".
Well, that's only *your* use case. Some other users wish to use REAPER not only as a host, but also as a sequencer - unfortunately, REAPER sucks at that task.

(Btw, tempo-synced plug-ins *can* also follow REAPER's tempo while its transport is not playing, afaik.)
[...] I in fact don't know how they could be synchronized with a drummer, playing on an acoustical set and being the one who has the job to dictate the tempo.
It's not very hard to generate a sufficiently accurate MIDI clock with a varying tempo. But that discussion seems moot here, since you don't care about it, and REAPER does not support it.
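A minimal sketch of that point: MIDI Clock runs at 24 ticks per quarter note, so generating it with a varying tempo is just a matter of advancing the next tick time by 60 / (bpm × 24) seconds on each tick. The `tempo_curve` callback and the constant-tempo example below are invented for illustration; a real implementation would additionally need a low-jitter timer.

```python
def clock_tick_times(tempo_curve, n_ticks):
    """Return absolute timestamps (seconds) for n_ticks MIDI Clock ticks,
    where tempo_curve(t) gives the tempo in BPM at time t.
    MIDI Clock is defined as 24 ticks per quarter note."""
    t, times = 0.0, []
    for _ in range(n_ticks):
        times.append(t)
        t += 60.0 / (tempo_curve(t) * 24.0)   # seconds to the next tick
    return times

# Constant 120 BPM: a tick roughly every 20.8 ms.
steady = clock_tick_times(lambda t: 120.0, 3)
```

Feeding a slowly varying `tempo_curve` (e.g. one driven by a tap-tempo controller) yields a clock that accelerates or decelerates smoothly between ticks.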

mschnell
03-24-2015, 03:44 AM
But that discussion seems moot here, since you don't care about it, and REAPER does not support it.
In fact I don't need it for my music playing, but as I am a software developer as well, I of course am technically interested in such "enhanced" use of Reaper (as a sequencer) and will keep watching what might happen on the "external sync" issue (there will be version 5 soon ?!?!?).

I could e.g. imagine things like a "Sampler" firing the playback of an item or track (and sub-tracks) on a MIDI note-on event, converting note value, velocity, aftertouch, CC etc. to playback parameters and MIDI output to effect VSTs.
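A toy model of that "Sampler" idea, to make it concrete: a note-on event selects the item to fire, and its note and velocity are converted into playback parameters. The function name and the particular parameter mapping are invented for this sketch.

```python
def note_on_to_playback(note: int, velocity: int):
    """Map a MIDI note-on to hypothetical playback parameters:
    the note chooses one of 16 item slots, velocity scales the gain
    (0.0-1.0), and the note's distance from middle C (60) becomes a
    pitch offset in semitones."""
    return {
        "item_index": note % 16,        # 16 assignable item slots
        "gain": velocity / 127.0,       # full velocity -> unity gain
        "pitch_offset": note - 60,      # semitones relative to middle C
    }
```

Aftertouch and CC events could be mapped in the same style to continuous playback parameters (e.g. filter cutoff on an effect VST downstream).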

Here external sync for arpeggio is just around the corner.

-Michael

Banned
03-24-2015, 03:59 AM
In fact I don't need it for my music playing, but as I am a software developer as well, I of course am technically interested in such "enhanced" use of Reaper (as a sequencer) and will keep watching what might happen on the "external sync" issue (there will be version 5 soon ?!?!?).
Oh, good to hear you *do* care, even if you don't need it yourself (I would imagine you *could* have a use for e.g. tempo-synced delay time plug-in parameters, even without need for a sequencer, though). :)

Still, nothing will happen, probably, given the 'live with it' status of bug reports like this one (http://forum.cockos.com/project.php?issueid=4462). REAPER's design as a sequencer simply seems to be flawed from the bottom up.

mschnell
06-10-2015, 09:31 AM
Wondering: if LiveConfigs is so useful, why does Evangelos Odysseas Papathanassiou not use it? Is Evangelos stupid?

He obviously has enough money to use a lot of dedicated hardware devices. :)

Of course there also are several dedicated software products that are easier to use for exactly this task than Reaper plus LiveConfigs.

The sexy thing about LiveConfigs is that you get it for free if you already have Reaper (which itself is a lot cheaper - and of course a lot more versatile - than the said dedicated products). And if you already use Reaper, LiveConfigs does not present too steep a learning curve.

-> www.bschnell.de/LiveConfigs_1.pdf

(Please do provide comments on the document if possible... )

-Michael