As this is an initial alpha preview release there are a few things missing, and probably a few things unintentionally broken. Notably, there is not yet a GUI for creating custom banks, so you'll have to edit bank files by hand and learn a bit about Reaticulate's custom markup used to extend reabank files. This is all documented, although the documentation, as with everything else, is a WIP.
If you can stand my rambling, I've done a video to cover the features and operation of Reaticulate. Even if you tune out after the 2 minute mark, you should have a pretty good sense of what it's about.
I welcome all bug reports, suggestions, and other feedback, either here, or on the GitHub issue tracker.
[Cross-posted on the Reaper forum, vi-control.net, and thesoundboard.net. Apologies if you frequent multiple forums and this is feeling spammy.]
This looks really promising. I watched the whole video.
When I compose, I like to perform the notes using a general purpose articulation (a short attack with sustain) and then I go back afterwards and draw in the key switch notes I want for each note or group of notes. C2 might be sustain, E2 for staccato, F2 for pizzicato for example.
I see in your picture, below the piano roll, the articulations are displayed. Can I draw in the articulation I want below the piano roll or do I have to record my key switch selections to get them to appear there?
Another question. Sometimes I'll record a violin, for example, and after I've programmed my articulations (which are key switch notes below the violin range) I decide I want to copy those same MIDI notes to cello and then shift them down an octave or two. I also copy the key switch notes from violin to cello and then shift those notes above the cello range (which is where the key switches for my cello are active).
With your Reaticulate plugin, can I still do something like that, where I can copy MIDI notes from one instrument to another and also copy your articulations from one instrument to another? I would imagine that would be 2 separate copy operations. One for the MIDI notes, one for the articulation changes.
__________________
Paul Battersby Virtual Playing Orchestra - free sample library @ http://virtualplaying.com
This looks really promising. I watched the whole video.
Amazing!
Quote:
Originally Posted by pbattersby
I see in your picture, below the piano roll, the articulations are displayed. Can I draw in the articulation I want below the piano roll or do I have to record my key switch selections to get them to appear there?
You can step record them, as long as step input is enabled in the MIDI editor. At that point, after clicking on one of the articulation buttons or triggering one of the "activate articulation" actions (e.g. from a tablet running TouchOSC, or whatever), it will insert the articulation at the edit cursor position, or replace an existing program event if one is currently selected in the MIDI editor.
With your Reaticulate plugin, can I still do something like that, where I can copy MIDI notes from one instrument to another and also copy your articulations from one instrument to another? I would imagine that would be 2 separate copy operations. One for the MIDI notes, one for the articulation changes.
Two separate select actions anyway (you'll need to shift-select the program change events after you select the notes), and then you can copy and paste them both at once.
Unlike with CCs, Reaper doesn't have a mode to automatically select program changes underneath selected notes. I plan to investigate the possibility of adding a feature to have Reaticulate optionally do this (tracked here), so you get a similar behavior to CCs.
When I compose, I like to perform the notes using a general purpose articulation (a short attack with sustain) and then I go back afterwards and draw in the key switch notes I want for each note or group of notes. C2 might be sustain, E2 for staccato, F2 for pizzicato for example.
Oh, related, I also intend to add an action to scan MIDI items and convert manually triggered keyswitches to program events (and that one is tracked here).
So you could do your live performing as described, trigger the action, and it'll convert all the keyswitch notes (C2, E2, etc) to program changes for easy visibility in the MIDI Editor. Then you can go back over them and massage via step input.
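For what it's worth, the core of that conversion is just bookkeeping once the keyswitch map is known. A pure-Python sketch (ReaScript supports Python as well as Lua; the map below is hypothetical, assuming C2 = MIDI note 36 and UACC-style program numbers):

```python
# Hypothetical keyswitch map: C2 -> sustain (1), E2 -> staccato (40),
# F2 -> pizzicato (56), assuming C2 = MIDI note 36.
KEYSWITCH_TO_PROGRAM = {36: 1, 40: 40, 41: 56}

def convert_keyswitches(notes, ks_map):
    """Split (pitch, ppq) note events into surviving notes and
    program-change events derived from recognized keyswitch pitches."""
    kept, programs = [], []
    for pitch, ppq in notes:
        if pitch in ks_map:
            # replace the keyswitch note with a program change at its position
            programs.append((ks_map[pitch], ppq))
        else:
            kept.append((pitch, ppq))
    return kept, programs
```

Inside REAPER the same loop would read and write events through the MIDI take API rather than plain lists.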
I thought I'd mention in case it wasn't clear: to have the program changes show in the MIDI editor, just add a lane that's set to "Bank/Program Select"
Pretty impressive pack, and it even has a dedicated website! Well done!
It seems a pretty good alternative to BRSO Articulate and sfer's alternative articulation manager, especially because the development is open (in fact, these alternatives behave very differently too).
Pretty impressive pack, and it even has a dedicated website! Well done!
Thanks!
Quote:
Originally Posted by X-Raym
It seems a pretty good alternative to BRSO Articulate and sfer's alternative articulation manager, especially because the development is open (in fact, these alternatives behave very differently too).
Yes I did look closely at BRSO Articulate some time ago. I never did donate for Stephane's track inspector, but I believe we took more similar approaches than BRSO (using program changes as the basis for the solution).
Ultimately the closed nature of both these projects was a deal breaker for me, because I really wanted additional functionality. Had they been open -- or at least Stephane's solution -- I almost certainly would have contributed improvements rather than do my own thing.
In the end though I think my own thing ended up solving the problem better for my particular workflow. Also I suffer helplessly from NIH syndrome so it kind of worked out.
Wow, thanks for free and open source!
Could we add those articulations in the notation view so it would play correctly? Then you could compose the articulations.
Spent a little time trying this out and it's very promising indeed. Really well thought out, well documented and stable through my initial testing - great quality work for an alpha preview.
I'll knock up a bank or two for some Embertone instruments and let you know how I get on.
Could we add those articulations in the notation view so it would play correctly? Then you could compose the articulations.
I'm going to wait and see how articulation maps pan out (maybe in Reaper 6?), which should hopefully facilitate that kind of integration in the notation view. Other approaches will surely be deprecated by articulation maps, so I don't want to invest a lot of time in that now.
Spent a little time trying this out and it's very promising indeed. Really well thought out, well documented and stable through my initial testing - great quality work for an alpha preview.
Thanks! I'd love to see those Embertone banks when you're done. I'm interested in fleshing out the factory banks for a better out-of-box experience. (I have the Jubal flute, Herring Clarinet, and Joshua Bell violin but haven't gotten around to doing banks for those yet.)
In general, community bank submissions are most welcome.
Pretty impressive pack, and it even has a dedicated website! Well done!
It seems a pretty good alternative to BRSO Articulate and sfer's alternative articulation manager, especially because the development is open (in fact, these alternatives behave very differently too).
Thanks for sharing :P
Absolutely X-Raym! I've been trying it since yesterday and the concept is so well thought out. Jason, you are brilliant!
Wow, this is really well presented and looks like a LOT of work has gone into it. Thanks so much for being so generous with your time!
I really don't have time at the moment to get properly stuck in, but wanted to give some words of appreciation and encouragement. Maybe in the new year I'll add some mapping for LASS First Chair and the Sample Modelling instruments I've got, if no-one else has by then.
I'm going to wait and see how articulation maps pan out (maybe in Reaper 6?), which should hopefully facilitate that kind of integration in the notation view. Other approaches will surely be deprecated by articulation maps, so I don't want to invest a lot of time in that now.
I guess familiarity more than anything. Bome MIDI Translator was already an integral component in my MIDI workflow so it was a natural extension.
Reaper's notation view already has the following articulations:
-accent
-fermata
-marcato
-portato
-staccatissimo
-staccato
-tenuto
A .lua script that syncs between these events could do the trick, as an early field of experimentation. I mean syncing from notation to Reaticulate events.
Code:
if notation_articulation_available and sync_on then
    -- in time selection
    -- sync from notation to Reaticulate events
end
A .lua script that syncs between these events could do the trick, as an early field of experimentation. I mean syncing from notation to Reaticulate events.
It could work as an experiment. In practice the number of articulations available to us in many of these orchestral libraries vastly exceeds the built-in articulations, but we could extend it with custom notation events.
Ideally, in fact, these notation events could be used directly to trigger articulation changes in the underlying patches (translated by the Reaticulate JSFX as it currently does for program changes).
The problem though is that the articulation (or other custom notation) events seem to be sent after the notes they decorate, not before.
Hopefully that's fixed by the time articulation maps lands. In the meanwhile, I think I'd rather not have any attempt at notation view integration at all than to do something half baked.
It could work as an experiment. In practice the number of articulations available to us in many of these orchestral libraries vastly exceeds the built-in articulations, but we could extend it with custom notation events.
Ideally, in fact, these notation events could be used directly to trigger articulation changes in the underlying patches (translated by the Reaticulate JSFX as it currently does for program changes).
The problem though is that the articulation (or other custom notation) events seem to be sent after the notes they decorate, not before.
You don't have to do it yourself; someone else with enough interest in experimenting can. I asked kawa_, and he didn't want to, but maybe someone else will, over time. We will see.
I would keep the original design using program change/bank select events, as this is very general, so very good. It would work with any synth that supports bank select and program change, like Synth1 or Oatmeal. Not everyone wants to compose and chase after Hans Zimmer. Better would be a simple syncing .lua, or other extra .lua scripts dancing around those bank select, program change, and notation articulation events. Regarding events landing after the note instead of before it: yes, the sync .lua could behave a bit intelligently and jump ahead of the preceding note, so the event would again come before the note. Possible? Yes. Just open some Beethoven score, look at it purely as pictures to see where and how often articulations are used, do something similar in the notation view, see how it sounds, and experiment.
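The "jump ahead of the note" placement described above is easy to express. A sketch, where the notation-to-program map is purely illustrative (real numbers would come from the bank, and only staccato's value below follows the UACC-style convention discussed earlier):

```python
# Illustrative map from Reaper's built-in notation articulations to
# program numbers (values are placeholders, not a standard).
NOTATION_TO_PROGRAM = {"staccato": 40, "tenuto": 52, "marcato": 43}

def sync_notation(notation_events, lead_ppq=10):
    """notation_events: (articulation_name, note_ppq) pairs.
    Emit a program change lead_ppq ticks before each decorated note,
    clamped at zero, so the switch lands before the note sounds."""
    out = []
    for name, ppq in notation_events:
        prog = NOTATION_TO_PROGRAM.get(name)
        if prog is not None:
            out.append((prog, max(0, ppq - lead_ppq)))
    return out
```

Unmapped articulations (fermata, say) are simply skipped, which mirrors the "small experiment, not a finished solution" spirit.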
Yeah, the libraries have many more articulations, that's true; that is why it would only be a small experiment, not a fully finished end solution, just a beginning, only to see how it feels to compose directly in the notation editor while creating your notes. The workflow could be:
-open MIDI editor/notation view, add notes, add articulations (both using mouseover plus a qwerty key)
-hit ENTER (that would switch over to arrange, run the notation-to-Reaticulate sync, plus optional PLAY)
-hit ENTER again, to switch back to the MIDI editor/notation view
Hi Tack, yesterday I checked Reaticulate, creating a new virtual instrument with the Kontakt factory banks and Spitfire Symphonic Strings, and it worked perfectly (the structure of the default factory banks is not the best). The part about creating my own banks is tricky for me, so I haven't tried it yet.
Then I checked Reaticulate with one of my orchestral templates and it works too, no problem at all. Anyway, I have a doubt: my templates consist of instances of Kontakt, and inside each of them I have 16 tracks. First I tried using it on the Kontakt track, and then on each of the 16 tracks. Maybe I am wrong, but I see that you are using just Kontakt instances without tracks inside. Is this the best way to use Reaticulate? Should I change my templates to use Reaticulate efficiently?
I haven't found any bugs so far; I will try with different libraries this weekend. Maybe I would like it narrower... but this is really subjective, I guess.
(the structure of the default factory banks is not the best)
What would you recommend to improve it?
Quote:
Originally Posted by Vagalume
Anyway, I have a doubt: my templates consist of instances of Kontakt, and inside each of them I have 16 tracks. First I tried using it on the Kontakt track, and then on each of the 16 tracks. Maybe I am wrong, but I see that you are using just Kontakt instances without tracks inside. Is this the best way to use Reaticulate? Should I change my templates to use Reaticulate efficiently?
I don't think it should matter which approach you go with. I prefer to have a separate Kontakt instance per track for various reasons (simplified routing; sharing audio FX, automation, and MIDI on the same track; better compatibility with disabled tracks; reduced overall track count) but it should work either way. With the Kontakt Group approach you described, you'd still install the RFX (Reaticulate FX) on each of the 16 tracks and set up the track bank(s) according to how you have the patches arranged in the Kontakt instance on the relevant channel used by that track. I'll double check that works as expected later tonight.
Quote:
Originally Posted by Vagalume
maybe I would like it narrower
You can use ctrl-mousewheel to adjust the overall zoom level of the UI. I realize the box of 16 MIDI channels takes up a good amount of horizontal real estate. At some point I'll look at how to make it reflow a bit better for narrower panels.
How set is your bank format at this point? Would it be better to wait before spending much time writing additional user banks?
Edit:
Another question, I may be missing something obvious but is there a way to add the articulations to the track other than recording while clicking them (or step recording them)?
I'll echo the need for this to sync with the notation view. I get that articulation maps are coming at some point, but I think you could have a separate (optional) script that copies the bank entry to notation or vice versa.
I might try to cobble together a proof of concept unless you have any objections.
Edit:
Another question, I may be missing something obvious but is there a way to add the articulations to the track other than recording while clicking them (or step recording them)?
I'm not 100% sure if I got this right, but I think the numbers in front of the articulation represent the program change number, which can be triggered by a MIDI controller / Lemur / TouchOSC.
How set is your bank format at this point? Would it be better to wait before spending much time writing additional user banks?
Difficult to say. I don't have any plans to change the bank format in a non-backward-compatible way -- it should be pretty extensible -- but I can't rule out major revelations from the community here that might up-end things.
Quote:
Originally Posted by gmgmgm
Another question, I may be missing something obvious but is there a way to add the articulations to the track other than recording while clicking them (or step recording them)?
There are a number of "activate articulation" actions. Open the actions list and search for Reaticulate to find them.
I also demonstrated a few in the video: actions for next/previous articulation (which I use from my control surface), an action to scroll through the list (to use via an encoder say, or the mousewheel if you prefer), and also actions to activate articulations by specific program number, which you could do from e.g. a tablet.
When you trigger these actions with the MIDI editor open and step input enabled, then they will be inserted in the MIDI item. Otherwise, they should be captured via live recording. Or, if not recording or step-inputting, you can of course activate them just for ad hoc use.
Quote:
Originally Posted by pcartwright
I might try to cobble together a proof of concept unless you have any objections.
Certainly no objections. I'd be interested in seeing that code, and if it's something that doesn't require a lot of effort, I could look at integrating back into Reaticulate. It'd be easy enough to have articulation activation insert not just a program change event but also a text event for notation, for example.
Quote:
Originally Posted by _Stevie_
I'm not 100% sure if I got this right, but I think the numbers in front of the articulation represent the program change number, which can be triggered by a MIDI controller / Lemur / TouchOSC.
Reaper can't step record program events unfortunately (or at least it didn't when I tried last year). So the right way to trigger an articulation is through one of the actions I mentioned earlier (or by clicking on the articulation button in the GUI, but that's not always the most practical approach).
There are a number of "activate articulation" actions. Open the actions list and search for Reaticulate to find them.
I also demonstrated a few in the video: ...
When you trigger these actions with the MIDI editor open and step input enabled, then they will be inserted in the MIDI item. Otherwise, they should be captured via live recording. Or, if not recording or step-inputting, you can of course activate them just for ad hoc use.
Thanks for the response - I apologise, I should have been more specific in my question. I had watched and enjoyed your video and seen the various methods you demonstrated. (I found it quite concise btw, you have a talent for video instruction) They're great but seem focused on using Reaticulate in live performance/recording.
I suppose I was hoping for an insert method better suited to messing about in the MIDI editor with an already-recorded track. Something like "right click the articulation in the Reaticulate gui to insert it at the current time position", perhaps.
I guess this would only avoid having to put Reaper in/out of step record mode, but I tend to forget to switch it off - and also I'm a lazy, lazy man.
I guess this would only avoid having to put Reaper in/out of step record mode, but I tend to forget to switch it off - and also I'm a lazy, lazy man.
Yes, exactly, and I feel it would violate the Principle of Least Astonishment if activating articulations started injecting events into open MIDI items with step input disabled. I personally do a lot of fiddling around with patches while the MIDI editor is open. I have the step input toggle bound to an easily accessible key and I use it often.
Quote:
Originally Posted by gmgmgm
(I found it quite concise btw, you have a talent for video instruction)
Thanks for that. I find I'm terribly self-conscious about going on for too long. I try to be brief but routinely fail.
I wanted to underline one of the goals for factory banks.
This is an excerpt from the documentation that I'll just quote directly:
Quote:
Although the program numbers are arbitrary and don’t influence any specific behavior, some form of standardization is recommended because this allows using the Reaticulate_Activate articulation by CC actions to trigger a given articulation (or at least its closest approximation) from a control surface, tablet, etc., no matter the underlying instrument.
So, for example, by consistently using program 42 to map to spiccato, or some similar very short articulation, you could have a control surface send CC value 42 (via a CC number of your choice bound to the Activate Articulation action) to set spiccato, no matter what track is selected.
I talk a bit about this in the video as well (starting at 18:48) and explain the reasoning and benefit of doing this.
UACC has got problems, but I think it's mostly good enough for these purposes. I did think of doing my own spec for articulation mapping, but, well, oblig.
Of course you can do whatever you want for user banks, but if you're sharing with the goal that your contribution is merged into the factory bank -- something that I'm absolutely encouraging and thrilled to see as I'd love to flesh these out for other users -- I just ask that you follow UACC where possible and follow the other recommendations in the docs.
Other key points: articulation names are lower case (except to differentiate between e.g. trill m2 and trill M2), and group 1 should be used for the main articulation group (the one with most articulations that control the instrument) rather than things like legato on/off.
UACC has got problems, but I think it's mostly good enough for these purposes
How would you prefer to handle cases where no UACC standard applies?
For example, in rewriting my Fischer Viola bank to conform, the various vibrato styles don't really fit. Assigning the default style to 16 seems right, but should the other four possibilities be assigned in the UACC FX range (90 - 99) or perhaps to a number outside current UACC (> 112)?
A similar case is controls that do nothing but turn off articulations.
Or should this sort of functionality (not really articulation) remain outside banks aimed at inclusion in the factory banks?
Or should this sort of functionality (not really articulation) remain outside banks aimed at inclusion in the factory banks?
Oh, no, it definitely belongs. Vibrato styles certainly qualify as articulation (perhaps even more traditionally correct compared to the way we tend to use it with VIs these days?). The goal of the bank should be to make the patch usable entirely via Reaticulate.
It really is a judgment call. For those programs that aren't clear cut, I match them as best as I can, and if it's completely in left field, yeah, pick either an unassigned program, or an uncommon program if you run out.
The ones I personally feel are more important to get right:
1: long normale (non-legato for chords)
7: long muted (e.g. con sordino)
8: long soft (sul tasto, or flautando, or hollow, or something played at an unusually soft dynamic)
9: long hard
11: tremolo/flutter
20: legato
40: staccato or generic short
42: spiccato or very short
56: pizzicato
70-74: trill
Of course many patches have multiple articulations that could qualify, so in those cases I assign the one I think is most representative or common in the context of the patch.
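For anyone writing banks, the list above as a quick lookup table (straight from the post; trills collapsed into a range check):

```python
# The "important to get right" program assignments listed above.
CORE_PROGRAMS = {
    1: "long normale (non-legato for chords)",
    7: "long muted (e.g. con sordino)",
    8: "long soft (sul tasto / flautando / hollow)",
    9: "long hard",
    11: "tremolo/flutter",
    20: "legato",
    40: "staccato or generic short",
    42: "spiccato or very short",
    56: "pizzicato",
}

def describe_program(num):
    """Return the conventional meaning of a program number, if any."""
    if 70 <= num <= 74:
        return "trill"
    return CORE_PROGRAMS.get(num)
```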
There's something I haven't quite settled on philosophically -- I think that uncertainty shows through in the factory banks right now -- which is what to do with patches that use mode toggles (e.g. legato on/off, con sordino on/off) as opposed to discrete articulations (long normale, long con sordino, legato normale, legato con sordino, etc.) like Spitfire does.
One idea I quite like in practice is activating a single program and knowing exactly what you're going to get. Especially the ones listed above. For example, I like being able to send program 1 and know that, no matter what, I can play chords; or program 20, and know, if the patch supports it, I can play a monophonic line with legato transitions. Or program 42, and I'll get biting shorts.
This is useful for sketching contexts -- quickly activate the basic sound you want to achieve without farting around with multiple articulation keys -- and then you can go in later and massage the MIDI data.
I was not always true to this idea in the factory banks (e.g. with the Bohemian and Cinematic Studio Strings), and I think I'm regretting that.
In these cases, I've been considering creating separate companion banks for the purpose of sketching, and avoiding the problematic program numbers in the main bank.
So, take for example Cinematic Studio Strings, where we have these programs:
20: legato on (group 2)
19: legato off (group 2)
7: con sordino (group 3)
2: senza sordino (group 3)
1: sustain (group 1)
This means it isn't the case that I can send program 20 and know I'll be able to play a legato normale line, because if the patch is currently on, say, staccato, I'll need to additionally activate sustain. But that means sending program 1, where the expectation is that I should be able to play chords.
It's a real mess.
Instead, I think it's perhaps wise to avoid programs 1, 7, and 20 in the main bank and reassign these to something else. Then have a companion bank, which could be loaded alongside the main bank on the track, to implement these "one program to rule them all" combination programs, like what I ended up demonstrating in the video.
Maybe this does argue for something more precisely specified than what UACC is offering. I want to have my cake and eat it too: consistency and uniformity across different libraries from different vendors, and be able to support a varied ecosystem of different implementation approaches by those vendors.
There's me rambling again. If you followed that stream of consciousness and you have some opinions on the matter, I'm quite interested.
If you followed that stream of consciousness and you have some opinions on the matter, I'm quite interested.
Well, I think you may need to stipulate at least a subset standard, even within the guidelines of UACC. The list you've given may be a good start. For example, I looked at the UACC spec, saw a few options and settled on 17 for sul tasto. If you'd prefer 8 to be the standard within Reaticulate you may need to specify that.
There's just too many ways to organise this stuff. I wrote Cubase expression maps for several libraries and tried to standardise, but the various functionalities and implementations among the manufacturers just made it a dog's breakfast of compromises.
You've obviously given this considerable thought. I'd recommend doing whatever makes the most sense to you, for the way you intend to use Reaticulate. Let others know what you decide so that they can assist if they want to.
It's already a great contribution to the Reaper community.
Program 8 was perhaps a bad example. Indeed, 17 is pretty unambiguous for sul tasto (and there are piles of precedent in the factory banks). I think no matter how you slice it, "a dog's breakfast" is a fine description.
The problem with doing so many combination programs, as you stated above, so that there aren't a whole lot of different controls to be applied at one time (the clearest example for me being the legato/non-legato switch), is that if you spell out all the combinations to take care of the multiple outputs (with notes and CCs to match), you end up with twice as many programs.
Now, one of the powers I like in your solution is that you have given us 10 actions ready to go straight to the articulation one wants; this, in combination with another layer of abstraction, seems to make the most sense to me.
This extra layer can be many things:
- Menu buttons, or actions: basically duplicating and renaming (or regrouping) those actions to whatever you want, with sets created per library.
- OSC messages (again, basically sending the same thing, but arranging the display so the correct names show remotely).
- Web Remote (same thing, calling up the 10 actions with the correct names depending on the library).
In this way, the level of abstraction is left to the user, but you make the most out of those 10 actions, as otherwise you'd need twice as many. Of course it is not perfect, it is more work, and when switching from a legato articulation to a non-legato one you'd need to toggle two buttons rather than one, but if it's something you do often you could just assign one button that triggers both actions.
Still, as I said, it is one more layer and more work for the user, but let's not forget that these options are there to use.
The problem with doing so many combination programs, as you stated above, so that there aren't a whole lot of different controls to be applied at one time (the clearest example for me being the legato/non-legato switch), is that if you spell out all the combinations to take care of the multiple outputs (with notes and CCs to match), you end up with twice as many programs.
And that's with just legato on/off. Throw in con sordino on/off, or, say, in the case of Embertone's Friedlander, sul ponticello on/off (since it's implemented through convolution), and you have a combinatorial explosion of permutations.
I think, though, there is a good reason to provide some consistency for a set of very common articulations for the sketching use case: a strict set of programs that emit all the necessary output events to get the patch into exactly the state defined by the program (e.g. program 1: non-legato long sustains, senza sordino), and that Just Work no matter what track is selected. All other programs, although perhaps still adhering to UACC where there's a reasonable match, would be more lax, in that they don't take extra care to set things like sordino, legato, etc.
I'm warming up to the idea. There's a gap in Reaticulate's functionality that I'd need to close, which is that if a program emits an output event to set, say, legato off, and there is another specific "legato off" program in another group mapped to that output event, Reaticulate won't realize it should activate that program in the other group. That should be easily fixed, and then it would allow these "all inclusive" programs to co-exist with the more narrowly defined programs that more closely align with the patch's true behavior.
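That cross-group sync boils down to a subset check over emitted output events. A sketch with a deliberately simplified data model (not Reaticulate's actual internals; the event labels are made up):

```python
def resolve_active(activated, banks):
    """activated: the (group, program) pair the user triggered.
    banks: {group: {program: frozenset(output_events)}}.
    Returns every (group, program) that should now display as active:
    the triggered program, plus any program in another group whose
    outputs were all emitted by the triggered one."""
    group, program = activated
    emitted = banks[group][program]
    active = {activated}
    for other_group, programs in banks.items():
        if other_group == group:
            continue
        for prog, outputs in programs.items():
            # another group's program is implied if all its outputs were sent
            if outputs and outputs <= emitted:
                active.add((other_group, prog))
    return active
```

So a combination program in group 1 that emits a "legato off" event would also light up a dedicated "legato off" program in group 2.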
Quote:
Originally Posted by benigno
Now, one of the powers I like in your solution is that you have given us 10 actions ready to go straight to the articulation one wants; this, in combination with another layer of abstraction, seems to make the most sense to me.
And that took some doing.
Early incarnations of Reaticulate defined the type of output event at the bank level (e.g. note, CC and which CC number, destination channel number) and then used the program number to signify the value for the output event (e.g. if the bank's output type was note, then program 42 would send note 42). I had a minor breakthrough that made me realize I could allow for multiple specific output events for each individual program. This does allow for some great flexibility, as you point out.
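To make the contrast concrete, here is a sketch of the two models described above (data shapes are hypothetical, not the real bank format):

```python
# Early design: the bank fixed one output type, and the program number
# itself was the value to send.
def old_model_output(bank_output_type, program):
    return [(bank_output_type, program)]  # e.g. ("note", 42) sends note 42

# Current design: each program carries its own explicit list of output
# events, so one program can send a keyswitch note plus a CC in one go.
PROGRAM_OUTPUTS = {  # illustrative fragment
    42: [("note", 24), ("cc", 58, 42)],
}

def new_model_output(program):
    return PROGRAM_OUTPUTS.get(program, [])
```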
Quote:
Originally Posted by benigno
Still, as I said, it is one more layer and more work for the user, but let's not forget that these options are there to use.
Yes, of course I'm agonizing about exactly how the factory banks should work, but the user is free to do whatever suits their specific workflow and preferences. I like the idea of leveraging Web Remote, and I'm interested to see what users cook up there.
As someone who recently started using sample libraries, articulation-switching has been one of the biggest hurdles. The current method I use is quite cumbersome.