Cockos Incorporated Forums > REAPER Forums > REAPER Pre-Release Discussion

Old 02-26-2016, 07:12 AM   #1
schwa
Administrator
 
schwa's Avatar
 
Join Date: Mar 2007
Location: NY
Posts: 15,815
Default Notation editor and special MIDI handling

This thread is for discussing potential special MIDI handling features in the notation editor.

Please keep in mind that the notation editor is in early development and nothing too exotic will be implemented until the basics of notation are stable.

It's also most helpful to us if the discussion is focused on incremental steps rather than a giant wish list. The most useful thing to think about, from our point of view, is not the interface seen on the screen, or what dialog windows or file formats are used, but instead the specification of what types of data need to be linked to what.


For example:



1. Percussion notation, or any situation where the written notation differs from the desired MIDI output. We could potentially expand the existing MIDI note name interface to support mappings like this:

36 "Kick" 64
44 "Hat pedal" 62 "X"

Meaning, a MIDI note with pitch 36 will be displayed in the piano roll with the name "Kick", and displayed in the notation editor with pitch 64 (F4). A MIDI note with pitch 44 will be displayed in the piano roll with the name "Hat pedal" and displayed in the notation editor with pitch 62 (D4) and an "X" note head.

Is this reasonable?
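A minimal sketch (Python, purely illustrative; the mapping format above is only a proposal, not an actual REAPER file format) of how such lines could be parsed:

```python
import shlex

def parse_note_map(text):
    """Parse proposed note-name mapping lines of the form:
    <midi_pitch> "<name>" <display_pitch> ["<notehead>"]
    Returns {midi_pitch: (name, display_pitch, notehead_or_None)}."""
    mapping = {}
    for line in text.strip().splitlines():
        fields = shlex.split(line)  # honours the quoted name/notehead
        pitch = int(fields[0])
        name = fields[1]
        display = int(fields[2])
        notehead = fields[3] if len(fields) > 3 else None
        mapping[pitch] = (name, display, notehead)
    return mapping

m = parse_note_map('36 "Kick" 64\n44 "Hat pedal" 62 "X"')
```

So pitch 36 carries the piano-roll name "Kick" and notation pitch 64, and pitch 44 additionally carries an "X" notehead.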



2. Linking articulation and dynamics to MIDI messages. For example, a staccato marking triggering a key switch, or a crescendo marking triggering CC volume messages.

We could potentially add a new interface to support mappings like this:

FFF B0 07 7F
Staccato 90 00 7F

Meaning, FFF articulation is linked to CC volume 127. Staccato articulation is linked to the note C0 with volume 127. (This is written in raw MIDI but that does not mean the user interface will be.)

Is this reasonable?
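The articulation lines could be parsed the same way, treating everything after the name as raw hex MIDI bytes (again just a sketch of the proposed format, not actual REAPER code):

```python
def parse_articulation_map(text):
    """Parse proposed articulation mapping lines of the form:
    <articulation_name> <hex MIDI bytes...>
    e.g. 'FFF B0 07 7F' -> CC7 (volume) = 127 on channel 1."""
    mapping = {}
    for line in text.strip().splitlines():
        name, *rest = line.split()
        mapping[name] = bytes(int(b, 16) for b in rest)
    return mapping

arts = parse_articulation_map("FFF B0 07 7F\nStaccato 90 00 7F")
```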
schwa is offline   Reply With Quote
Old 02-26-2016, 07:49 AM   #2
reddiesel41264
Human being with feelings
 
reddiesel41264's Avatar
 
Join Date: Jan 2012
Location: North East UK
Posts: 493
Default

With regard to mapping score markings to playback elements, I don't think this is something that should be fixed, because it varies so much between different sample libraries. I think it should be customisable per track and something that can be saved with track presets or loaded from an external file. It may be a good idea to leave the interpretation of such markings to a plugin that could access the notation API, or to provide some kind of dedicated playback editor.

Having some defaults for playback though would be good, like CC7/CC11 for dynamics and reducing the length of staccato notes.
__________________
http://librewave.com - Freedom respecting instruments and effects
http://xtant-audio.com/ - Purveyor of fine sample libraries (and Kontakt scripting tutorials)
reddiesel41264 is offline   Reply With Quote
Old 02-26-2016, 07:54 AM   #3
semiquaver
Human being with feelings
 
Join Date: Jun 2008
Posts: 4,923
Default

yes reasonable.

in time:

some techniques are done as text above the staff - "pizz" "sul pont"
tremolo is indicated by slashes on stems.
these can be combined, i.e. "tremolo sul pont" would have the text "sul pont" and slashes on the stems.
semiquaver is offline   Reply With Quote
Old 02-26-2016, 07:56 AM   #4
semiquaver
Human being with feelings
 
Join Date: Jun 2008
Posts: 4,923
Default

thought: it would be awesome to hover over the indication to see/edit the key switch...
semiquaver is offline   Reply With Quote
Old 02-26-2016, 08:05 AM   #5
EvilDragon
Human being with feelings
 
EvilDragon's Avatar
 
Join Date: Jun 2009
Location: Croatia
Posts: 24,798
Default

Quote:
Originally Posted by schwa View Post
1. Percussion notation, or any situation where the written notation differs from the desired MIDI output. We could potentially expand the existing MIDI note name interface to support mappings like this:

36 "Kick" 64
44 "Hat pedal" 62 "X"

Meaning, a MIDI note with pitch 36 will be displayed in the piano roll with the name "Kick", and displayed in the notation editor with pitch 64 (F4). A MIDI note with pitch 44 will be displayed in the piano roll with the name "Hat pedal" and displayed in the notation editor with pitch 62 (D4) and an "X" note head.

Is this reasonable?
HELL YES!!!


A good idea here, if expanding the note name file type is in order, is to allow us to REMAP the pitches however we see fit, so that we don't have a linear chromatic piano roll; notes would be reordered/remapped visually according to the note name file, in piano roll view (not notation!). This would help very much for drum maps that are all over the place: you could list all snare articulations in one group of notes, then all hi-hat articulations in another group, and so on.

Is that reasonable, too?

Last edited by EvilDragon; 02-26-2016 at 08:11 AM.
EvilDragon is online now   Reply With Quote
Old 02-26-2016, 08:20 AM   #6
BobF
Human being with feelings
 
BobF's Avatar
 
Join Date: Apr 2013
Posts: 699
Default

I hope this isn't a bad time/place to ask this, or if it has already been asked & answered elsewhere.

What is the goal of the notation implementation? Is it to accurately display what has been created/imported by other means?

Is it meant to be a complete edit/display/print solution, with no need for editing in piano roll to get from idea to full orchestration?

Something in between?

Just curious and the answer(s) might guide the input/feedback you receive.
__________________
Reaper/Studio One Pro/Win10Pro x64
i7-6700@3.8Ghz/32G/43" 4K/UMC1820
Event PS8/KKS61MK2/Maschine MK3/K12U
BobF is offline   Reply With Quote
Old 02-26-2016, 08:41 AM   #7
Isaction
Human being with feelings
 
Isaction's Avatar
 
Join Date: Dec 2013
Location: MN
Posts: 4
Default Yes, very reasonable.

Yes, very reasonable.

Finale 2009 had percussion midi maps that prioritized very much the same things. I think I attached an image.

A couple things that will help the notation of percussion in REAPER:

1. Notehead character choice for both closed noteheads (quarter note and shorter) and open noteheads (half note and longer).

2. Choosing which voice notes appear in by default. The main example would be playing in a drumset part and having the kick drum and snare drum default to the bottom voice.

I'm a percussionist, I write and teach music for kids, I worked for MakeMusic for a few years, and I use REAPER.
Attached Images
File Type: gif Percussion Map Designer.GIF (19.1 KB, 717 views)
Isaction is offline   Reply With Quote
Old 02-26-2016, 08:53 AM   #8
EvilDragon
Human being with feelings
 
EvilDragon's Avatar
 
Join Date: Jun 2009
Location: Croatia
Posts: 24,798
Default

Quote:
Originally Posted by Isaction View Post
I'm a percussionist, I write and teach music for kids, I worked for MakeMusic for a few years, and I use REAPER.
Welcome!
EvilDragon is online now   Reply With Quote
Old 02-26-2016, 09:12 AM   #9
Isaction
Human being with feelings
 
Isaction's Avatar
 
Join Date: Dec 2013
Location: MN
Posts: 4
Default Thanks!

Thanks! But...I may or may not have lurked on most pre-release threads for the past couple years.
Isaction is offline   Reply With Quote
Old 02-26-2016, 09:17 AM   #10
Jae.Thomas
Human being with feelings
 
Join Date: Jun 2006
Posts: 22,572
Default

Schwa, I just want to thank you for taking the time to ask for users' input. This isn't uncommon here, but I know how much work this might be.

I think the ideas are spot on and as long as we can come up with a good default setting and then edit them to our needs we are good to go!
Jae.Thomas is offline   Reply With Quote
Old 02-26-2016, 09:14 AM   #11
swiiscompos
Human being with feelings
 
swiiscompos's Avatar
 
Join Date: Mar 2011
Location: London
Posts: 1,211
Default

Quote:
Originally Posted by schwa View Post
2. Linking articulation and dynamics to MIDI messages. For example, a staccato marking triggering a key switch, or a crescendo marking triggering CC volume messages.

We could potentially add a new interface to support mappings like this:

FFF B0 07 7F
Staccato 90 00 7F

Meaning, FFF articulation is linked to CC volume 127. Staccato articulation is linked to the note C0 with volume 127. (This is written in raw MIDI but that does not mean the user interface will be.)
That would be beautiful. A few points, however, need to be taken into consideration. While that would be very nice for dynamics, most VSTis use different patches for different articulations. There are three main ways to switch between them, depending on the instrument: MIDI channels, keyswitches, and velocity ranges.

Having a way to link articulations to these would give us an extremely powerful tool. Then we would simply need an "articulation lane" in the piano roll editor, and we would have a tool as powerful as the Expression Maps in Cubase. Being able to assign articulations in a visual and consistent way, without having to remember all the keyswitches and channels, is a huge workflow improvement.

One other thing to take into consideration: some of these indications apply to all the notes that follow them until a new indication overrides them, while others apply only to the note they are attached to.
swiiscompos is offline   Reply With Quote
Old 02-26-2016, 09:57 AM   #12
Soli Deo Gloria
Human being with feelings
 
Soli Deo Gloria's Avatar
 
Join Date: Oct 2013
Location: Argentina
Posts: 1,303
Default

Thanks for all, Schwa!! This initial development is already extremely promising!

Quote:
Originally Posted by swiiscompos View Post
There are three main ways to switch between them depending on the instrument: MIDI channels, keyswitches, and velocity range.
And, in some cases, CC values can also change articulations, e.g. CC16 value 10 is assigned to legato, value 20 to staccato, etc. (CineSamples, some Chris Hein libraries & Prominy, to name a few).

Allow me also to point out the usefulness of linking slurs to legato overlaps (apologies if this has been discussed elsewhere). I don't know if it is fair to quote a couple of features from Sibelius here, but one could even define the overlap range for slurred notes and the length for unslurred ones (in both cases, as percentages of the written values).
Soli Deo Gloria is offline   Reply With Quote
Old 02-26-2016, 10:00 AM   #13
ivansc
Human being with feelings
 
Join Date: Aug 2007
Location: Near Cambridge UK and Near Questembert, France
Posts: 22,754
Default

I have largely kept out of the discussions on this, but I would like to say a huge thank you to Schwa for the wise and sensitive way in which he is approaching the whole notation editor.
Excellent job,

(but I would still love to have the hybrid stave option some time )
__________________
Ici on parles Franglais
ivansc is online now   Reply With Quote
Old 02-26-2016, 12:26 PM   #14
pcartwright
Human being with feelings
 
Join Date: Jan 2009
Posts: 1,030
Default

As others have said, different libraries have different methods of triggering articulations, dynamics, etc.

Using industry standards (for both notation and midi) is a good place to start. I'll also suggest using a single source for notation standards (Behind Bars by Gould has been mentioned elsewhere and is excellent).

A few specifics:
1. Percussion mapping to General MIDI is a good default but should be editable.
2. Dynamics would be best mapped to CC 11 (expression) and maybe velocity, but again should be adjustable/optional (already easy to map CC with JSFX). Consideration should be given to cases where MIDI data already exists on CC 11.
3. Legato/slurs should have some impact, but this may be the most widely varied implementation. EWQL libraries have separate patches (MIDI channel change), Garritan uses CC 64, others use keyswitches, the list goes on. However, I would think that in all cases the prior note should end at or after the start of the next note in a slurred phrase (unless the next note is the same pitch).
4. Piano pedal markings should trigger CC 64 by default (again, should be customizable).
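
The slur rule in point 3 (prior note ends at or after the next note begins, unless the pitch repeats) can be sketched like this; the overlap amount and tick values are made up for illustration:

```python
def apply_slur_overlap(notes, overlap=10):
    """notes: list of (start_tick, length_ticks, pitch), sorted by start.
    Extend each note under a slur so it ends slightly after the next
    note begins, unless the next note repeats the same pitch."""
    out = []
    for i, (start, length, pitch) in enumerate(notes):
        if i + 1 < len(notes):
            next_start, _, next_pitch = notes[i + 1]
            if next_pitch != pitch:
                length = max(length, next_start - start + overlap)
        out.append((start, length, pitch))
    return out

phrase = [(0, 90, 60), (100, 90, 62), (200, 90, 62)]
legato = apply_slur_overlap(phrase)
```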

I'll have more thoughts on this later.

Last edited by pcartwright; 02-26-2016 at 12:34 PM.
pcartwright is offline   Reply With Quote
Old 02-26-2016, 12:49 PM   #15
pcartwright
Human being with feelings
 
Join Date: Jan 2009
Posts: 1,030
Default

This is probably a little off topic, so I apologize in advance. The way I see it, there are essentially four layers to notation-based MIDI.

1. Notes and rests (done)
2. Dynamics
3. Per note articulations (such as staccato where the articulation or technique only applies to that note)
4. Multi note articulations (such as sordino where the articulation or technique applies until turned off by some other instruction)

That being said, layers 2 and 4 should chase the most recent instruction regardless of whether Reaper actually played the command. That is, if a user is in one area of the project where the strings are muted and jumps to a section with unmuted strings, playback should reflect unmuted even though the play cursor never crossed the actual instruction.
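
That chasing behaviour could amount to nothing more than a sorted lookup; a hypothetical sketch:

```python
import bisect

def chase(events, seek_time):
    """events: list of (time, instruction) sorted by time.
    Return the instruction in effect at seek_time, i.e. the most
    recent one at or before it, or None if none precedes it."""
    times = [t for t, _ in events]
    i = bisect.bisect_right(times, seek_time)
    return events[i - 1][1] if i else None

mutes = [(0.0, "con sordino"), (32.0, "senza sordino")]
```

Jumping to bar positions after 32.0 would then play "senza sordino" even if playback never crossed that marking.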

Again, off topic. Carry on with specifics and incremental ideas.
pcartwright is offline   Reply With Quote
Old 02-26-2016, 04:25 PM   #16
planetnine
Human being with feelings
 
planetnine's Avatar
 
Join Date: Oct 2007
Location: Lincoln, UK
Posts: 7,942
Default

Quote:
Originally Posted by planetnine View Post
Following on from what ED has said, I think we need to start a percussion-notation guide thread or document here in the forum, as percussion notation is different to melodic notation. We could collect sensible ideas and information there that will help Schwa write a decent percussion mapping editor. There seem to be plenty of notation users here to home in on a manual of "good practice" and notation explanation.

Ultimately this guide and some percussion mapping presets should help any user display their drum/percussion MIDI as dots. Maybe the guide can be accessed through the Action list or help menu, and/or downloaded as PDF or similar at a later, more mature stage in the notation editor's development.

Essentially, the mapping should consist of note-to-line and note-head rows, but some (e.g. hi-hat trigger notes) need to consider CC values to determine the notes' dot status. I'm not enough of an expert here to contribute on other "notation punctuation" and how it might relate to percussion MIDI, but I'm sure other users will add to the picture.


Edit: I hadn't considered what MM&U has added, with drum rudiments and their shorthand symbols. Perhaps the guide is even more of a good idea, as many users who use normal notation would not be so familiar with the conventions for drums. That wiki page is certainly a good start!

Maybe with hi-hats, some input is needed from CC04? i.e.:
18 "HH trigger" {CC04>96} 79 "XO" --open hat
18 "HH trigger" {96>CC04>64} 79 "XH" --half-open hat
18 "HH trigger" {CC04<64} 79 "X" --closed hat
42 "Closed Hi Hat" 79 "X"
44 "Pedal Close" 41 "X"
46 "Open Hi Hat" 79 "XO"
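
As a sketch of how such conditional rows might resolve (Python; the 64/96 thresholds are the illustrative ones from the lines above, with the ambiguous boundary values assigned arbitrarily):

```python
def hat_notehead(cc04):
    """Resolve the displayed notehead for the hi-hat trigger note from
    the current CC#4 pedal value; thresholds follow the example rows."""
    if cc04 > 96:
        return "XO"  # open hat
    elif cc04 > 64:
        return "XH"  # half-open hat
    return "X"       # closed hat
```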


This is the only complication I can think of immediately for the percussion-notation MIDI and note-head mapping; many e-kits, and all drum software worth using, support trigger+CC04 translation.

Also, cowbells, triangles, bell articulation, rimshots, ghost notes, etc will need to be detected and symbols added (triangles, circles, brackets, etc).

Are we thinking of a text-entry mapping format here? Could there be a more user-friendly front-end to setting this up?




Edit: Looks like those more knowledgeable than me have taken this up. If Schwa can take all this on-board, it's going to be a fantastic tool




>
__________________
Nathan, Lincoln, UK. | Item Marker Tool. (happily retired) | Source Time Position Tool. | CD Track Marker Tool. | Timer Recording Tool. | dB marks on MCP faders FR.

Last edited by planetnine; 02-26-2016 at 11:00 PM.
planetnine is offline   Reply With Quote
Old 02-27-2016, 01:29 AM   #17
EvilDragon
Human being with feelings
 
EvilDragon's Avatar
 
Join Date: Jun 2009
Location: Croatia
Posts: 24,798
Default

Also, cymbal chokes are usually activated by one of three methods:

* another note (Addictive Drums)
* channel aftertouch (Toontrack)
* poly aftertouch (Roland V-Drums and TD modules)
EvilDragon is online now   Reply With Quote
Old 02-24-2017, 11:56 PM   #18
Mr. PC
Human being with feelings
 
Mr. PC's Avatar
 
Join Date: Apr 2010
Location: Cloud 37
Posts: 1,071
Default

Quote:
Originally Posted by Soli Deo Gloria View Post
Thanks for all, Schwa!! This initial development is already extremely promising!



And, in some cases, CC values can also change articulations, i.e. : CC16 value 10 is assigned to Legato, value 20 to staccatto, etc... (CineSamples, some Chris Hein libraries & Prominy, to name a few)

Allow me also to remind about the usefulness of linking slurs to legato overlaps - and sorry, please, if this has been discussed elsewhere -. I don´t know if it is fair to quote a couple of features from Sibelius here, but one could even define the overlap range for slurred notes and the length for unslurred ones (in both cases, with percentages of written values)
I've actually included this in my Sibelius Action list

http://forum.cockos.com/showthread.php?t=177142

Press L to make selected notes legato, and at the same time add a slight overlap (first it glues selected notes, then it lengthens them slightly).
__________________
AlbertMcKay.com
SoundCloud BandCamp
ReaNote Hotkeys to make Reaper notation easy/fast
Mr. PC is offline   Reply With Quote
Old 02-27-2016, 03:36 PM   #19
peter5992
Human being with feelings
 
peter5992's Avatar
 
Join Date: Mar 2008
Location: Oakland, CA
Posts: 10,480
Default

Quote:
Originally Posted by swiiscompos View Post
That would be beautiful. A few points however need to be taken into consideration. While that would be very nice for dynamics, most VSTi used different patches for different articulations. There are three main ways to switch between them depending on the instrument: MIDI channels, keyswitches, and velocity range.

Having a way to link the articulations to this would give us a extremely powerful tool. And then we would simply need an "articulation lane" in the piano roll editor and we would have a tool as powerful as the Expression Maps in Cubase. Being able to assign articulations in a visual and consistent way without having to remember all the keyswitches and channels is a huge workflow improvement.

One other thing to take into consideration: some of this indications are applied to all the notes that follow them until a new indication override it, while some others apply only to the not they are attached to.
Yes, that is an excellent point. Pretty much every VST is programmed differently, and even within one and the same VST, the midi messages to e.g. create a crescendo can be different.

One random example: Hollywood Strings Diamond by EastWest, the powerful 1st or 2nd strings patches ("powerful" means you have to have a powerful computer or they won't play back decently). Crescendo is controlled by the MIDI expression pedal, and you can simultaneously control vibrato with the mod wheel. Using both in combination gives you the ability to create that beautiful lush "Hollywood" swell that is typically only possible with live players.

Beautiful! But here's the trick: if you use these powerful 1st strings patches for say 1st violins, but another patch for second violins where the dynamics are controlled through e.g. velocity, then a crescendo hairpin needs to be "translated" differently for 1st and 2nd violins.

So you see, it gets tricky because there are so many different VSTs out there ...
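
A sketch of that per-track translation idea (hypothetical event tuples, not any real API): the same hairpin becomes a CC11 ramp for one patch and velocity scaling for another.

```python
def render_hairpin(start_val, end_val, num_events, mode):
    """Render one crescendo hairpin for a given track's dynamics mode.
    mode "cc11": emit a ramp of CC#11 values (expression-pedal patches).
    mode "velocity": return per-note scaling factors instead (patches
    whose dynamics respond to velocity)."""
    span = end_val - start_val
    steps = [start_val + span * i // (num_events - 1) for i in range(num_events)]
    if mode == "cc11":
        return [("cc", 11, v) for v in steps]
    return [("vel_scale", v / 127) for v in steps]

first_vlns = render_hairpin(40, 100, 4, "cc11")     # expression-pedal patch
second_vlns = render_hairpin(40, 100, 4, "velocity")  # velocity-driven patch
```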

Sibelius has "solved" this in a very intricate way: they have mapped all possible instruments into a huge mind-map project and programmed Sibelius such that if you set up a score or add an instrument, Sibelius automatically picks the right instrument for you, and any dynamics and techniques are automatically translated into the appropriate playback.

Very clever, but ... it's also devilishly complicated if you try to use your own instruments rather than the built-in Sibelius sounds. I won't go into details, but the bottom line is that you need "soundsets" to translate the Sibelius MIDI messages into something the external VSTs can understand and play back properly.

I've spent years trying to understand how exactly that works, and at some point I just gave up ... fortunately someone did not give up: Jonathan Loving spent a huge amount of time creating soundsets for Sibelius and third-party VSTs, which he sells on his website (the Soundset Project).

http://www.soundsetproject.com/

Even with the soundsets, installing them and setting up a score using your own VSTs is very far from intuitive ... which is why these days I prefer to use another third-party VST, "NotePerformer", which is very cleverly programmed and has a light footprint.

The problem is that, at the end of the day, because Sibelius offers so few ways to manipulate the MIDI and audio, it's still going to sound like computer MIDI playback ... even with the most expensive EastWest Hollywood VSTs. But with Reaper this could be different! Reaper allows infinite control over audio and MIDI, so you can tweak it to your heart's delight without having to go back and forth between a notation program (be it Finale, or Notion, or whatever) and Reaper.

Sibelius was primarily designed as a notation program for live players, not for playback. This "playback versus notation" divide has been a contentious issue for years on the Sibelius forum.

All this aside, what you guys (= Cockos developers) might want to do is talk to Jon Loving and see if he has any ideas on this. He's a very smart guy, and nice too; he has helped me a lot in the past (I was actually one of the people who encouraged him to start the whole soundset project several years ago). I'll see if I can get hold of him.
peter5992 is offline   Reply With Quote
Old 02-28-2016, 01:46 AM   #20
EvilDragon
Human being with feelings
 
EvilDragon's Avatar
 
Join Date: Jun 2009
Location: Croatia
Posts: 24,798
Default

I have a hunch that this thread is going to end up (eventually) with Expression Maps implemented... In due time, in due time. Not for 5.20.
EvilDragon is online now   Reply With Quote
Old 02-28-2016, 02:52 AM   #21
planetnine
Human being with feelings
 
planetnine's Avatar
 
Join Date: Oct 2007
Location: Lincoln, UK
Posts: 7,942
Default

Briefly, ED, what are "expression maps", please? What do they map to? The searches I did didn't mention notation much, really.



>
__________________
Nathan, Lincoln, UK. | Item Marker Tool. (happily retired) | Source Time Position Tool. | CD Track Marker Tool. | Timer Recording Tool. | dB marks on MCP faders FR.
planetnine is offline   Reply With Quote
Old 02-28-2016, 05:16 AM   #22
Neon_Knight
Human being with feelings
 
Join Date: Aug 2010
Posts: 35
Default

Quote:
Originally Posted by planetnine View Post
Briefly, ED, what are "expression maps", please? What do they map to? The searches I did didn't mention notation much, really.



>
Expression maps are a Cubase feature that allows you to assign notation symbols to keyswitches/channel changes/CC changes/MIDI changes. The possibilities are very powerful; the implementation, not so much. My biggest gripe with expression maps is that you can't map everything under a slur to a single expression (i.e., legato); instead you must have a little slur symbol over every note that you want to play legato...

It also has a piano-roll editing mode, which shows you which "expression" is currently being applied with a bunch of horizontal bars underneath the piano roll.

Steinberg had the right idea but it wasn't anywhere near good enough to make me want to switch and I have a feeling the (eventual) Cockos implementation will be substantially more powerful (and musical). I really really (really) love where this thread seems to be going.
Neon_Knight is offline   Reply With Quote
Old 02-29-2016, 07:37 AM   #23
ceanganb
Human being with feelings
 
Join Date: May 2009
Location: Brazil
Posts: 323
Default

Quote:
Originally Posted by EvilDragon View Post
I have a hunch that this thread is going to end up (eventually) with Expression Maps implemented... In due time, in due time. Not for 5.20.
Sorry for spending a post here, but that would be so amazing.
__________________
Ceanganb
ceanganb is offline   Reply With Quote
Old 02-29-2016, 08:21 AM   #24
memyselfandus
Human being with feelings
 
memyselfandus's Avatar
 
Join Date: Oct 2008
Posts: 1,598
Default

Yep! Would be insane
memyselfandus is offline   Reply With Quote
Old 02-27-2016, 01:54 PM   #25
memyselfandus
Human being with feelings
 
memyselfandus's Avatar
 
Join Date: Oct 2008
Posts: 1,598
Default

Quote:
Originally Posted by EvilDragon View Post
You don't understand. That's support for microtonal scales - but this is still tied to the regular MIDI keyboard. In other words, once you load a tuning file into such a plugin, middle C (MIDI note 60) will always be one specified microtonal pitch (for example). There would be no way for Reaper's MIDI editor to discern between a regular middle C, and a middle C a quarter-tone off, IOW you couldn't send a middle C and middle C quarter-tone off to the plugin by using just MIDI note 60 and some pitch bend/sysex data...


Let schwa first deal with bog-standard notation. Nothing exotic is needed for now.
Quote: "Almost all synths can be retuned polyphonically even if they aren't intended to be retuned. Vst vs. vst3 doesn't really matter. All that's required for a softsynth is that it respond to midi pitch bend messages without delay (i.e. without "scooping"). For "hardsynths", i.e. workstations, it must also be multitimbral, and you must also be able to turn off local control. Other than the Nords, which are mostly bi-timbral, you can retune all but the cheapest keyboards (the kind with built-in speakers and a slot for batteries).

As for the DAW, the main requirement is that it not overwrite the midi channel information like Ableton Live does. So most DAWs work well. Reaper works great.

The usual method is to use multi-channel pitch bends:
http://forum.cockos.com/showpost.php...8&postcount=17

There are easier methods, but they only work for certain synths: Kontakt, PianoTeq, the xen-arts synths, high-end Rolands, and perhaps others."
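
The multi-channel pitch-bend method described in that quote can be sketched as follows (one channel per detuned note, bend centred at the standard MIDI value 8192; bend_range must match the synth's pitch-bend range setting):

```python
def retune(notes, bend_range=2.0):
    """Spread microtonally-detuned notes across MIDI channels, one pitch
    bend per channel. notes: list of (midi_note, cents_offset).
    bend_range: synth's pitch-bend range in semitones.
    Returns (channel, note, bend_value) tuples, bend centred at 8192."""
    out = []
    for ch, (note, cents) in enumerate(notes):
        bend = 8192 + round(cents / 100 / bend_range * 8192)
        out.append((ch, note, bend))
    return out

# e.g. a just-intonation major third (-14 cents) and fifth (+2 cents):
events = retune([(60, 0), (64, -14), (67, 2)])
```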

I agree. Let Schwa focus on the standard notation. Especially right now.

At some point down the road, if we could load our own fonts and so on, all of this sort of stuff could be done by users as add-ons.
memyselfandus is offline   Reply With Quote
Old 02-29-2016, 05:19 AM   #26
IXix
Human being with feelings
 
Join Date: Jan 2007
Location: mcr:uk
Posts: 3,891
Default

Quote:
Originally Posted by schwa View Post
1. Percussion notation, or any situation where the written notation differs from the desired MIDI output. We could potentially expand the existing MIDI note name interface to support mappings like this:

36 "Kick" 64
44 "Hat pedal" 62 "X"

Meaning, a MIDI note with pitch 36 will be displayed in the piano roll with the name "Kick", and displayed in the notation editor with pitch 64 (F4). A MIDI note with pitch 44 will be displayed in the piano roll with the name "Hat pedal" and displayed in the notation editor with pitch 62 (D4) and an "X" note head.

Is this reasonable?
Yes, that would be great BUT there are more note head types than just X so please consider that. Off the top of my head there are triangles, squares and crossed (X) circles.
IXix is offline   Reply With Quote
Old 02-29-2016, 05:42 AM   #27
EvilDragon
Human being with feelings
 
EvilDragon's Avatar
 
Join Date: Jun 2009
Location: Croatia
Posts: 24,798
Default

There are even more head types than that.
EvilDragon is online now   Reply With Quote
Old 03-02-2016, 02:59 PM   #28
paaltio
Human being with feelings
 
Join Date: Aug 2011
Location: Los Angeles, CA
Posts: 311
Default

Quote:
Originally Posted by schwa View Post
2. Linking articulation and dynamics to MIDI messages. For example, a staccato marking triggering a key switch, or a crescendo marking triggering CC volume messages.
This would be awesome!

Two things I should mention that weren't included in the example:

1) MIDI channel mapping would be great for non-keyswitching patches where you have to have articulations on different MIDI channels (e.g. being able to specify something like "staccato = change the note's MIDI channel to 3")

2) Supporting at least two different CC messages per articulation would be important, to support Vienna articulation matrices. Maybe it can just send all the messages if you specify multiple ones for the same articulation?
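
Both requests could reduce to letting one articulation entry carry a channel override plus an arbitrary list of messages; a hypothetical sketch (the table entries are made-up examples, not any library's real keyswitch scheme):

```python
# Hypothetical articulation table: each entry lists every action to fire,
# so one articulation can send several CCs (e.g. a Vienna-style matrix
# needs two axes) and/or remap the note's MIDI channel.
ARTICULATIONS = {
    "staccato": {"channel": 3, "messages": []},
    "molto_vibrato": {"channel": None,
                      "messages": [("cc", 1, 110), ("cc", 2, 64)]},
}

def apply_articulation(name, note_channel):
    """Return (channel_to_use, messages_to_send) for a notated articulation."""
    spec = ARTICULATIONS[name]
    channel = spec["channel"] if spec["channel"] is not None else note_channel
    return channel, spec["messages"]
```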
paaltio is offline   Reply With Quote
Old 03-02-2016, 05:44 PM   #29
ijijn
Human being with feelings
 
ijijn's Avatar
 
Join Date: Apr 2012
Location: Christchurch, New Zealand
Posts: 482
Default

Quote:
Originally Posted by paaltio View Post
This would be awesome!

Two things I should mention that weren't included in the example:

1) MIDI channel mapping would be great for non-keyswitching patches where you have to have articulations on different MIDI channels (e.g. being able to specify something like "staccato = change the note's MIDI channel to 3")

2) Supporting at least two different CC messages per articulation would be important, to support Vienna articulation matrices. Maybe it can just send all the messages if you specify multiple ones for the same articulation?
Hey gang,

You can do a lot of similar stuff already via set-and-forget JSFX plugins. Case 2 is especially easy using VI Sculpt and VI Officer together and is exactly what they were designed for. Links are in my signature if you'd like to try.

VI Officer was originally inspired by VSL's matrices and is a 6x6 grid that creates an output stream from two input CCs. I typically use length and overlap, which are output options from VI Sculpt; this whole system could easily be adapted to take advantage of any notated articulation hooks should they arise. This new output can take the form of (latching or non-latching) keyswitches, a CC with varying values, or a range of CCs producing proportionally-variable values. These behaviours can be filtered by channel, so by using multiple instances you can cover a range of different options, all on the same track. Another advantage of using JSFX over any other system I know of becomes evident when dealing with pre-recorded material: via PDC it knows what's coming up next, so can adapt perfectly to every situation. It's almost like having ARA or direct access to the editor data.
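
The two-CC grid idea can be sketched abstractly like this (my own illustration of the concept, not VI Officer's actual code):

```python
def matrix_cell(cc_x, cc_y, size=6):
    """Map two controller values (0-127) onto a size x size articulation
    grid, in the spirit of a two-axis matrix. Returns (row, col)."""
    col = min(cc_x * size // 128, size - 1)
    row = min(cc_y * size // 128, size - 1)
    return row, col
```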

Speaking of case 1, this is also easily solved with simple translation and routing plugins, with the added benefit that you can still host other instruments on the same track that would otherwise use all of the remaining channels. It's a very modest task to implement this and once we have a model for how these notation events manifest, I'll be getting to work on hashing out a solution to share with y'all.

When it comes to programming virtual instruments, if you expect the central DAW itself to do everything you want perfectly then a) you're going to be disappointed that it doesn't quite cover all of your (often wacky) use cases, or b) you're going to be waiting an extremely long time, then refer to scenario a). Besides, the JSFX platform is a part of Reaper, so the way I see it, actually putting it to good use is a flexible, native solution to the problem and why Justin included it in the first place. And continuing to speak personally, this is the prime reason why I use Reaper exclusively for such tasks. It's a happy accident that it also fits my workflow style perfectly.

I understand the mindset though. I really do. I like to keep things clean, integrated and focused. I never play unofficial mods of games. I respect the canon. I don't like installing a string of messy dependencies just to get an application to run. But sometimes we need to feel our way a little outside our emotional comfort zones, especially when it concerns the inherent modularity required for architecting complex solutions to complex problems, and super-especially when it would enrich our creative lives. We can still tuck the wires away when we're done. Nobody else will know, and maybe you'll even begin to forget in time. Meanwhile we can incorporate more and more built-in features to streamline the process.

Schwa is doing such a fabulous job with this evolving notation editor and with just a little vision and planning from the wider community, we can take the ball from the slavering maw of our neutrally-vowelled sock puppet master and run with it. I'll certainly be putting in the hours.

Finally, there are revamps for both of these plugins in the works. There are a lot of sliders in VI Sculpt but please don't let that put you off. That's just to give you extra control when you really need it. I generally only need to change a handful of sliders to set up any given instrument. There's a GUI version coming soon anyway, which should help matters too.

If you give it a go and have troubles or any other feedback, I'm more than happy to have a chat about it. Good luck to you all.

Sorry if this comes off as rather long-winded and rantish. I'm just (maybe a little too) passionate about this topic, as it's a major focus for me.

Let's get cooking!
ij
ijijn is offline   Reply With Quote
Old 03-02-2016, 11:37 PM   #30
paaltio
Human being with feelings
 
Join Date: Aug 2011
Location: Los Angeles, CA
Posts: 311
Default

Quote:
Originally Posted by ijijn View Post
You can do a lot of similar stuff already via set-and-forget JSFX plugins. Case 2 is especially easy using VI Sculpt and VI Officer together and is exactly what they were designed for. Links are in my signature if you'd like to try.
I'm already using my own https://stash.reaper.fm/v/18047/artic...apper_v0.1.zip

Built-in will always be better.
paaltio is offline   Reply With Quote
Old 03-03-2016, 04:51 AM   #31
ijijn
Human being with feelings
 
ijijn's Avatar
 
Join Date: Apr 2012
Location: Christchurch, New Zealand
Posts: 482
Default

Quote:
Originally Posted by paaltio View Post
Nice one! Kind of reminiscent of this old chestnut of mine, but with a file-reading twist.

Isn't that almost the opposite of what you wanted here, though?

Surely the new approach would look something like:
  1. You notate an articulation in Reaper's score view
  2. Reaper propagates this change, either as a change in length, some sort of "1:staccato" type message, or a combination of such things
  3. The instrument somehow responds appropriately, so there's some suitable translation going on somewhere, either within Reaper's midi pre-processor (for want of a better term) or via plugins
This is well within the realms of possibility now. Then, as an added bonus, you only need the one channel per instrument.

Quote:
Originally Posted by paaltio View Post
Built-in will always be better.
Hmm... maybe, whatever better means in this context, but built-in (in the strictest sense) will also always be a compromise on some level. Reaper is quite hackable via extensions in addition to the various scripting options, so isn't the appearance of "nativity" and overall quality/usability the most important thing?

I certainly agree that having integrated access to a wider variety of common tasks out of the box would be nice. Of course it would. But for dealing with the mind-boggling multitude of options out there, and especially in this early development phase, yada^3...

Unless I misinterpreted schwa's focus, which I thought was fairly clear, this thread was intended as a place to discuss data structures and basic approaches to core tasks in an elegant, future-proof and extensible way, rather than a bombardment of wishlist-style features that can be achieved right now with fairly little effort and a modicum of judicious "externalism". Further to that point, I would say that something I'm using right now to do an important job is "better" than something that doesn't exist yet. It's not only helpful for actually getting things done; having access to these features with very low stakes, in an open and flexible testing ground, can be extremely useful for development purposes before setting things in stone later on. That sort of thing can potentially benefit scripters, Reaper as a host, and the many fine sample library folks out there. My sincerest apologies to all if I've added more noise to the signal than I meant to.

Further thoughts on data structures

Schwa, regarding the topic of options for mapping data back and forth, I would simply encourage you to keep following your nose: aim for a sensible, strongly identifiable brokering layer that can be easily interpreted by internal and external processes alike, and, most importantly (I'm extremely eager to see this), provide some elementary per-channel options for both display and performance directions.

Also, and maybe this could provide some further inspiration, I've been toying with my own take on UACC as a logical hub for articulation switching, but using CC0 and CC32 together as a 14-bit value instead of a regular 7-bit CC32. That's probably not quite enough data for comprehensive coverage of everything you can do in music(!), but it's a start. Here are some of my thoughts behind the process...

Many common articulations can be expressed simply and conveniently as bit flags. An instrument can typically be muted or not, or in the case of brass, for example, different mutes are potentially available. So, one interpretation of "mute-ness" could take up 2 bits: 0 = none, 1 = regular con sord. or straight mute, 2 = Harmon mute and 3 = miscellaneous mute du jour. Or you could simply model con/senza sord. using 1 bit, depending on how detailed you want to be with it, but going this route would no doubt cause jazz-oriented libraries to suffer.

Tremolo is typically another 1-bit setting: you're doing it or you're not. But then do we model measured tremolo (of different lengths) vs scrubbing, and maybe add some more bits? Or could we handle measured tremolo in a different way by using those shorter actual note values as secondary data? In which case, 1 bit for generic, messy and/or ethereal scrubbing would be fine, along with some other wizardry for note subdivisions.

Note length/articulation is another multi-bit setting, incorporating:
  • staccatissimo/staccato (probably mutually exclusive, therefore different values within a single bit group)
  • tenuto lines (which would need to work both with or without the staccato dots)
  • slurs (picking this up for notes that are part of a slurred group, and ideally also the number of slurs deep, to disambiguate between phrase markings and bowings*, for example, and to inform the interpretation of other articulation markings), as well as
  • accents of varying types, etc.
Then there are trills: the bit value could correspond to the interval played: 0 = no trill, 1 = minor 2nd, ... up to whatever you need. Violins routinely go up to perfect 4th trills in many libraries and you would need a value of 5 to model this, so maybe allow 3 bits for this, which gives some headroom up to a perfect 5th.
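To make the bit-layout idea concrete, here is a minimal Python sketch of one possible packing into the CC0/CC32 14-bit pair. The specific field layout and widths below are only an example, assumed for illustration rather than proposed as a standard:

```python
# Sketch: pack articulation flags into a 14-bit value sent as CC0 (MSB)
# plus CC32 (LSB). The field layout here is hypothetical.

FIELDS = {               # name: (bit offset, width)
    "mute":    (0, 2),   # 0=none, 1=con sord./straight, 2=Harmon, 3=other
    "tremolo": (2, 1),   # unmeasured tremolo on/off
    "length":  (3, 3),   # e.g. 0=normal, 1=staccatissimo, 2=staccato, ...
    "trill":   (6, 3),   # trill interval in semitones, 0 = no trill
}

def pack(**flags):
    """Combine named fields into a single 14-bit articulation value."""
    value = 0
    for name, flag in flags.items():
        offset, width = FIELDS[name]
        assert 0 <= flag < (1 << width), f"{name} out of range"
        value |= flag << offset
    return value

def unpack(value, name):
    """Extract one named field from a packed value."""
    offset, width = FIELDS[name]
    return (value >> offset) & ((1 << width) - 1)

def to_cc_pair(value):
    """Split a 14-bit value into (CC0 MSB, CC32 LSB) 7-bit bytes."""
    return (value >> 7) & 0x7F, value & 0x7F
```

For example, a Harmon mute with a perfect-4th trill packs to one value, splits into two 7-bit CC bytes on the way out, and unpacks losslessly at the receiving end.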

Even if the instruments themselves can't understand this format, and obviously at this point in time none of them would, such an approach could serve as an intermediate ground after which it could be translated in some way, internally or externally, as desired. Then when interfacing with actual instruments, we can simply build up a database of available articulations (I have a plugin that detects this information for UACC-enabled instruments and could be retasked for such a job) and then find a best fit solution for any given articulation scenario.

In conclusion, I would find either the text-based or multi-bit internal interpretation of the notational input extremely useful. One could easily be converted to the other in most cases. I do feel that in this unfolding drama, text items probably have the edge as the ultimate source in that they can be assigned arbitrarily and thus have an infinite number of possibilities, but perhaps a secondary conversion process to bit flags could be useful somewhere in the pipeline. In addition, text events have the further advantage that they can be ignored most easily, for instant back-compatibility or if you'd like to turn off all tweaking of data at the notation end.

Oh yes, having distinct text event types, or even the same internal type with different header information, would be another step forward. Distinguishing expression text (mf, dolce) from technique text (pizz.) from lyrics and so on could assist the layout engine in deciding where to put things and also give processes more context to choose what to do with the information. Lyrics would be amazing to have for integration with choir libraries. A plugin (yes, I know...) could easily read the lyric strings and set things accordingly: I have about half a dozen different word-enabled choral offerings and will be testing this out on them soon.

One more thing: noteheads, articulation markings and other note-event specific gems could be represented as accompanying messages sent quasi-simultaneously (just prior) or transmit themselves parasitically inside the host message (possibly as something like a midi_recv with offset, msg1, msg2, msg3 AND msgArgs[]). I suppose consistency and coverage are the most important considerations here, along with fitting nicely into existing paradigms.

Anyway, those are my latest ideas on the subject. As always, looking forward to the next tasty pre-release.

All the best,
ij

*or these could be declared specifically within the editor

Last edited by ijijn; 03-04-2016 at 04:45 PM. Reason: needed more padding
ijijn is offline   Reply With Quote
Old 03-09-2016, 03:43 PM   #32
hamish
Human being with feelings
 
hamish's Avatar
 
Join Date: Sep 2007
Location: The Reflection Free Zone
Posts: 3,026
Default

Given that the note map has been deferred until after 5.20, and given that we had a sneak peek with pre16, maybe now would be a good time to discuss any ideas it has raised.

P9 had this to say:

Quote:
Originally Posted by planetnine View Post
If you insert a note on the percussion stave on a position that two or more MIDI notes are mapped to, it would default to one of them, and vertical note movement with the mouse or numpad keys would scroll through the other mapped MIDI articulations (eg open/half/closed hihat). That's not rocket science to create from a mapping table.

It might get slightly more involved if the mapping source were a MIDI trigger articulation and a CC04 value were needed for the degree of HH "openness", but it's not insurmountable. The UX logic just needs to be thrashed out to make it workflow-friendly (the trigger is the HH note from electronic kits, which uses the plate hit and the pedal CC04 combined to determine the HH sound).
In addition to the snare and HH examples, there are quite a lot of kit libraries that have multi-note toms as well, i.e. L and R hands.

In this example we wouldn't need a different note head, and probably don't even need any sticking indication, but the notation editor will need to be directed to send the correct MIDI note number.

How could that work? I guess with a connected articulation sign (L.H. or R.H.) that is optionally visible.
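For what it's worth, the trigger-plus-CC04 logic planetnine describes is easy to prototype. Here's a rough Python sketch; the zone boundaries and the output note choices are made up, not taken from any particular kit library:

```python
# Hypothetical sketch: an e-kit hi-hat sends one plate-hit trigger note
# plus CC#4 (pedal position); the map picks the output note from CC#4.

HH_ZONES = [          # (CC4 upper bound, output MIDI note) -- invented values
    (31, 46),         # mostly open  -> open hi-hat
    (95, 44),         # in between   -> half-open (hypothetical note choice)
    (127, 42),        # pedal down   -> closed hi-hat
]

def map_hihat(cc4_value):
    """Choose the output note from the pedal 'openness' CC value."""
    for upper, note in HH_ZONES:
        if cc4_value <= upper:
            return note
    return HH_ZONES[-1][1]
```

The same zone-table shape would work for multi-note toms: the "CC" input just becomes an L.H./R.H. articulation flag instead of a pedal value.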

Last edited by hamish; 03-09-2016 at 03:51 PM.
hamish is offline   Reply With Quote
Old 04-07-2016, 10:22 AM   #33
parr
Human being with feelings
 
Join Date: Jan 2008
Posts: 200
Default

Going back to schwa's original question:

Quote:
Originally Posted by schwa View Post
This thread is for discussing potential special MIDI handling features in the notation editor.

Please keep in mind that the notation editor is in early development and nothing too exotic will be implemented until the basics of notation are stable.

It's also most helpful to us if the discussion is focused on incremental steps rather than a giant wish list. The most useful thing to think about from our point of view is the not the interface seen on the screen, or what dialog windows or file formats are used, but instead the specification of what types of data need to be linked to what.


For example:



1. Percussion notation, or any situation where the written notation differs from the desired MIDI output. We could potentially expand the existing MIDI note name interface to support mappings like this:

36 "Kick" 64
44 "Hat pedal" 62 "X"

Meaning, a MIDI note with pitch 36 will be displayed in the piano roll with the name "Kick", and displayed in the notation editor with pitch 64 (F4). A MIDI note with pitch 44 will be displayed in the piano roll with the name "Hat pedal" and displayed in the notation editor with pitch 62 (D4) and an "X" note head.

Is this reasonable?



2. Linking articulation and dynamics to MIDI messages. For example, a staccato marking triggering a key switch, or a crescendo marking triggering CC volume messages.

We could potentially add a new interface to support mappings like this:

FFF B0 07 7F
Staccato 90 00 7F

Meaning, FFF articulation is linked to CC volume 127. Staccato articulation is linked to the note C0 with volume 127. (This is written in raw MIDI but that does not mean the user interface will be.)

Is this reasonable?
I would split the mapping into two parts:

1. a generic map that uses only standard CCs (expression, velocity, pedal) and the duration/combination of notes. This will contain all the dynamics (crescendo, diminuendo, pp, ff) but also accents and the sustain pedal. The generic map will also contain other articulations related to the duration/combination of notes, like staccato, cymbal rolls, ...

2. an instrument-specific map for instruments with special articulations. This should allow CCs and key switches, and one should be able to load one map per channel and track. Say I prepare maps for the Embertone violin and the Auddict Solo Violin (both for Kontakt): then I should be able to load the Embertone map acting on channel 1 of track 1 and the Auddict map acting on channel 2 of track 1 as well.

In case of conflict the specific map should override the generic one, of course; i.e., if I have an instrument with staccato samples, those are the ones to be used, not just a shortened note.

Finally, the map should be fully customizable, with the possibility of adding new text labels.
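The override rule could be as simple as a two-level dictionary lookup. A toy Python sketch of the idea (all map contents here are invented placeholders):

```python
# Sketch of a two-layer articulation lookup: a specific (per-instrument)
# map overrides a generic fallback map. Contents are illustrative only.

generic_map = {
    "staccato":  {"shorten_note": True},       # no samples: just shorten
    "crescendo": {"cc": (11, "ramp_up")},      # ride CC11 upward
}

embertone_map = {                              # hypothetical instrument map
    "staccato": {"keyswitch": 24},             # real staccato samples exist
}

def resolve(articulation, specific_map):
    """Specific map wins; otherwise fall back to the generic map."""
    if articulation in specific_map:
        return specific_map[articulation]
    return generic_map.get(articulation)
```

With this shape, loading a different instrument map per channel is just a matter of keeping one `specific_map` per (track, channel) pair.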

Sorry if the post is too trivial.
Anyway, I will be very happy with this setup!

juan
parr is offline   Reply With Quote
Old 06-20-2016, 06:37 PM   #34
ceanganb
Human being with feelings
 
Join Date: May 2009
Location: Brazil
Posts: 323
Default

I wonder if schwa has halted this development, or whether anyone has come up with a custom solution before I try my own (I'm not skilled in scripts or JS at all).
__________________
Ceanganb
ceanganb is offline   Reply With Quote
Old 10-18-2016, 01:02 PM   #35
Vagalume
Human being with feelings
 
Join Date: Nov 2015
Posts: 607
Default

I just want to say something: "Expression Maps" is the only thing I miss from Cubase, the one and only.

In fact, sometimes (unfortunately) I have to use Cubase for this reason. It saves plenty of time when you work with lots of articulations. Besides, I can say that some of my orchestral composer friends just won't give Reaper a chance because of this. I hope it can be made possible in the future. Thanks for giving us the opportunity to talk about the matter.
Vagalume is offline   Reply With Quote
Old 10-18-2016, 01:26 PM   #36
reddiesel41264
Human being with feelings
 
reddiesel41264's Avatar
 
Join Date: Jan 2012
Location: North East UK
Posts: 493
Default

Quote:
Originally Posted by Vagalume View Post
I just to say something:"Expression Maps" is the only thing that I miss from Cubase, the one and only.

In fact sometimes (unfortunately) I have to use to cubase for this reason. It saves plenty of time when you work with lots of articulations. Besides I can say that some of orchestral composer friends just don't want to give Reaper an opportunity because of this. I hope it can be possible in the future. Thanks for giving us the opportunity to talk about the matter.
+1. Something like this would greatly improve my workflow with sample libraries, in both the notation editor and the piano roll.
__________________
http://librewave.com - Freedom respecting instruments and effects
http://xtant-audio.com/ - Purveyor of fine sample libraries (and Kontakt scripting tutorials)
reddiesel41264 is offline   Reply With Quote
Old 10-19-2016, 02:18 AM   #37
hamish
Human being with feelings
 
hamish's Avatar
 
Join Date: Sep 2007
Location: The Reflection Free Zone
Posts: 3,026
Default

I'm also watching the pre's for any work on this. Drum note mapping is badly needed here.
hamish is offline   Reply With Quote
Old 10-31-2016, 08:27 AM   #38
memyselfandus
Human being with feelings
 
memyselfandus's Avatar
 
Join Date: Oct 2008
Posts: 1,598
Default

If we could load custom fonts or something like that, we could do microtonal notation! Reaper would be the go-to software for these guys:

https://www.facebook.com/groups/xenh...ref=ts&fref=ts

It can already be done in the piano roll. Maybe some sort of user add-on, visually, for notation? Pitch bends and everything would be done by the plugins and/or synths. Think u-he Zebra, Alchemy, Chipsounds and tons of other plugins. Different tuning systems can be applied directly from the plugin. Reaper would just have the option to load alternate notation symbols or whatever.


Example

"Download Jacob Barton's amazing Sagibelius 2.0 scripts that let you use Sagittal symbols with the Sibelius music notation software. It wasn't easy, but he found a way. Updated to use the Sagittal 2.0 font mapping and to allow most of the Athenian subset. The zipfile includes a modified version of the font. The documentation and examples will be educational even if you're not using Sibelius."

http://sagittal.org


Example of microtonal notation




Here are some fonts

http://www.music-notation.info/en/fonts/Tempera.html


Some awesome Melodyne stuff
http://www.curtismacdonald.com/using...with-melodyne/
memyselfandus is offline   Reply With Quote
Old 11-17-2016, 01:37 AM   #39
drowo
Human being with feelings
 
Join Date: Jan 2016
Location: Germany
Posts: 26
Default

Quote:
Originally Posted by schwa View Post
This thread is for discussing potential special MIDI handling features in the notation editor.
...

1. Percussion notation,... support mappings like this:
...
2. Linking articulation and dynamics to MIDI messages.
....
Is this reasonable?
Yes this is reasonable, and desirable. I have seen a lot of people adding more sophisticated ideas which would all be nice but probably mean a lot of effort. However I would love to see this original proposal from schwa to be available as soon as possible ... even if I would have to hack a text file with raw midi messages instead of having a nice user interface :-)
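Hand-hacking that text file really would be trivial. A throwaway Python sketch of a parser for the format schwa proposed ("name" followed by raw MIDI bytes in hex); the parser details, including comment and blank-line handling, are my own assumptions:

```python
# Rough parser for schwa's proposed articulation-mapping text format,
# e.g. "FFF B0 07 7F" or "Staccato 90 00 7F" (hex MIDI bytes per line).

def parse_mapping(text):
    """Return {articulation_name: [status, data1, data2]} from the text."""
    mapping = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):   # skip blanks and comments
            continue
        name, *hex_bytes = line.split()
        mapping[name] = [int(b, 16) for b in hex_bytes]
    return mapping

example = """
FFF B0 07 7F
Staccato 90 00 7F
"""
```

So `FFF` resolves to CC7 (volume) at 127, and `Staccato` to a note-on for C0 at velocity 127, exactly as described in the original post.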
drowo is offline   Reply With Quote
Old 11-21-2016, 06:20 PM   #40
ijijn
Human being with feelings
 
ijijn's Avatar
 
Join Date: Apr 2012
Location: Christchurch, New Zealand
Posts: 482
Default

Quote:
Originally Posted by drowo View Post
Yes this is reasonable, and desirable. I have seen a lot of people adding more sophisticated ideas which would all be nice but probably mean a lot of effort. However I would love to see this original proposal from schwa to be available as soon as possible ... even if I would have to hack a text file with raw midi messages instead of having a nice user interface :-)
Have you tried VI Drum Mapper?
ijijn is offline   Reply With Quote