Go Back   Cockos Incorporated Forums > REAPER Forums > REAPER Pre-Release Discussion

Old 03-20-2016, 05:16 PM   #81
ijijn
Human being with feelings
 
ijijn's Avatar
 
Join Date: Apr 2012
Location: Christchurch, New Zealand
Posts: 441
Default

Quote:
Originally Posted by memyselfandus View Post
I love that you are here ijijn and I love reading your awesome posts great ideas!
Aww shucks, you've made my day!

The Reaper forum is definitely one of my favourite places to be. Always so many interesting discussions going on! It's good to be back.
ijijn is offline   Reply With Quote
Old 03-21-2016, 12:31 PM   #82
kerryg
Human being with feelings
 
Join Date: Mar 2007
Posts: 328
Default

I'm excited about the power these ideas could unleash, but I also want to inject a note of caution. The composers I've worked for would give the notation editor an initial five to ten minutes. If there's frustration at doing simple things they'll abandon it, and REAPER, and not take another look for years - that's just how it is (can you tell I've seen this before with other software?). So these extraordinary abilities - huge channelization options, for example - *must* remain transparent; invisible until invoked.

Controller manufacturers are taking internal rechannelization so much for granted (AFAICT Ableton doesn't even deal with internal channels, it forces all input to one channel) that they've even started to remove the MIDI out channel selection option from some prominent controllers.

So be careful - this stuff has to live "under the hood" by default so composers don't eg wind up having parts jump to different staves or voices and freak out and not know how to fix it.

One further bit of input: please avoid the use of braces or brackets to group staves by tracks; these symbols are reserved for the visual grouping of parts on a score. Their optimal use may - or may not - correspond to the optimal layout of MIDI tracks.

Aside from those caveats, it's getting pretty exciting in here - thanks for a lot of excellent and visionary input!

Last edited by kerryg; 03-21-2016 at 04:50 PM.
kerryg is offline   Reply With Quote
Old 03-21-2016, 06:09 PM   #83
hopi
Human being with feelings
 
hopi's Avatar
 
Join Date: Oct 2008
Location: Right Hear
Posts: 12,295
Default

though I don't know very much about it, my kids do use XML export and import for MIDI notation scores on a daily basis...

so of course I understand that notation is very new to REAPER and may have a long way to go... yet I'm wondering if there is a 'down the road' intention to import and export these MIDI XML files with REAPER?
__________________
...should be fixed for the next build... http://tinyurl.com/cr7o7yl
https://soundcloud.com/hopikiva/angel-rain
hopi is offline   Reply With Quote
Old 03-21-2016, 06:55 PM   #84
hamish
Human being with feelings
 
hamish's Avatar
 
Join Date: Sep 2007
Location: The Reflection Free Zone
Posts: 2,996
Default

It's been talked about. It is planned, but not for 5.20.
hamish is offline   Reply With Quote
Old 03-21-2016, 09:05 PM   #85
ijijn
Human being with feelings
 
ijijn's Avatar
 
Join Date: Apr 2012
Location: Christchurch, New Zealand
Posts: 441
Default

Quote:
Originally Posted by kerryg View Post
I'm excited about the power these ideas could unleash, but I also want to inject a note of caution. The composers I've worked for would give the notation editor an initial five to ten minutes. If there's frustration at doing simple things they'll abandon it, and REAPER, and not take another look for years - that's just how it is (can you tell I've seen this before with other software?). So these extraordinary abilities - huge channelization options, for example - *must* remain transparent; invisible until invoked.

Controller manufacturers are taking internal rechannelization so much for granted (AFAICT Ableton doesn't even deal with internal channels, it forces all input to one channel) that they've even started to remove the MIDI out channel selection option from some prominent controllers.

So be careful - this stuff has to live "under the hood" by default so composers don't eg wind up having parts jump to different staves or voices and freak out and not know how to fix it.

One further bit of input: please avoid the use of braces or brackets to group staves by tracks; these symbols are reserved for the visual grouping of parts on a score. Their optimal use may - or may not - correspond to the optimal layout of MIDI tracks.

Aside from those caveats, it's getting pretty exciting in here - thanks for a lot of excellent and visionary input!
Great stuff, and I really hear you on not scaring people off with an overwhelming monstrosity.

The user experience

I would like to think that a reasonable portion of my considerations have been about providing a transparent interface for these things. As you say, complexity and power under the hood (which Reaper has in spades and which continues to grow daily) should naturally be presented in a way that makes the most sense to the end user without confusion, limitation or intimidation, so that the display/interactive layer of the notation engine scales up and down to match the complexity of each notation scenario. With an adaptive hiding strategy (perhaps along the lines of the one outlined above) you would never actually see more than you need to.

As for parts leaping around, if you changed your selected line from channel 1 to channel 7, would you be surprised to see it move down in the display, especially if you had other music on staves for channels 2-6? I would probably feel a little apprehensive and under-informed in the feedback department if something like that didn't happen, but I'm very interested in other opinions. Perhaps some tweaking of the post-operation display to show the results as clearly as possible would be in order, but I don't see a major problem here.

Brackets and braces

In essence, I think we need a clear hierarchical relationship of some kind to allow for the various staff connection options, and we can draw some inspiration from other organisational structures such as track folders, which share some of the same considerations. Here are a few examples of staff display-related operations:
  • connect these two (or more) staves with a brace
  • connect these two (or more) with a bracket (perhaps by setting the bracketStart or bracketEnd on a given staff, which would automatically allow for flexible layering: e.g. the string section has its own bracket, and the divisi cellos, and/or violins 1 and 2 in the older style, have their own sub-bracket)
  • interpret this staff or these staves as one instrument, and give it a new name if you like
Sibelius has a fairly comprehensive custom bracketing/bracing scheme, as I imagine Finale does; pre-existing solutions are always a good starting point for planning features, as long as we keep an eye on the important differences in context.
ijijn is offline   Reply With Quote
Old 03-21-2016, 09:20 PM   #86
ijijn
Human being with feelings
 
ijijn's Avatar
 
Join Date: Apr 2012
Location: Christchurch, New Zealand
Posts: 441
Default

Or ... how's this for a different approach?

On the topic of underlying data structures, what's clearly important is that they adequately model the domain specifics of notation, with the added concern of integrating nicely into a DAW workflow. It's no mean feat, but I'm sure we can get there and in some aspects it's there already.

It's really a matter of building on a solid design: a loosely coupled relationship between voices, channels, tracks, staves and instruments to allow for flexible (re)interpretation and a reduction of assumptions regarding the exact nature of the given input. As kerryg has said, a track may not necessarily correspond to an instrument, and the grouping of staves together is not necessarily related to their track or channel configurations out of the box. There should be some room for creativity here for the maximum coverage of use cases.

So ... perhaps instead of a huge number of voices for sorting everything, which I've admittedly been banging on about a bit, we could describe everything simply and accurately using text events to distinguish between instruments, staves and voices. This would be the ultimate in separation of concerns, leaving the interpretation wide open. For example...

NOTE 3 81 instrument Flute staff 3 voice 2

where:
  • "Flute" is taken directly from the given name for this associated instrumental staff group (perhaps substituting something like '_' or '-' for any spaces),
  • staff 3 means the 4th staff (if we start with 0) on the score,
  • and voice is pretty much as we're used to seeing it: 1 for upper, 2 for lower, (0) for default (with 3, 4... included later in a DLC)
  • the channel and pitch are also as before, so no change is needed there
Access to the instrument name directly would be simply stunning for automatic contextual treatment of various instruments, as you could define specific rules based on string matches, internally or externally. It could also help to make sure the routing is correct. If all flutes need to go to bus 3, for instance, that's where these events go. My head is spinning at the possibilities.

The staff number is primarily related to the display order but could help with routing and so on if we chose to tap that resource. Clefs and non-global time/key signatures would all relate to a staff number. Such a number would likely be global (a per-project ID). Also, adding or incorporating new tracks/channels would automatically create another staff to host those events, which could later be combined with others if you so wished. This is the part I'm struggling with the most in this thought experiment. Any ideas? Is there a more elegant and extensible way?

Here's another way of expressing the same sort of information, but in a declarative start-of-play "setup" format:
STAF 15 instrument Trombone => staff 15 has the instrument name of "Trombone"
or
INST 15 1 Trombone => the instrument "Trombone" starts at staff 15 and is a total of 1 staff in size
STAF 15 clef bass => staff 15 has a bass clef, then later...
STAF 15 clef tenor => staff 15 has a tenor clef
STAF 15 type 5line => staff 15 is a 5-line staff

The bracketing/bracing would be a separate layer again, so there could be text markers for these too.

BRCE 12 13 => staves 12 and 13 are connected with a brace
BRKT 1 5 => staves 1-5 are connected with a bracket
BRKT 2 3 => staves 2-3 are connected, in this case via a sub-bracket within the larger 1-5 bracket group

It may be worth introducing another text element to express the depth in the hierarchy for clarity, so something like:
BRKT level 1 staff 1 5
BRKT level 2 staff 2 3


One handy thing about using text events exclusively for layout is that if you were to import the MIDI into another REAPER project then the staves would all be there for you. If it had to merge with existing MIDI then perhaps there could be some options: include as-is, or start from the first available staff number, as two likely candidates.

In any case, through a cleaner separation of the notions of instrument, staff, track, channel and voice we have the utmost independence in how we want to treat these parameters, as developers and end users.
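To make the idea concrete, here's a minimal sketch in Python of how such setup events might be parsed into a layout description. The event names (STAF, INST, BRCE, BRKT) and their fields are exactly the hypothetical format proposed above, not any actual REAPER syntax, and instrument names are assumed to contain no spaces (per the '_'/'-' substitution idea).

```python
# Sketch of a parser for the proposed setup text events (STAF/INST/BRCE/BRKT).
# All event names and fields are hypothetical -- they follow the format
# suggested in this post, not any real REAPER implementation.

def parse_setup_events(lines):
    """Build a staff-layout description from start-of-play text events."""
    staves = {}   # staff number -> {"instrument": ..., "clef": ..., "type": ...}
    groups = []   # ("brace" | "bracket", first_staff, last_staff)

    for line in lines:
        parts = line.split()
        if not parts:
            continue
        kind = parts[0]
        if kind == "STAF":
            # e.g. "STAF 15 clef bass" or "STAF 15 instrument Trombone"
            staves.setdefault(int(parts[1]), {})[parts[2]] = parts[3]
        elif kind == "INST":
            # "INST 15 1 Trombone": instrument starts at staff 15, spans 1 staff
            first, size, name = int(parts[1]), int(parts[2]), parts[3]
            for n in range(first, first + size):
                staves.setdefault(n, {})["instrument"] = name
        elif kind == "BRCE":
            groups.append(("brace", int(parts[1]), int(parts[2])))
        elif kind == "BRKT":
            groups.append(("bracket", int(parts[1]), int(parts[2])))
    return staves, groups

staves, groups = parse_setup_events([
    "INST 15 1 Trombone",
    "STAF 15 clef bass",
    "BRKT 1 5",
    "BRKT 2 3",
])
# Sub-brackets fall out naturally: a bracket nests inside another
# whenever its staff range is contained in the other's range.
```

The nice property is that the display layer only ever consumes this derived structure, so the underlying events stay a flat, mergeable stream.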

Last edited by ijijn; 03-22-2016 at 04:12 AM. Reason: added further ideas
ijijn is offline   Reply With Quote
Old 03-22-2016, 08:23 PM   #87
ijijn
Human being with feelings
 
ijijn's Avatar
 
Join Date: Apr 2012
Location: Christchurch, New Zealand
Posts: 441
Default How about this, then?

In order to keep things a little easier conceptually and technically without sacrificing features, we could make a few assumptions.

Here's a reworked hierarchy with simpler relationships:
  • Each track has a score, so we can consider them to be effectively synonymous (does anyone hate this idea?)
  • Each channel has one or more staves, and staves can be flexibly grouped into (named) instruments*
  • Each voice is affiliated with a particular parent staff

These relationships would be stored internally in a suitable way, and based on the details an initial set of values would appear at the start of every event list to get things going and provide hooks for later processes (especially the instrument name, although clef and other information could be handy to know). Here's a rough artist's sketch:
  • CHAN <channel> staves <quantity> => x channel has y staves
  • STAF <channel> <staffNumber> clef <type> => setting clef type (time signature, key signature...) per staff, and similar messages could occur later on to change the clef mid-stream
  • INST <channel> <firstStaffNumber> <lastStaffNumber> <instrumentName> => this instrument has this name and lives on these consecutive staves on this particular channel
then when it comes time for the notes themselves, we end up with something like the following:

NOTE 0 64 staff 1 voice 2 => this note is on "voice 2" of "staff 1" of the first channel, and based on the earlier setup information, we can detect that it's bagpipes in A major in 9/8 in alto clef if we want to, and proceed accordingly.

* an instrument wouldn't cross any channel boundaries, but would instead reside within a single channel
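As a thought experiment, here's roughly how a script could resolve a note against that setup information. The CHAN/STAF/INST shapes are the hypothetical text formats sketched in this post (setup events assumed to arrive before any notes), and all names are illustrative, not a real REAPER event syntax or API.

```python
# Sketch of resolving a note against the per-channel setup described above.
# CHAN/STAF/INST are the proposed (hypothetical) text formats from this post.

def build_channel_setup(setup_lines):
    """Digest start-of-play setup events, assuming CHAN lines come first."""
    chans = {}  # channel -> {"staves": {n: {...}}, "instruments": [...]}
    for line in setup_lines:
        p = line.split()
        if p[0] == "CHAN":
            # "CHAN 0 staves 2" => channel 0 has 2 staves
            ch = chans.setdefault(int(p[1]), {"staves": {}, "instruments": []})
            for n in range(int(p[3])):
                ch["staves"].setdefault(n, {})
        elif p[0] == "STAF":
            # "STAF 0 1 clef alto" => channel 0, staff 1, alto clef
            chans[int(p[1])]["staves"].setdefault(int(p[2]), {})[p[3]] = p[4]
        elif p[0] == "INST":
            # "INST 0 0 1 Bagpipes" => name spans staves 0-1 of channel 0
            ch, first, last, name = int(p[1]), int(p[2]), int(p[3]), p[4]
            chans[ch]["instruments"].append((first, last, name))
    return chans

def describe_note(chans, channel, staff, voice):
    """Find the instrument name and clef a note lands on."""
    info = chans[channel]["staves"].get(staff, {})
    name = next((nm for first, last, nm in chans[channel]["instruments"]
                 if first <= staff <= last), None)
    return {"instrument": name, "clef": info.get("clef"), "voice": voice}

chans = build_channel_setup([
    "CHAN 0 staves 2",
    "STAF 0 1 clef alto",
    "INST 0 0 1 Bagpipes",
])
```

With that in place, a later `NOTE 0 64 staff 1 voice 2` can be looked up with `describe_note(chans, 0, 1, 2)` to discover it belongs to the bagpipes staff in alto clef.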
ijijn is offline   Reply With Quote
Old 03-22-2016, 08:43 PM   #88
hamish
Human being with feelings
 
hamish's Avatar
 
Join Date: Sep 2007
Location: The Reflection Free Zone
Posts: 2,996
Default

Hmm, I guess it's looking better. I especially liked the post you made earlier explaining how you filter and route notes/voices with the REAPER busses, unless that was on the 'Notation editor' pre-release thread, in which case sorry. Here it is:
Quote:
Originally Posted by ijijn View Post
Yes, it's basically about per-channel polyphony.
Still I do take exception to referring to track as 'Score', and I don't think I would be alone.

'Score' will always, for most users, equate with 'File'. 'Part' will (not rightly or accurately, but because in practice it's the simplest to get right) equate with 'Track'.

I think the file import should probably default to this as kerry and julian are saying, to Keep It Simple.

As Schwa has said staves are not bound to being per track, as in this current pre-release, but could easily become (optionally) per channel which would be the first step.

I have to say I think that having 16 voices per channel as you suggest would be a valuable asset for REAPER.
hamish is offline   Reply With Quote
Old 03-22-2016, 09:15 PM   #89
ijijn
Human being with feelings
 
ijijn's Avatar
 
Join Date: Apr 2012
Location: Christchurch, New Zealand
Posts: 441
Default

Quote:
Originally Posted by hamish View Post
Hmm, I guess it's looking better. I especially liked the post you made earlier explaining how you filter and route notes/voices with the REAPER busses, unless that was on the 'Notation editor' pre-release thread, in which case sorry. Here it is:

Still I do take exception to referring to track as 'Score', and I don't think I would be alone.

'Score' will always, for most users, equate with 'File'. 'Part' will (not rightly or accurately, but because in practice it's the simplest to get right) equate with 'Track'.

I think the file import should probably default to this as kerry and julian are saying, to Keep It Simple.

As Schwa has said staves are not bound to being per track, as in this current pre-release, but could easily become (optionally) per channel which would be the first step.

I have to say I think that having 16 voices per channel as you suggest would be a valuable asset for REAPER.
Well, I certainly can't keep track of which thread anything was in either, but I'm glad you liked that one.

Indeed, you raise some very valid points. I would certainly agree with the score-file interpretation to a large extent: for me, a track would be more like a sub-score, and I would like to be able to see all tracks combined as the full score. I don't imagine that's going anywhere. As for part-track, I think the interpretation is a very personal choice and everyone has a different story.

For some, a part (or staff) is a track. For me, and this is neither here nor there really, that would be setting my workflow back 10 years to a time before my current ecosystem when I was pretty much forced to do that. It certainly wasn't bad, it just didn't work for my way of thinking.

For others, it's a channel. I'm excited to see this coming soon, as I'm sure EvilDragon is too from his recent comments, and it's a positive step towards universality.

The more I thought about the multiple voices "brute force" idea, the more I realised that it was a bit of a kludge, and that a representation reflecting the actual relationships between the notation components would serve us better.

This is what led me to my immediately prior approach, which allows a part/staff to be either a track or a channel, or anything else for that matter. With this or the second one, we would get proper support for instruments, represented by one or more staves each. Voices are automatically populated whenever they are needed, and we can map things however we want. If you don't want to fiddle around at all and just stick to a part per track then you would never even know it was there: the additional functionality is 100% transparent.

That would be my preferred option, as it should be flexible enough to please everyone. Ideally we would have total separation and independence of all moving parts, but judging from Schwa's recent post that may not happen.

So ... if we can't get that, how about a tweaking of post #88? Basically, everything in 88 as it stands except that all tracks contribute to the overall score: each track is essentially a subscore that stacks vertically within a full score based on track position. It has very similar properties of transparency and extensibility.

Last edited by ijijn; 03-23-2016 at 01:11 AM. Reason: clarifications, additional ideas, making it less (inadvertently) grumpy-sounding ;)
ijijn is offline   Reply With Quote
Old 04-07-2016, 10:22 AM   #90
parr
Human being with feelings
 
Join Date: Jan 2008
Posts: 194
Default

Going back to Schwa's original question:

Quote:
Originally Posted by schwa View Post
This thread is for discussing potential special MIDI handling features in the notation editor.

Please keep in mind that the notation editor is in early development and nothing too exotic will be implemented until the basics of notation are stable.

It's also most helpful to us if the discussion is focused on incremental steps rather than a giant wish list. The most useful thing to think about from our point of view is not the interface seen on the screen, or what dialog windows or file formats are used, but instead the specification of what types of data need to be linked to what.


For example:



1. Percussion notation, or any situation where the written notation differs from the desired MIDI output. We could potentially expand the existing MIDI note name interface to support mappings like this:

36 "Kick" 64
44 "Hat pedal" 62 "X"

Meaning, a MIDI note with pitch 36 will be displayed in the piano roll with the name "Kick", and displayed in the notation editor with pitch 64 (F4). A MIDI note with pitch 44 will be displayed in the piano roll with the name "Hat pedal" and displayed in the notation editor with pitch 62 (D4) and an "X" note head.

Is this reasonable?



2. Linking articulation and dynamics to MIDI messages. For example, a staccato marking triggering a key switch, or a crescendo marking triggering CC volume messages.

We could potentially add a new interface to support mappings like this:

FFF B0 07 7F
Staccato 90 00 7F

Meaning, FFF articulation is linked to CC volume 127. Staccato articulation is linked to the note C0 with volume 127. (This is written in raw MIDI but that does not mean the user interface will be.)

Is this reasonable?
I would split the mapping into two parts:

1. a generic map that uses only standard CCs (expression, velocity, pedal) and duration/combination of notes. This will contain all the dynamics: crescendo, diminuendo, pp, ff, but also accents and sustain pedal. The generic map will also contain other articulations related to duration/combination of notes, like staccato, cymbal rolls, ...

2. an instrument-specific map for instruments with specific articulations. This should allow CCs and key switches, and one should be able to load one per channel and track. Say I prepare maps for Embertone violin and Auddict Solo Violin (both for Kontakt). Then I should be able to load the Embertone map acting on channel 1 of track 1 and the Auddict map acting on channel 2 of track 1, as well.

In case of conflict the specific map should override the generic one, of course; i.e., if I have an instrument with staccato samples, these are the ones to be used and not just a short note.

Finally, the map should be fully customizable, with the possibility of adding new text labels.
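A rough sketch of how that two-layer lookup could behave, with the instrument-specific map overriding the generic one on conflict. All marking names, map contents and key-switch numbers below are made-up placeholders for illustration, not real Embertone/Auddict layouts or a proposed REAPER file format.

```python
# Two-layer articulation mapping as described above: a generic map shared by
# all instruments, overridden per (track, channel) by a specific map.
# Marking names and MIDI payloads are placeholders, purely illustrative.

GENERIC_MAP = {
    "fff":      [("cc", 7, 127)],       # dynamics via CC volume
    "staccato": [("note_scale", 0.5)],  # no samples: just shorten the note
}

# (track, channel) -> instrument-specific overrides
SPECIFIC_MAPS = {
    (1, 1): {"staccato": [("keyswitch", 24)]},  # library has staccato samples
}

def resolve_marking(marking, track, channel):
    """Specific map wins over the generic one on conflict."""
    specific = SPECIFIC_MAPS.get((track, channel), {})
    if marking in specific:
        return specific[marking]
    return GENERIC_MAP.get(marking, [])
```

So a staccato on track 1 / channel 1 fires the library's key switch, while the same marking anywhere else falls back to the generic short-note treatment.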

Sorry if the post is too trivial.
Anyway, I will be very happy with this setup!

juan
parr is offline   Reply With Quote
Old 06-20-2016, 06:37 PM   #91
ceanganb
Human being with feelings
 
Join Date: May 2009
Location: Brazil
Posts: 270
Default

I wonder if schwa halted this development, or if anyone has come up with a custom solution, before I try my own (I'm not skilled in scripts or JS at all).
__________________
Ceanganb
ceanganb is online now   Reply With Quote
Old 10-18-2016, 01:02 PM   #92
Vagalume
Human being with feelings
 
Join Date: Nov 2015
Posts: 198
Default

I just want to say something: "Expression Maps" is the only thing that I miss from Cubase, the one and only.

In fact sometimes (unfortunately) I have to use Cubase for this reason. It saves plenty of time when you work with lots of articulations. Besides, I can say that some of my orchestral composer friends just don't want to give Reaper an opportunity because of this. I hope it can be possible in the future. Thanks for giving us the opportunity to talk about the matter.
Vagalume is offline   Reply With Quote
Old 10-18-2016, 01:26 PM   #93
reddiesel41264
Human being with feelings
 
reddiesel41264's Avatar
 
Join Date: Jan 2012
Location: North East UK
Posts: 409
Default

Quote:
Originally Posted by Vagalume View Post
I just want to say something: "Expression Maps" is the only thing that I miss from Cubase, the one and only.

In fact sometimes (unfortunately) I have to use Cubase for this reason. It saves plenty of time when you work with lots of articulations. Besides, I can say that some of my orchestral composer friends just don't want to give Reaper an opportunity because of this. I hope it can be possible in the future. Thanks for giving us the opportunity to talk about the matter.
+1 Something like this would greatly improve my workflow with sample libraries in both the notation editor and the piano roll.
__________________
David Healey
Purveyor of fine sample libraries (and Kontakt scripting tutorials) - http://xtant-audio.com/
reddiesel41264 is offline   Reply With Quote
Old 10-19-2016, 02:18 AM   #94
hamish
Human being with feelings
 
hamish's Avatar
 
Join Date: Sep 2007
Location: The Reflection Free Zone
Posts: 2,996
Default

I'm also watching the pre's for any work on this. Drum note mapping is badly needed here.
hamish is offline   Reply With Quote
Old 10-31-2016, 08:27 AM   #95
memyselfandus
Human being with feelings
 
memyselfandus's Avatar
 
Join Date: Oct 2008
Posts: 1,595
Default

If we could load custom fonts or something like that so we could do microtonal notation, Reaper would be the go-to software for these guys:

https://www.facebook.com/groups/xenh...ref=ts&fref=ts

It can already be done in the piano roll. Maybe some sort of user add-on, visually, for notation? Pitch bends and everything would be done by the plugins and/or synths. Think u-he Zebra, Alchemy, Chipsounds and tons of other plugins. Different tuning systems can be applied directly from the plugin. Reaper would just have the option to load alternate notation symbols or whatever.


Example

"Download Jacob Barton's amazing Sagibelius 2.0 scripts that let you use Sagittal symbols with the Sibelius music notation software. It wasn't easy, but he found a way. Updated to use the Sagittal 2.0 font mapping and to allow most of the Athenian subset. The zipfile includes a modified version of the font. The documentation and examples will be educational even if you're not using Sibelius."

http://sagittal.org


Example of microtonal notation

Here are some fonts

http://www.music-notation.info/en/fonts/Tempera.html


Some awesome Melodyne stuff
http://www.curtismacdonald.com/using...with-melodyne/
memyselfandus is offline   Reply With Quote
Old 10-31-2016, 08:32 AM   #96
memyselfandus
Human being with feelings
 
memyselfandus's Avatar
 
Join Date: Oct 2008
Posts: 1,595
Default

Quote:
Originally Posted by hamish View Post
I'm also watching the pre's for any work on this. Drum note mapping is badly needed here.
Agreed
memyselfandus is offline   Reply With Quote
Old 10-31-2016, 08:36 AM   #97
reddiesel41264
Human being with feelings
 
reddiesel41264's Avatar
 
Join Date: Jan 2012
Location: North East UK
Posts: 409
Default Tremolos

Is there a way to add tremolo marks to the stem?
reddiesel41264 is offline   Reply With Quote
Old 11-17-2016, 01:37 AM   #98
drowo
Human being with feelings
 
Join Date: Jan 2016
Location: Germany
Posts: 15
Default

Quote:
Originally Posted by schwa View Post
This thread is for discussing potential special MIDI handling features in the notation editor.
...

1. Percussion notation,... support mappings like this:
...
2. Linking articulation and dynamics to MIDI messages.
....
Is this reasonable?
Yes, this is reasonable, and desirable. I have seen a lot of people adding more sophisticated ideas which would all be nice but probably mean a lot of effort. However, I would love to see this original proposal from schwa become available as soon as possible ... even if I have to hack a text file with raw MIDI messages instead of having a nice user interface :-)
drowo is offline   Reply With Quote
Old 11-21-2016, 06:20 PM   #99
ijijn
Human being with feelings
 
ijijn's Avatar
 
Join Date: Apr 2012
Location: Christchurch, New Zealand
Posts: 441
Default

Quote:
Originally Posted by drowo View Post
Yes, this is reasonable, and desirable. I have seen a lot of people adding more sophisticated ideas which would all be nice but probably mean a lot of effort. However, I would love to see this original proposal from schwa become available as soon as possible ... even if I have to hack a text file with raw MIDI messages instead of having a nice user interface :-)
Have you tried VI Drum Mapper?
ijijn is offline   Reply With Quote
Old 11-22-2016, 08:44 AM   #100
memyselfandus
Human being with feelings
 
memyselfandus's Avatar
 
Join Date: Oct 2008
Posts: 1,595
Default

Quote:
Originally Posted by ijijn View Post
Have you tried VI Drum Mapper?
Cool!!
memyselfandus is offline   Reply With Quote
Old 11-24-2016, 01:28 AM   #101
ijijn
Human being with feelings
 
ijijn's Avatar
 
Join Date: Apr 2012
Location: Christchurch, New Zealand
Posts: 441
Default

Quote:
Originally Posted by memyselfandus View Post
Cool!!
Thanks
ijijn is offline   Reply With Quote
Old 12-01-2016, 01:15 PM   #102
schwa
Administrator
 
schwa's Avatar
 
Join Date: Mar 2007
Location: NY
Posts: 9,175
Default

This is a technical detail, but Steinberg requires that note expression events come after the note they are attached to, which is the opposite of how REAPER plays back notation events -- REAPER notation events come before the note they are attached to.

I know there are at least some user FX and scripts in the wild that listen for REAPER notation events and convert them to some type of expression events (other MIDI messages or automation or whatever).

If we change the ordering of played-back notation events to match Steinberg, that will probably make future translations easier, both user-created and cockos-created. But it may break existing FX/scripts.

We'd appreciate any feedback on the issue of whether or not to change the ordering, from any users who are currently using custom tools that listen for notation events on playback.
schwa is offline   Reply With Quote
Old 12-01-2016, 01:29 PM   #103
reddiesel41264
Human being with feelings
 
reddiesel41264's Avatar
 
Join Date: Jan 2012
Location: North East UK
Posts: 409
Default

Quote:
Originally Posted by schwa View Post
This is a technical detail, but Steinberg requires that note expression events come after the note they are attached to, which is the opposite of how REAPER plays back notation events -- REAPER notation events come before the note they are attached to.

I know there are at least some user FX and scripts in the wild that listen for REAPER notation events and convert them to some type of expression events (other MIDI messages or automation or whatever).

If we change the ordering of played-back notation events to match Steinberg, that will probably make future translations easier, both user-created and cockos-created. But it may break existing FX/scripts.

We'd appreciate any feedback on the issue of whether or not to change the ordering, from any users who are currently using custom tools that listen for notation events on playback.
I'm not using any custom scripts atm but I am very interested in this side of Reaper's workings because I'm hoping that at some point in the future we'll have some kind of notation MIDI playback system in Reaper, like expression maps or Sibelius sound sets.

If a note has a dynamic marking but that marking is triggered after the note won't that mean the note that the marking is attached to won't actually be affected by it? Or will the script/program have to look for the marking after the note instead, so basically everything will be one note ahead... are there situations where this could cause confusion about which note a marking should affect?
reddiesel41264 is offline   Reply With Quote
Old 12-01-2016, 01:33 PM   #104
pcartwright
Human being with feelings
 
Join Date: Jan 2009
Posts: 767
Default

As someone who tinkers with notation MIDI messages, I would prefer that any change is made sooner than later (if the change is truly needed).

Since this will impact notation-to-MIDI scripts, would it make sense to include a function in the script API that returns the notation events associated with a note's PPQ value?
pcartwright is offline   Reply With Quote
Old 12-01-2016, 01:36 PM   #105
schwa
Administrator
 
schwa's Avatar
 
Join Date: Mar 2007
Location: NY
Posts: 9,175
Default

Quote:
Originally Posted by reddiesel41264 View Post
If a note has a dynamic marking but that marking is triggered after the note won't that mean the note that the marking is attached to won't actually be affected by it? Or will the script/program have to look for the marking after the note instead, so basically everything will be one note ahead... are there situations where this could cause confusion about which note a marking should affect?

This is entirely an implementation detail. In all cases, an expression event that attaches to a note will have the same timestamp as the note and positively identify the note that it belongs to. The only question is whether the expression event arrives immediately before the note event (as REAPER now does it) or immediately after (as Steinberg does).

Really what I'm asking is: how much pain will it cause if we change this ordering now? No behavior within REAPER will change, the only thing that could possibly be affected are user-written FX or scripts that currently listen for REAPER notation events and do things based on those events.

The options are to change it now, or not change it ever. Not changing it may make scripts and our own possible future implementation of expression events somewhat more complicated, but not overwhelmingly so.

It's like the Windows backslash -- we have a chance to make all of our slashes go the same direction now, or we can live with them always being different.
schwa is offline   Reply With Quote
Old 12-01-2016, 01:44 PM   #106
pcartwright
Human being with feelings
 
Join Date: Jan 2009
Posts: 767
Default

I say do it now while these user scripts are still in their infancy.

But that's just me.
pcartwright is offline   Reply With Quote
Old 12-01-2016, 01:58 PM   #107
reddiesel41264
Human being with feelings
 
reddiesel41264's Avatar
 
Join Date: Jan 2012
Location: North East UK
Posts: 409
Default

Quote:
Originally Posted by pcartwright View Post
I say do it now while these user scripts are still in their infancy.

But that's just me.
I agree with this. Who knows, maybe this will be relevant for a standard someday that allows DAWs to share notation data.
__________________
David Healey
Purveyor of fine sample libraries (and Kontakt scripting tutorials) - http://xtant-audio.com/
reddiesel41264 is offline   Reply With Quote
Old 12-01-2016, 02:14 PM   #108
ceanganb
Human being with feelings
 
Join Date: May 2009
Location: Brazil
Posts: 270
Default

Quote:
Originally Posted by schwa View Post
The options are to change it now, or not change it ever. Not changing it may make scripts and our own possible future implementation of expression events somewhat more complicated, but not overwhelmingly so.
IMHO, avoid complicators and make it easier for both you and script devs. Change it, that is.
__________________
Ceanganb
ceanganb is online now   Reply With Quote
Old 12-01-2016, 03:14 PM   #109
ijijn
Human being with feelings
 
ijijn's Avatar
 
Join Date: Apr 2012
Location: Christchurch, New Zealand
Posts: 441
Default

Oh yes, please feel free to make any helpful changes! Personally I have no problem changing all of my scripts to accommodate, especially as I'm moving to a more DAW-agnostic model anyway.

One potential FR to consider please, if possible...

Would it be within spec to pair these events neatly, for notes that start at exactly the same sample? Here's why:



As you can see, currently we get all the text then all the notes, so it's a bit trickier to match them up (or determine that there is no match). Only having to check one neighbouring message would be easier and more performant.
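To illustrate the one-neighbour idea, here is a minimal sketch (not the REAPER API; the event tuples and function name are invented for illustration). If text and note events are interleaved as adjacent pairs with matching PPQ timestamps, each marking can be resolved by looking at a single neighbouring message:

```python
# Hypothetical event representation: (ppq, kind, payload).
# This is an illustration of the pairing logic, not REAPER's actual stream format.
def pair_events(events):
    """Pair each notation-text event with the note event immediately
    following it at the same PPQ position (text-before-note ordering).
    With interleaved pairs, only one neighbour needs to be checked."""
    pairs = []
    for ev, nxt in zip(events, events[1:]):
        if ev[1] == "text" and nxt[1] == "note" and ev[0] == nxt[0]:
            pairs.append((ev, nxt))
    return pairs

stream = [
    (960, "text", "staccato"), (960, "note", 60),   # paired
    (960, "text", "accent"),   (960, "note", 64),   # paired
    (1920, "note", 62),                             # no marking
]
pairs = pair_events(stream)
```

With the current "all text, then all notes" grouping, the same matching would need a search across every event at that timestamp instead of a single neighbour check.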
ijijn is offline   Reply With Quote
Old 12-01-2016, 03:14 PM   #110
EvilDragon
Human being with feelings
 
EvilDragon's Avatar
 
Join Date: Jun 2009
Location: Croatia
Posts: 18,899
Default

Change it to what Steinberg does, now, while there aren't a ton of scripts out there that relate to this stuff.
EvilDragon is offline   Reply With Quote
Old 12-01-2016, 03:33 PM   #111
ijijn
Human being with feelings
 
ijijn's Avatar
 
Join Date: Apr 2012
Location: Christchurch, New Zealand
Posts: 441
Default

Quote:
Originally Posted by EvilDragon View Post
Change it to what Steinberg does, now, while there aren't a ton of scripts out there that relate to this stuff.
Agreed, but I think that ship has sailed, ED.
ijijn is offline   Reply With Quote
Old 12-01-2016, 06:32 PM   #112
ijijn
Human being with feelings
 
ijijn's Avatar
 
Join Date: Apr 2012
Location: Christchurch, New Zealand
Posts: 441
Default

Does anyone have any information regarding the format of Steinberg's various Expression events?
ijijn is offline   Reply With Quote
Old 12-02-2016, 07:05 PM   #113
ijijn
Human being with feelings
 
ijijn's Avatar
 
Join Date: Apr 2012
Location: Christchurch, New Zealand
Posts: 441
Default

If I may suggest a couple of points to consider before details are set in stone:

Safe-to-send Format

Whatever solution is chosen, I think being able to send these meta-events externally (to VSTs, other DAWs, etc.), rather than having them sanitised out of the stream, would be extremely beneficial for flexibility and interoperability. That way, notation can be shared between applications and harnessed by plugins on various platforms without relying solely on JSFX and friends within Reaper itself. It could also streamline Reaper's MIDI processing slightly, in that it wouldn't need to keep such a careful eye on these things.

It seems that the existing 0xFF wouldn't work in this context: it designates metadata in static MIDI files but a System Reset message in live MIDI streams, which isn't exactly ideal for our purposes!

So ... is SysEx the only format that would actually work reliably for all of this? Following on from discussions early in the piece, I personally wouldn't worry about whether or not it's officially binary or text data, as:
  1. I'm sure Reaper could detect its own distinct header signature for displaying the data appropriately in the event list, and
  2. third-party developers would be fine as long as we know what to look for.

Of course, if there is another message type that would work, I'd be totally on board with that too.
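As a rough sketch of how a SysEx wrapper could work (the header bytes below are made up purely for illustration; REAPER would define its own signature), a sender wraps the notation text in a SysEx message and a receiver checks for the signature before unpacking:

```python
# Hypothetical SysEx wrapper for notation metadata. The manufacturer-style
# header bytes are invented; any real implementation would use its own ID.
HEADER = bytes([0xF0, 0x00, 0x21, 0x7F])  # SysEx start + made-up signature
EOX = 0xF7                                # end-of-exclusive byte

def wrap(payload: bytes) -> bytes:
    """Wrap an ASCII payload in a SysEx message so it survives live MIDI."""
    return HEADER + payload + bytes([EOX])

def unwrap(msg: bytes):
    """Return the payload if msg carries our signature, else None."""
    if msg.startswith(HEADER) and msg[-1] == EOX:
        return msg[len(HEADER):-1]
    return None

raw = wrap(b"NOTE 60 articulation staccato")
```

Since SysEx data bytes must stay below 0x80, plain ASCII text payloads pass through unharmed, and anything without the signature (e.g. another vendor's SysEx) is simply ignored.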

MIDI Bus Support

In a similar vein, I'm also wondering about the possibility of including the MIDI bus information via these events too, rather than relying on the midi_bus variable. Then we could use buses (where available) externally as well. This could be added to the specification later in the piece, but I thought I'd mention it.
ijijn is offline   Reply With Quote
Old 12-03-2016, 07:26 AM   #114
krahosk
Human being with feelings
 
krahosk's Avatar
 
Join Date: Jul 2009
Location: Canada
Posts: 1,502
Default

Quote:
Originally Posted by ceanganb View Post
IMHO, avoid complicators and make it easier for both you and script devs. Change it, that is.
+1 Agree.
krahosk is offline   Reply With Quote
Old 12-03-2016, 02:07 PM   #115
ijijn
Human being with feelings
 
ijijn's Avatar
 
Join Date: Apr 2012
Location: Christchurch, New Zealand
Posts: 441
Default

Or, if the 0xFF format is preferred, please can we have the option of allowing them through? I know how much everyone loves checkboxes...



It's quite straightforward to filter these events ourselves, if we do stumble upon any problematic plugin or external hardware situations. Then we get the best of both worlds.

As it stands, we have scenarios like this:
  1. A VST plugin creates a meta-event
  2. Reaper receives the event back
  3. Reaper mysteriously destroys the event before it reaches the next VST in the chain (although JSFX plugins will happily take it, of course)
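Filtering on our side really is trivial; a sketch of the kind of drop-0xFF filter a script could apply (the message representation here is just raw status-first byte strings, an assumption for illustration):

```python
# Drop any message whose status byte is 0xFF (meta / reset), letting a
# script sanitise the stream itself instead of REAPER doing it silently.
def strip_meta(events):
    return [ev for ev in events if ev[0] != 0xFF]

stream = [
    bytes([0x90, 0x3C, 0x64]),             # note-on C4
    bytes([0xFF, 0x01, 0x04]) + b"pizz",   # meta text event
    bytes([0x80, 0x3C, 0x00]),             # note-off C4
]
kept = strip_meta(stream)
```

So plugins and hardware that choke on these events could be protected with one line, while everything else gets to see them.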

Last edited by ijijn; 12-03-2016 at 07:40 PM. Reason: screenshot and update
ijijn is offline   Reply With Quote
Old 12-03-2016, 03:20 PM   #116
JesterMusician
Human being with feelings
 
Join Date: Jan 2016
Posts: 32
Default

Quote:
Originally Posted by pcartwright View Post
I say do it now while these user scripts are still in their infancy.

But that's just me.
I agree - notation is so new that people who have written such scripts are probably still around to update them accordingly (which probably wouldn't be too hard).
JesterMusician is offline   Reply With Quote
Old 12-20-2016, 04:53 PM   #117
Vagalume
Human being with feelings
 
Join Date: Nov 2015
Posts: 198
Default

I believe that changing it now is the best option.
Vagalume is offline   Reply With Quote
Old 02-24-2017, 11:56 PM   #118
Mr. PC
Human being with feelings
 
Mr. PC's Avatar
 
Join Date: Apr 2010
Location: Cloud 37
Posts: 733
Default

Quote:
Originally Posted by Soli Deo Gloria View Post
Thanks for all, Schwa!! This initial development is already extremely promising!



And, in some cases, CC values can also change articulations, e.g. CC16 value 10 is assigned to Legato, value 20 to Staccato, etc. (CineSamples, some Chris Hein libraries & Prominy, to name a few).

Allow me also to remind you of the usefulness of linking slurs to legato overlaps (and sorry if this has been discussed elsewhere). I don't know if it is fair to quote a couple of features from Sibelius here, but one could even define the overlap range for slurred notes and the length for unslurred ones (in both cases, as percentages of written values).
I've actually included this in my Sibelius Action list:

http://forum.cockos.com/showthread.php?t=177142

Press L to make selected notes legato, and at the same time add a slight overlap (first it glues selected notes, then it lengthens them slightly).
__________________
AlbertMcKay.com
SoundCloud BandCamp
ReaNote Hotkeys to make Reaper notation easy/fast
Mr. PC is offline   Reply With Quote
Old 04-20-2017, 02:34 PM   #119
lalo
Human being with feelings
 
Join Date: Jun 2006
Posts: 84
Default

Quote:
Originally Posted by schwa View Post
This thread is for discussing potential special MIDI handling features in the notation editor.

Please keep in mind that the notation editor is in early development and nothing too exotic will be implemented until the basics of notation are stable.

It's also most helpful to us if the discussion is focused on incremental steps rather than a giant wish list. The most useful thing to think about from our point of view is not the interface seen on the screen, or which dialog windows or file formats are used, but the specification of what types of data need to be linked to what.


For example:



1. Percussion notation, or any situation where the written notation differs from the desired MIDI output. We could potentially expand the existing MIDI note name interface to support mappings like this:

36 "Kick" 64
44 "Hat pedal" 62 "X"

Meaning, a MIDI note with pitch 36 will be displayed in the piano roll with the name "Kick", and displayed in the notation editor with pitch 64 (F4). A MIDI note with pitch 44 will be displayed in the piano roll with the name "Hat pedal" and displayed in the notation editor with pitch 62 (D4) and an "X" note head.

Is this reasonable?



2. Linking articulation and dynamics to MIDI messages. For example, a staccato marking triggering a key switch, or a crescendo marking triggering CC volume messages.

We could potentially add a new interface to support mappings like this:

FFF B0 07 7F
Staccato 90 00 7F

Meaning, FFF articulation is linked to CC volume 127. Staccato articulation is linked to the note C0 with volume 127. (This is written in raw MIDI but that does not mean the user interface will be.)

Is this reasonable?
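The raw-MIDI table quoted above could be consumed by a script along these lines (a speculative sketch; the parser and its rules are assumptions, since only the two example lines of the format are given):

```python
# Parse mapping lines of the form "<name> <hex byte> <hex byte> ...",
# as in schwa's examples. The function name and parsing rules are
# illustrative, not an actual REAPER feature.
def parse_mapping(text):
    table = {}
    for line in text.strip().splitlines():
        name, *raw = line.split()
        table[name] = bytes(int(b, 16) for b in raw)
    return table

table = parse_mapping("""
FFF B0 07 7F
Staccato 90 00 7F
""")
```

Here `table["FFF"]` becomes the CC7 (volume) message with value 127, and `table["Staccato"]` the note-on for pitch 0 at velocity 127, matching the meaning schwa describes.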
Great ideas!!! Are they in development?
Thanks!
__________________
alfonso "lalo" santimone
www.soundcloud.com/alfonsosantimone
www.elgallorojorecords.com
lalo is offline   Reply With Quote