Quote:
Originally Posted by themixtape
But, the User modes for all launchpads provide the same (not exactly logical) midi note layout, which is completely non-editable. It doesn't even seem like there's a hack/workaround for this. Those pads provide certain midi notes and they are set in stone.
It doesn't make a lot of sense that the User 1 and User 2 modes don't provide complete customization.
The buttons are physically the same, independent of any button labels (like "User"): they always produce the same initial signal.
The difference between editable and fixed controllers is WHERE the original signal is mapped. In editable controllers the mapping happens inside the device (in firmware); for fixed controllers the software (DAW) is supposed to do the mapping.
Ableton Live is well prepared for the second scenario, so Ableton-oriented controllers (including Ableton's own) are fixed. That in fact has advantages: the user cannot map them wrongly outside the DAW, and the number of signals to process is small and fixed (e.g. 8x8+2=66 fixed signals for 64 pads plus 2 layer switches, vs. 8x8x2=128 possible signals for 64 pads which can each send 2 different notes depending on an internally processed layer switch).
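The arithmetic above can be sketched in a few lines (note numbers are invented for illustration, not an actual Launchpad layout): the DAW only ever sees 66 distinct incoming signals and expands them to 128 logical targets by tracking the layer switch in software.

```python
# Sketch: software-side mapping for a fixed-layout 8x8 controller.
# Note numbers are illustrative, NOT a real Launchpad layout.

PAD_NOTES = list(range(64))          # 64 pads with fixed notes 0..63
LAYER_BTN_A, LAYER_BTN_B = 100, 101  # 2 layer-switch buttons
# 64 + 2 = 66 distinct signals the DAW has to understand.

def logical_target(note, layer):
    """Combine a fixed pad note with the software-tracked layer
    into one of 8*8*2 = 128 logical targets."""
    return layer * 64 + note

assert len(PAD_NOTES) + 2 == 66
assert len({logical_target(n, l) for n in PAD_NOTES for l in (0, 1)}) == 128
```

The 128 logical functions exist purely in software; taking over exactly this mapping work from the firmware is what the DAW has to do for a fixed controller.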
Unlike Ableton, REAPER is NOT prepared for this scenario. There are no straight MIDI input and output hooks that would allow a logical and simple implementation of controller-specific processing, including the real-time stream (for pads/keys) and filtering out the part that is supposed to do control stuff (Ableton-oriented and many older controllers use the same MIDI input for both). That is under 100 lines of code for the REAPER developers, but my attempts to highlight the topic have so far produced misunderstanding from the community and silence from the developers.
The current "REAPER way" of dealing with MIDI controllers has design problems:
1) The Control Surface API and its examples are supposed to work with controllers separately from performance MIDI streams, and not in real time. Single-port, fixed-layout controllers CANNOT be supported by that scheme, so the mentioned "CSI" is no help for such devices.
2) MIDI FX processors can do re-mapping. But they are by design track-oriented, not device-oriented, with many consequences. An axe can be used as a hammer, but it is not convenient. REAPER supports 2 routes there:
2.a) FX input/output event (MIDI) streams (most MIDI FX use this route). The MIDI flow comes from the track (possibly live MIDI input, recorded MIDI, etc.).
2.b) Hardware MIDI streams (ReaLearn can use this). The FX can ask which MIDI events from hardware are available, independently of the track data and the track arm status.
Unfortunately even 2.b is far from perfect: if the intention is software mapping based on hardware button state, the plug-in has to memorize the layout itself. The obvious problem: a new instance of the plug-in has no idea which layout was previously selected.
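The state problem in 2.b can be shown with a toy remapper (hypothetical note numbers, not ReaLearn's actual API): the meaning of a pad depends on which layer button was last seen, so a freshly created instance that missed that message cannot know the current layout.

```python
# Toy model of a stateful pad remapper. Note numbers are hypothetical.
LAYER_SWITCH_NOTES = {100: 0, 101: 1}  # two layer-switch buttons

class PadRemapper:
    """Remaps fixed pad notes based on the last layer switch seen.
    A new instance has not seen any switch yet -> layout unknown."""
    def __init__(self):
        self.layer = None  # unknown until a layer button arrives

    def process(self, note):
        if note in LAYER_SWITCH_NOTES:
            self.layer = LAYER_SWITCH_NOTES[note]
            return None                    # consumed: control stuff, not performance
        if self.layer is None:
            return note                    # cannot remap: pass through unchanged
        return self.layer * 64 + note      # remapped performance note

r = PadRemapper()
assert r.process(5) == 5    # fresh instance: no idea which layer is active
r.process(101)              # user presses layer button B
assert r.process(5) == 69   # now the same pad lands on layer 1
```

The same sketch also shows the filtering problem from above: layer-button messages must be consumed as control stuff instead of being forwarded as performance data.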
3) MIDI can be observed in the (central) Audio hook, in real time and synced with audio buffer processing. That is the perfect place to do real-time processing for the mentioned controllers. There is one problem (and only one, but unfortunately it is a show stopper): REAPER does not allow modifying MIDI streams in that hook.
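The limitation in point 3 is the difference between an observe-only callback and one that may hand a rewritten stream back to the engine. A minimal model (deliberately NOT REAPER's real API, with MIDI events reduced to note numbers):

```python
# Toy model: observe-only hook vs. a hypothetical transforming hook.
# Events are reduced to plain note numbers for brevity.

def observe_hook(events, log):
    """What the current Audio hook permits: we see every event..."""
    log.extend(events)
    # ...but nothing we compute here ever reaches the engine.

def transform_hook(events):
    """What a modifying hook could do: return the rewritten stream,
    e.g. filter out layer buttons (notes >= 100 in this toy model)."""
    return [e for e in events if e < 100]

log = []
observe_hook([5, 100, 7], log)
assert log == [5, 100, 7]                     # observed, stream unchanged
assert transform_hook([5, 100, 7]) == [5, 7]  # control part filtered out
```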
4) The only way I have found so far to completely steer the MIDI stream from a controller is to organize my own MIDI processing and stuff the required messages through the Virtual Controller. Tricky, not synced, and with possible consequences (see my post about "Stop MIDI Leaks").
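The workaround in point 4 looks roughly like this as a ReaScript (Python) sketch. The remapping formula is illustrative, not a real layout; `RPR_StuffMIDIMessage` is REAPER's API for injecting MIDI events, where mode 0 targets the virtual keyboard queue (check the API docs for the mode values).

```python
# Sketch of the point-4 workaround: remap in our own code, then re-inject
# the result through the Virtual Controller. Remap formula is illustrative.

NOTE_ON = 0x90

def remap(msg1, msg2, msg3, layer):
    """Pure remapping step: translate a fixed pad note into its
    layer-dependent target note (illustrative, not a real layout)."""
    return msg1, layer * 64 + msg2, msg3

def inject(msg1, msg2, msg3):
    # Inside REAPER this would be something like:
    #   RPR_StuffMIDIMessage(0, msg1, msg2, msg3)  # 0 = virtual kbd queue
    # Stuffed messages are NOT sample-synced with the audio stream,
    # which is the "not synced" caveat mentioned above.
    pass

assert remap(NOTE_ON, 5, 127, 1) == (NOTE_ON, 69, 127)
```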
So,
if the REAPER developers allow MIDI stream modification in the Audio hook (or introduce a separate MIDI hook for that), it will open the possibility of supporting Ableton-oriented and single-port controllers. Until then, we are forced to use workarounds and tricks and live with the inconvenience.