04-08-2018, 03:48 PM
#1
Human being with feelings
Join Date: Dec 2010
Location: Sydney
Posts: 238
Seeing MIDI as waveforms
Hey all,
One of the less-enjoyable aspects of working with MIDI is the fact that when it comes to mixdown time, you don't see an actual waveform like you do for all your .wav assets in the timeline.
And I get it. That waveform view would be dictated by the samples being loaded, any mixing features within that particular virtual instrument, any spatial effects (reverbs, delays, etc.) that might also be running inside the virtual instrument, plus a few other variables.
A brief history lesson.
Over the past weekend, I was in conversation with a good friend about how Cool Edit Pro (the forerunner to Adobe Audition) used to have what was referred to as "background mixing".
For those who never used it, understand that this was the late 90s and early noughties. Back then, CPUs didn't have the extreme number-crunching capabilities they do today (I know, 20 years from now, today's CPUs will look lame... but I digress), we didn't have RAM measured in gigabytes, nor did that RAM have the kind of front-side bus speeds we now enjoy. There were lots of reasons why a background mixing process was needed.
For those who never used the app, what it would do was create a TEMPORARY .wav file (on disk, in a temp folder) which was a rendered version of your timeline.
And as you made ANY change to your multitrack project, CEP would re-render just that portion of the timeline and insert it into the temporary .wav file. By replacing only the few samples that were actually modified, it didn't take long to have a fully playable version of your project ready to go at a moment's notice.
This meant that the app could always play back the most intense arrangement, regardless of how many audio tracks you had (there was no MIDI in CEP), regardless of how many plugins you'd used, etc.
Obviously, if you went and altered a reverb plugin which ran across the entire session, then CEP would have to run a complete re-render of the timeline. And this was, more often than not, slower than realtime (because of the aforementioned hardware bottlenecks). But this was a price worth paying. If you let it do its thing and run off the mix in the background, you could then hit play and, without any lag, get your timeline playing back in real time, glitch-free. Sure, you were listening to a rendered 2-track .wav, but it worked.
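The incremental patching described above can be sketched in a few lines. This is a hypothetical illustration, not CEP's actual code: `render_region` stands in for the DAW's renderer, and the "temp file" is just an in-memory buffer.

```python
import numpy as np

SR = 44100  # assumed sample rate

def render_region(tracks, start, end):
    # Hypothetical stand-in for the DAW's renderer:
    # mix all tracks over the sample range [start, end).
    return sum(t[start:end] for t in tracks)

def update_background_mix(mix, tracks, start, end):
    # Splice a freshly rendered region into the cached mixdown,
    # leaving every untouched sample exactly as it was.
    mix[start:end] = render_region(tracks, start, end)
    return mix

# Two one-second "tracks" of placeholder audio.
t1 = np.random.uniform(-0.5, 0.5, SR)
t2 = np.random.uniform(-0.5, 0.5, SR)
mix = render_region([t1, t2], 0, SR)  # initial full render

# Edit track 1 between 0.25 s and 0.5 s, then patch only that span.
t1[SR // 4 : SR // 2] *= 0.0
mix = update_background_mix(mix, [t1, t2], SR // 4, SR // 2)
```

The key point is the last line: only the edited span is re-rendered, so the cached mix stays playable at all times.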
And this whole trip down memory lane made me wonder... could something along these lines be implemented which would allow us to see a "rendered" version of a MIDI track?
Think of the way Reaper currently works.
If you have two guitar parts, each on its own track, and you put those 2 tracks inside a folder track, the folder track creates a dimmed waveform view which represents the combined assets of the child tracks.
What if we could put MIDI tracks inside a folder track and Reaper could somehow run a background render of that FOLDER track (remember, it might have more than just ONE MIDI VSTi loaded!) and then display a waveform view of the MIXED composite of the child tracks?
Now, don't ask me how that's done, I don't write code. I'm hoping the gurus at Cockos can answer this. Is something like this possible?
I'm thinking that with today's processing power, this shouldn't even present a hiccup to anyone's workflow. Most machines would gobble this up without us even noticing.
And as per the Cool Edit Pro execution, I would envision this working such that the minute you go and alter ANYTHING inside that folder track (the timing of a MIDI note, the velocity of a note or two, parameters within the VSTi, a wholesale swapping out of the VSTi being used, or any one of a hundred other variables), Reaper would simply re-render the waveform view that was being displayed on the folder track to reflect the changes.
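What the folder track would need to display is essentially a peak overview of the summed children. A minimal sketch of that reduction, with sine waves standing in for the rendered VSTi outputs (names and numbers are made up for illustration):

```python
import numpy as np

def peak_envelope(samples, pixels):
    # Reduce a mixed buffer to one (min, max) pair per screen pixel,
    # the way a DAW builds a zoomed-out waveform overview.
    cols = np.array_split(samples, pixels)
    return [(float(c.min()), float(c.max())) for c in cols]

sr = 44100
t = np.arange(sr) / sr
# Stand-ins for two rendered child tracks (e.g. two VSTi outputs).
child_a = 0.4 * np.sin(2 * np.pi * 220 * t)
child_b = 0.4 * np.sin(2 * np.pi * 330 * t)

folder_mix = child_a + child_b          # what the folder track sums
peaks = peak_envelope(folder_mix, 800)  # one (min, max) pair per pixel
```

Once the children are rendered in the background, building and redrawing this envelope is cheap; the expensive part is the render itself.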
I'll leave this here for more intelligent minds to debate. I'd love to hear other people's thoughts on this, and whether or not this is doable from a software point of view.
__________________
Cheers,
Bruce.
04-10-2018, 11:50 AM
#2
Human being with feelings
Join Date: Apr 2012
Location: Chicago
Posts: 165
Quote:
Originally Posted by audio2u
If you have two guitar parts, each on its own track, and you put those 2 tracks inside a folder track, the folder track creates a dimmed waveform view which represents the combined assets of the child tracks.
What if we could put MIDI tracks inside a folder track and Reaper could somehow run a background render of that FOLDER track (remember, it might have more than just ONE MIDI VSTi loaded!) and then display a waveform view of the MIXED version of the composite of the child tracks?
...Cool Edit Pro...
I would be absolutely stoked if this were possible/implemented - my brain seems to work better with waveforms than with MIDI items, so it would really help my workflow.
Quote:
Originally Posted by audio2u
...Cool Edit Pro...
Wow - haven't heard that name in a while! Brings back memories...
04-11-2018, 02:05 AM
#3
Human being with feelings
Join Date: Sep 2015
Posts: 690
|
Freeze or render the track.
Masi
04-11-2018, 02:30 AM
#4
Human being with feelings
Join Date: Jun 2015
Location: Indonesia Raya
Posts: 684
|
Plug JS: gfxscope after the VI in your FX chain. Adjust the time window to a couple of seconds.
06-10-2018, 05:16 PM
#5
Human being with feelings
Join Date: Jan 2015
Location: Sunny Florida
Posts: 34
Those of us who have been working with MIDI since it came out, on early sequencers with piano-roll displays, can quite honestly gather more information from a miniaturized piano-roll display than from a waveform readout. Particularly for sustained pads, strings, etc., the waveform display doesn't really show anything at all except a continuous waveform, without easily discernible transients for new notes.
But a piano roll easily shows you what is going on.
01-10-2019, 09:49 PM
#6
Human being with feelings
Join Date: Jan 2019
Posts: 1
Quote:
Originally Posted by Diki Ross
Those of us who have been working with MIDI since it came out, on early sequencers with piano-roll displays, can quite honestly gather more information from a miniaturized piano-roll display than from a waveform readout. Particularly for sustained pads, strings, etc., the waveform display doesn't really show anything at all except a continuous waveform, without easily discernible transients for new notes.
But a piano roll easily shows you what is going on.
Simple counterexample: Suppose I have a clip with one note. You can't tell how long that sound will last, because you don't know the ADSR of the instrument. A waveform will show you that.
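The release-tail point is easy to demonstrate: a short note through a patch with a long release produces audio far longer than the note itself, and only a waveform shows that. A minimal numpy sketch, with made-up envelope times:

```python
import numpy as np

SR = 44100  # assumed sample rate

def adsr(attack, decay, sustain, release, hold):
    # Build an ADSR amplitude envelope: times in seconds,
    # sustain as a 0..1 level, hold = the note's length.
    a = np.linspace(0.0, 1.0, int(SR * attack), endpoint=False)
    d = np.linspace(1.0, sustain, int(SR * decay), endpoint=False)
    s = np.full(int(SR * max(hold - attack - decay, 0)), sustain)
    r = np.linspace(sustain, 0.0, int(SR * release))
    return np.concatenate([a, d, s, r])

# A 0.5 s MIDI note played through a patch with a 2 s release:
env = adsr(attack=0.01, decay=0.1, sustain=0.7, release=2.0, hold=0.5)
audible_seconds = len(env) / SR  # 2.5 s of sound from a 0.5 s note
```

The piano roll shows a half-second bar; the rendered waveform shows the full 2.5 seconds you will actually hear.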
However, it seems to me that this isn't the main benefit the OP is getting at. It's more about easing the load on the CPU by offloading some of the processing to an offline render pass. That lets you run more, and more processor-hungry, plugins.
I think some kind of "track auto-freezing" feature would be pretty awesome, at least as an option. Especially combined with freeze points, so you could choose where in the FX chain the cutoff is, such that everything before it is rendered offline and everything after it is processed online.
This kind of tradeoff seems like something that should be in the user's hands.
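A freeze point like that is really just an index into the FX chain. A toy sketch of the data structure (the plugin names are placeholders, not a real API):

```python
from dataclasses import dataclass

@dataclass
class FxChain:
    # Sketch of a track FX chain with a user-chosen freeze point:
    # plugins before the cutoff are rendered offline, the rest stay live.
    plugins: list
    freeze_point: int = 0  # index of the first plugin kept online

    def frozen(self):
        # Plugins whose output gets baked into a rendered waveform.
        return self.plugins[: self.freeze_point]

    def live(self):
        # Plugins that keep running in real time on the frozen audio.
        return self.plugins[self.freeze_point :]

chain = FxChain(["VSTi synth", "EQ", "Compressor", "Reverb"],
                freeze_point=2)
# "VSTi synth" and "EQ" would be rendered offline;
# "Compressor" and "Reverb" would stay online.
```

Moving `freeze_point` is exactly the tradeoff being described: slide it right to save CPU, slide it left to keep more of the chain tweakable in real time.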
01-10-2019, 10:36 PM
#7
Human being with feelings
Join Date: Jun 2013
Location: Krefeld, Germany
Posts: 14,784
Quote:
Originally Posted by audio2u
when it comes to mixdown time, you don't see an actual waveform like you do for all your .wav assets in the timeline.
Obviously you don't see them; they don't exist yet. They are generated on the fly when you actually render.
-Michael
01-11-2019, 02:10 AM
#8
Human being with feelings
Join Date: Sep 2015
Posts: 690
Quote:
Originally Posted by mschnell
Obviously you don't see them. They do not even exist at all. They are generated on the fly when actually rendering.
Which also means no one knows what the actual waveform will look like until the instrument plugin (and the following effect plugins) have done their job.
By the way, even with audio clips you don't see the actual waveform as it is heard. All you see is the source file with the static volume changes Reaper applies to it (clip and track automation). Any further processing of the audio is not reflected in the display, for the same reason as above: no one knows what all that processing does to the signal until it has been completely processed. And we have to take side-chaining into account as well.
Sorry audio2u, if you for whatever reason need a visual feedback of the complete project you have to render/freeze all of your tracks.
Masi
01-11-2019, 05:56 AM
#9
Human being with feelings
Join Date: May 2006
Location: Surrey, UK
Posts: 19,681
Put a 'scope on the track?
__________________
DarkStar ... interesting, if true. . . . Inspired by ...