Old 07-24-2011, 02:31 PM   #1
karbomusic
Human being with feelings
 
karbomusic's Avatar
 
Join Date: May 2009
Posts: 29,260
Default Why does your DAW perform like ass?

First off, I have been planning to do this for a very long time, I'd say at least 18 months. I have procrastinated because performing the proper testing and analysis is very time-consuming; I already have a good three or four solid days invested in the upcoming posts. The long and short of it is that I wanted to compile some information on various performance issues and bottlenecks, how to solve them, and some other areas where the net as a whole tends to be misinformed. Hopefully, I won't run out of gas, find something incredibly wrong, and delete the post. Fingers crossed.

This first subject, covering this and the next two posts, is "Disk and the Flashing Red Bar of Death". It is going to be a bit long, as it is my journey of starting with 250 audio tracks, determining each possible bottleneck, and correcting them one at a time until I find the true culprit. I eventually settle at around 150 tracks for the final testing, but as you will see there are multiple ways to combat the problem, and all play a distinct role in better disk performance and glitch-free audio. Some of these tests are what I would call advanced, requiring a pretty good understanding of the OS and its underlying components. Hopefully, I can create some step-by-step instructions later so that results can be posted here and then analyzed by others. Most of what follows is my sequential exploration of a DAW's performance as it happened...


[Continued]
__________________
Music is what feelings sound like.

Last edited by karbomusic; 07-24-2011 at 02:53 PM.
karbomusic is offline   Reply With Quote
Old 07-24-2011, 02:33 PM   #2
karbomusic
Human being with feelings
 
karbomusic's Avatar
 
Join Date: May 2009
Posts: 29,260
Default Disk and the Flashing Red Bar of Death Part I

Setup

Using Reaper 4 Beta, I began by creating 250 new tracks in Reaper. I then set and armed all of them to record the same source. This allows me to play a simple guitar piece once, resulting in 250 separate files that are all treated as separate tracks, so Reaper reads 250 files from disk and streams the output to the soundcard driver. Yes, I could have made 249 copies of an identical file, but earlier tests showed some unexplained inconsistencies when attempting to repro a bottleneck, so to be safe, I did it this way. You might also wonder why I didn't just hunt down 250 existing files and toss those in Reaper. Actually, I did that, but all the files were so unrelated to each other that together they basically sounded like white noise, meaning I couldn't determine by ear if there were pops or glitches. The project and files were recorded @ 48k/24bit using zero plugins or anything else that might cause issues not directly related to streaming 250 audio tracks. I also kept all of Reaper's default disk/buffer settings as-is.

Recording/Playback

Recording the tracks was no problem; they went down fine. I simply played for a minute or so, then pressed stop. Upon playback, as you would imagine, it was mayhem. It took forever for playback to start, and I never really heard any audio other than a painful glitchy wail or two at first; then the red bar began flashing in a never-ending cadence. Audio is pretty much out-to-lunch at this point, hence the red bar and zero meter activity:




Troubleshooting

Granted, I already knew that 250 tracks is likely far beyond what I can achieve with my current setup regardless of any tweaking; however, I chose this number for that very reason. I needed to be able to measure and confirm who, what, when and where the bottleneck was, and to do that I needed a true bottleneck situation. I know a thing or two about the built-in Performance Monitor in Windows, so that's where I began. I loaded up some of my favorite disk counters:

Avg Disk Sec/Read
Average Disk Queue Length
Split IO/sec

I knew that Avg Disk sec/Read ideally needs to be below 0.020, and it was. I knew that Avg Disk Queue Length should "ideally" be no higher than 1 per spindle, and it was well above that:
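As a rough illustration, those rules of thumb can be wrapped in a quick script. This is just my sketch for sanity-checking a Performance Monitor sample by hand; the thresholds are the guidelines from this post, not hard limits, and the sample values fed in are invented:

```python
# Rough sketch: score one Performance Monitor sample against the
# rules of thumb discussed above. Thresholds are guidelines, not
# hard limits, and the sample values below are invented.

def check_disk_counters(avg_sec_per_read, avg_queue_length,
                        split_io_per_sec, spindles=1):
    """Return (counter, verdict) pairs for a single counter sample."""
    return [
        ("Avg Disk sec/Read",
         "OK" if avg_sec_per_read < 0.020 else "HIGH - reads are slow"),
        ("Avg Disk Queue Length",
         "OK" if avg_queue_length <= 1.0 * spindles
         else "HIGH - reads are stacking up"),
        ("Split IO/sec",
         "OK" if split_io_per_sec == 0 else "HIGH - likely fragmentation"),
    ]

# A sample resembling the stressed single-disk test:
for counter, verdict in check_disk_counters(0.012, 7.8, 140.0):
    print(f"{counter}: {verdict}")
```

Note the `spindles` parameter: splitting files across four physical drives, as I do below, raises the acceptable total queue length accordingly.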



Each queued read is waiting for the one in front of it, so the read operations stack up. Maybe this is the issue. The best way to test is to reduce the queue length. I did so first by removing 75% of the tracks; the queue length dropped considerably, but the audio was barely better. Next, I added all 250 tracks back but split them up across four physical drives (~62 files per spindle) and tried again. The queue length for each disk was much better, but guess what: the FRBOD was just as bad and still no audio:



I had also noticed that the Split IO counter was high when I had all the files on the one disk. It should really be zero. Split IO means we are performing multiple IOs that should have been one; this is what happens when the disk is fragmented, for example. To confirm, I used DiskView by Sysinternals to have a look at the disk, and I saw something interesting: fragmentation. The Split IO counter was exactly right. Even though the files were freshly recorded, for some reason the individual files were not contiguous, i.e. the red stuff:



Sysinternals also has a great tool to deal with this. It's called Contig.exe, a command-line tool that will take individual fragmented files and make them contiguous. I wrote a .NET program a few years ago that is a front-end to Contig, which allows me to point to a folder and tell Contig to process every file in that folder, so that's what I did: ran Contig against all 250 files. As you can see from the output below, every file was in multiple fragments:
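If you don't want to write a .NET front-end, the same folder walk fits in a few lines of script. This is a sketch, not the program I actually used; it assumes Contig.exe is on the PATH, the `-v` switch is Contig's verbose output, and the project path in the comment is made up:

```python
import os
import subprocess

def contig_folder(folder, dry_run=False):
    """Run Sysinternals Contig.exe against every file in `folder`.

    With dry_run=True, only build and return the command lines instead
    of executing them (useful for previewing what would run)."""
    commands = []
    for name in sorted(os.listdir(folder)):
        path = os.path.join(folder, name)
        if os.path.isfile(path):
            cmd = ["contig.exe", "-v", path]  # -v = verbose output
            commands.append(cmd)
            if not dry_run:
                subprocess.run(cmd, check=True)
    return commands

# Preview what would run against a (hypothetical) project folder:
# for cmd in contig_folder(r"D:\Projects\250Tracks", dry_run=True):
#     print(" ".join(cmd))
```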



Once Contig finished its job, I ran DiskView again and it looked much better, and the Split IO counter in Perfmon is now happy and flatlined @ zero, just like it should be:



But guess what? You guessed it: the red bar continues to flash and the audio is still out-to-lunch. I really knew this would be the case; I'm running 250 tracks @ 24 bit, for god's sake, which is far beyond what I can pull off. Still, everything we have learned so far is important, because any of these issues can cause trouble at lower track counts. In other words, if your system is right on the edge, any of what we have measured so far can push it just over the edge. Remember, I'm purposely and severely overtaxing the system to show each issue at its worst. Because of this, I decide to lower the track count closer to 150; I wanted to start getting down to more real-world numbers by now. Obviously, even on a purposely overtaxed system, "something" is "the thing" causing the issue, and I still have not proven what it is, even if I am running way too many tracks for the system. Oh, what could it be? Read on....

[Continued]
__________________
Music is what feelings sound like.

Last edited by karbomusic; 07-24-2011 at 05:28 PM.
karbomusic is offline   Reply With Quote
Old 07-24-2011, 02:35 PM   #3
karbomusic
Human being with feelings
 
karbomusic's Avatar
 
Join Date: May 2009
Posts: 29,260
Default Disk and the Flashing Red Bar of Death Part II

I've eliminated most known disk bottlenecks. I'm able to move data off the disk at a reasonable rate if asked, I've fixed a fragmentation issue, and I've gotten the queue length down enough that I should have seen some improvement, but I haven't seen very much. Well, I see we have some processor action as well as DPCs that are a bit high. Could this be it? Let's find out. I start by looking at DPCLat, and only when I run Reaper does the latency go up. Actually, it and the processor rise to an unacceptable level. I try various tests, disabling services, jacking up the thread priority and so on, but nothing helps. Finally, I decide to go deep and use XPerf. If you don't know what XPerf is, it is pretty much the cat's meow when diagnosing performance issues. It's not an easy tool to use, but if you know how to use it, performance issues simply cannot hide from it. So, I kicked off XPerf with the following commands, which measure all types of latency (disk, DPC and proc), and pressed play in Reaper to capture the data:



I then stopped the trace and merged the data into a final .etl file using the command xperf -d h:\reaperMerged.etl. Once that was complete, I opened it in the XPerfView utility using xperf h:\reaperMerged.etl. I won't go through all the XPerf setup details, as it's a tiny bit complicated and involves symbol servers and such; I'll try to save those details for a future post. What I basically did was hunt down the major consumers of DPCs that could be tied back to Reaper. I found them by loading the summary table for DPC usage during the time Reaper was having issues:

Main DPC Graph




Summary Table of the selection above



Hmm... The huge majority of DPC action is coming from Wdf01000.sys. This is part of the Windows Driver Framework; specifically the new one in Windows 7. Since I have public symbols loaded from MS, I can even see the function names, and we have 294,851 DPCs under FxDPC::FxDPCThunk. After about an hour of researching this file, that function, and the fact that I'm using Firewire, I found that Windows 7 has an issue where its new Firewire stack uses many more DPCs/CPU cycles than the legacy stack. Whoa, this explains some of the DPC issues I saw. I even found the KB that explains the issue: http://support.microsoft.com/kb/2450963

I found that currently there is only one workaround, which is to go into Device Manager and switch the 1394 OHCI driver to the pre-Windows 7 legacy version as described here: http://kc.flex-radio.com/KnowledgebaseArticle50433.aspx. But before I try changing it, I want to prove to myself that I'm hitting at least some of this issue. I just happen to have a POD X3, which uses USB, not Firewire. If the DPC issue I am seeing is for real, then switching to the POD X3's driver in Reaper should dramatically drop the DPCs in DPCLat. So, I open Reaper, switch the driver and voila, DPCs drop dramatically. I really am hitting this issue. But wait, my audio still sucks and the FRBOD persists. Dammit, Jim! Again, the issue is real; it could be something you hit in the future. I'm hitting it right now, but it just doesn't happen to be the true root cause in my test setup. I even reverted to the 1394 legacy driver to prove it. DPC was slightly better, but audio was, you guessed it, still out-to-lunch. And I'm still not satisfied we are getting disk data the way we should be.

By now, I've identified several real-world causes and corrected each, but I still haven't found exactly what the real bottleneck is. I decide to look back at disk once more, based on something I saw in Reaper's performance meter earlier:



Notice the disk reads. WTF? I think I can move much more data than that, and it shows 143k? I don't really have any idea what that number truly means, though, so I could very well be barking up the wrong tree. I sort of figured it out by jumping back into Performance Monitor and looking at disk counters; I eventually found that one of the Disk/Read counters shows almost identical numbers when Reaper is running. I'm still a little stumped, but I assume there may be a place to modify this somewhere, and if so, and I can get it higher, maybe I'll see results. Like I said, I think that number would help if it were higher. So back to tools again: I decided to use Process Monitor this time, as it will show me disk activity down to the kernel function level. I filtered down to Reaper reading one of the audio files and noticed that during the ReadFile function the length is 262144.




I don't know enough about these functions at this level to say whether that's normal, low or high. Going back to Reaper's disk meter, I noticed that it actually hovered around this value, but why the hell isn't it higher, and should it even be? I noticed it jump up once or twice, and when it did, audio played, even if only for a moment!

I looked up the ReadFile function and found that when you call it, you have to pass the "length" of the buffer to the function; this is the buffer that data is read into as it comes off the disk. If that's the case, there is likely only one process that can actually set it: the process that is reading the file. REAPER!! I went into Reaper's Options > Preferences > Buffering > Advanced Disk I/O Options, and in all its beauty and glory there was the exact same number I saw in Process Monitor: "Read Buffer Size: 262144". At this point I'm thinking this is a multiple of 1024 and must be a happy medium between track count and memory consumption; the read buffer size multiplied by the number of tracks will likely jack up the memory used. I quadrupled this number as a test step, and it more than doubled Reaper's memory footprint immediately, but guess what else happened? 150 tracks began playing without glitches and the red bar of death disappeared. EDIT: From my observations, you should close and restart Reaper for the change to take effect.
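To see why quadrupling the buffer blew up the memory footprint, here is the back-of-envelope math. The one-buffer-per-track assumption is my guess at how the allocation scales, not something I've confirmed in Reaper's code:

```python
# Assumption (mine, unconfirmed): roughly one read buffer per playing
# track, so buffer memory scales as buffer_size * track_count.

def read_buffer_memory_mb(buffer_size_bytes, tracks):
    assert buffer_size_bytes % 1024 == 0, "keep it a multiple of 1024"
    return buffer_size_bytes * tracks / (1024 * 1024)

DEFAULT = 262144  # the Read Buffer Size seen in Process Monitor

print(read_buffer_memory_mb(DEFAULT, 150))      # 37.5 MB at the default
print(read_buffer_memory_mb(DEFAULT * 4, 150))  # 150.0 MB after quadrupling
```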

Summary

This does NOT mean everyone should go in and jack up that number. I also noticed there seems to be a cutoff point where raising it doesn't change anything. I still recommend leaving it at the default unless you know exactly why you raised it. If you do change it, make sure it is a multiple of 1024, and realize you are quickly raising the amount of memory Reaper is going to use. Also, I didn't test a single thing outside of this track test, and that setting could have plenty of other consequences that I am completely unaware of.

The main reason for this entire journey is that each step along the way measures and confirms with near 100% certainty what is and what is not the problem. There should be enough information above to help you understand more about how DAWs perform and how to troubleshoot performance, with less reliance on "black box" methods. As you'll see in the following post, I'm not a big fan of making lots of tweaks if you can't prove they apply to you; there is no faster way to an unstable DAW than to be in that habit. I'll try at some point to provide specific instructions and links for some of these tests, but I'm truly exhausted after completing what I have so far.
__________________
Music is what feelings sound like.

Last edited by karbomusic; 07-24-2011 at 05:39 PM.
karbomusic is offline   Reply With Quote
Old 07-24-2011, 02:39 PM   #4
karbomusic
Human being with feelings
 
karbomusic's Avatar
 
Join Date: May 2009
Posts: 29,260
Default Virtual Memory and the Page File

This is a duplication from another thread I replied to previously. I'm adding it here so the performance stuff is together.

1. The vast majority of information on the internet concerning computers and performance is hearsay and/or just plain wrong, even from credible sources at times. A high percentage of the information is often nothing more than the blind leading the blind: "I did this and it seemed to work, you should try it". Arggg...

2. Most of us musicians are more than intelligent enough to understand all of this, but there is a time investment in the learning curve, and on top of that, measuring performance in such a way as to come to valid conclusions can take up lots of time better spent making music. See the first posts above.

As far as performance goes, every situation is different and should be approached as such when possible. Outside of the no-brainer stuff, chasing symptoms by blindly making changes because "dude on the Internet had a similar issue", without actually being able to verify first, eventually leads to lots of unstable machines and then blaming the OS. The long and short of it is: don't make the change unless you KNOW it applies to you and why.

==================
Virtual Memory
==================

Again, I'm trying to keep it as simple as possible for the time being. Virtual memory is managed by the VMM (Virtual Memory Manager). It is a layer of sorts between a process and physical memory/page file. In the simplest terms, the VMM "maps" virtual memory to physical memory and the page file, and the process sees only the virtual memory, as one large piece of memory. Typically the process doesn't know or care what is physical and what is page file.

For those in the know, I purposely left kernel-mode memory, user-mode memory, 3GB, USERVA, PAE and AWE out of this post in order to get the basic idea across; adding all of that minutiae right out of the gate is part of the problem. Moving on... here is a very simple diagram showing the memory manager mapping physical memory resources to appear as real memory to a process. Why is another discussion, but it is basically a method that allows the computer to act as if it has more physical memory than it actually does:

Code:
==============    ==============
Physical RAM        PAGE FILE
==============    ==============
    \                   /
================================
	Virtual Memory
================================
          \       /
     ===================
         Process.exe
     ===================

================
Page File Size
================

So the heart of the matter in this post is page file size. As I stated before, there is no one-size-fits-all rule or algorithm that will tell you the proper page file size. Why? Because every system has different performance and memory requirements. This is precisely why OSes tend to have dynamically adjusting page files out-of-the-box. Thus, you need to do a few simple investigations on your own to calculate the size that's best for your system. I also mentioned that Task Manager doesn't always tell the right story; the numbers are somewhat accurate, but which ones it displays and what it calls them is misleading and has changed from OS version to OS version.

The best tool to use instead is Process Explorer, and there are three values it displays that are critical in determining the best page file size for your system. They are listed under "Commit Charge".




Notice the bottom left of the picture above, under Commit Charge (Current, Limit, Peak). Commit charge is a true commitment by Windows that a process will have the memory it has been promised, meaning it must be available as virtual memory (even if it is a mix of page file and physical; see the diagram from before). Here are the definitions:

Current: The current amount of committed memory, i.e. the sum of committed memory for all processes.
Limit: The current hard limit that can safely be committed (approximately page file size + physical RAM).
Peak: The highest amount of memory that has been committed since the last reboot.

It's important to note that if you do not set a fixed page file size, the commit limit can grow/shrink based on system needs. If you do have a fixed size, then it is pretty much a hard limit, because the page file cannot be expanded to meet unexpected demands in the future. Historically, DAW users have set fixed page file sizes so that growing/shrinking of the page file doesn't cause audio glitches. I will say that it is somewhat rare these days for this to happen once the machine is "settled in", provided you don't hit any sudden extremes and have a decent amount of RAM. However, fixed is perfectly fine so long as it is calculated with proper headroom to meet sudden demand.

Finally, the way to find the best "fixed" page file size for YOUR DAW (should you choose to use fixed) is to look at the peak AFTER the machine has run for a few days, a week, or however long it takes for you to use the system to the maximum performance limits it usually ever sees, without reboots. For example, if the peak after that period is 6GB and you have 3GB of RAM installed, then a safe page file setting might be 4GB, which includes 1GB of emergency memory headroom. I.e.: 3GB physical + 4GB page file = 7GB commit charge limit, which is decently above the 6GB peak you previously measured.
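That arithmetic is simple enough to put in a tiny helper. A sketch of the sizing rule just described; the 1GB headroom default is only the figure from my example, so pick your own:

```python
# Fixed page file = (measured peak commit - physical RAM) + headroom.
# The 1 GB headroom default is just the figure from the example above.

def fixed_page_file_gb(peak_commit_gb, physical_ram_gb, headroom_gb=1.0):
    shortfall = max(peak_commit_gb - physical_ram_gb, 0.0)
    return shortfall + headroom_gb

# The example from the post: 6 GB measured peak on a 3 GB machine.
page_file = fixed_page_file_gb(6.0, 3.0)
commit_limit = 3.0 + page_file
print(page_file, commit_limit)  # 4.0 GB page file, 7.0 GB commit limit
```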

Reminder: I purposely left out 32bit/64bit and 3GB switch discussions for now because the basic ideas above are more important to understand first. Even if the numbers I quoted don't align with 32 or 64bit boundaries, the basic concept is the same.

"I have a 64 bit machine and a huge amount of memory, I don't need no stinkin' page file."

True. Windows will run just fine without a page file but there are a couple of things to be aware of:

1. If the system BSODs and creates a kernel dump, it must be written to the system drive; even disk drivers are bypassed to write it. Since for all we know the BSOD was caused by a disk driver, the dump is literally written from memory directly to the page file sector by sector. Later, when the system reboots, the .dmp file is extracted from the page file. If you have no page file on the system drive and you need that dump for troubleshooting by a vendor or even Cockos, it's not going to get written. To cover for this, keep a page file of around 200-256MB there. Why the system drive? Because Windows can depend on the fact that it exists, since it booted from it.

2. It can actually hurt performance in some cases, because with no page file, every single resource that uses memory on the system must remain in physical memory at all times, even if you aren't actively using it. Even processes and data structures that have nothing to do with your DAW, and won't ever be needed while running it, must now be kept in memory at all times, because they cannot be paged out.

"I heard that a large page file hurts performance due to excessive paging"

False. Excessive paging is due to not having enough physical memory to supply demand. If the page file is large but not used much (the machine isn't memory starved), it's not hurting performance, outside of the chance it gets fragmented over time (a different topic); but then again, just sitting there isn't causing fragmentation either. If fragmentation is ever a concern, just recreate the page file, reboot, and problem solved.

I hope this helps at least a little.
__________________
Music is what feelings sound like.

Last edited by karbomusic; 07-24-2011 at 05:50 PM.
karbomusic is offline   Reply With Quote
Old 07-24-2011, 03:56 PM   #5
Ed Zeppeli
Human being with feelings
 
Ed Zeppeli's Avatar
 
Join Date: Aug 2010
Location: Nanaimo, BC
Posts: 559
Default

hey thanks for taking the time to post up all this analysis.

I'll be reading in more detail later but I very much appreciate your efforts here.

Cheers,

Warren
Ed Zeppeli is offline   Reply With Quote
Old 07-24-2011, 04:24 PM   #6
ArrowHead
Human being with feelings
 
Join Date: Jul 2011
Posts: 227
Default

Awesome write up. Your dedication and effort is amazing. I'll be reading this through several times.
ArrowHead is offline   Reply With Quote
Old 07-24-2011, 04:57 PM   #7
Guido
Human being with feelings
 
Join Date: Nov 2007
Posts: 674
Default Thank You

Hi,

Thank you SOOOOOO MUCH. I originally came from Macland around 2006 and found Reaper in 2007... and this information is something I've been looking for for ages!
Thank you for this. Awesome!

Guido
Guido is offline   Reply With Quote
Old 07-24-2011, 05:02 PM   #8
Quicksilver
Human being with feelings
 
Quicksilver's Avatar
 
Join Date: May 2010
Location: Australia
Posts: 125
Default

Fantastic guide, thanks for taking the time to write it and share it with us.
Quicksilver is offline   Reply With Quote
Old 07-24-2011, 05:23 PM   #9
flmason
Human being with feelings
 
Join Date: Nov 2009
Posts: 642
Default

Cool stuff Karbo.

I was somewhat involved in this sort of thing in the mainframe world.

Been wondering how to transfer it over to the Window/Linux/Mac world, considering the way the job market is these days.

Would be very cool if I could do internals development on these small system OS's some day.
flmason is offline   Reply With Quote
Old 07-24-2011, 05:55 PM   #10
karbomusic
Human being with feelings
 
karbomusic's Avatar
 
Join Date: May 2009
Posts: 29,260
Default

Quote:
Originally Posted by flmason View Post
Would be very cool if I could do internals development on these small system OS's some day.
Windows Internals is pretty much THE book to start with for the internals stuff for Windows. The author of most of the tools I mentioned (and that book) did not initially work for MS when he wrote the tools. He was so good at figuring out how Windows ticked, they hired him in order to "get him off the streets".
__________________
Music is what feelings sound like.
karbomusic is offline   Reply With Quote
Old 07-24-2011, 06:31 PM   #11
bluzkat
Human being with feelings
 
bluzkat's Avatar
 
Join Date: Jun 2007
Location: Northern Michigan
Posts: 6,919
Default

Quote:
Originally Posted by karbomusic View Post
He was so good at figuring out how Windows ticked, they hired him in order to "get him off the streets"
karbo is talking about Mark Russinovich, the guy is just amazing!! I have been using some of Mark's utilities on my PCs for as long as I can remember.

Remember the Sony/BMG 'root kit' incident a few years back?? Mark was the 'hero'.

See here: http://en.wikipedia.org/wiki/Sony_BM...ootkit_scandal.

One of my favorite 'tech' people.

@Karbo... I'm looking forward to the rest of your investigation, excellent stuff. Thank you for your effort.


__________________
Peace...
bluzkat

Last edited by bluzkat; 07-24-2011 at 08:32 PM.
bluzkat is offline   Reply With Quote
Old 07-24-2011, 07:04 PM   #12
tls11823
Human being with feelings
 
tls11823's Avatar
 
Join Date: Aug 2010
Location: Harrisburg, PA USA
Posts: 1,481
Default

Quote:
Originally Posted by karbomusic View Post
Windows Internals is pretty much THE book to start with for the internals stuff for Windows.
Way off topic, but years ago, when I was teaching myself UNIX, I bought a book called UNIX Internals. My wife, who had been to the gynecologist a few days before, saw it and said, "I hate internals. They're really uncomfortable." I've never looked at that word the same way since.

Anyway, thanks so much for this thread. It's too much information to absorb at the moment, but I plan to devour it when I have some spare time.
__________________
We act as though comfort and luxury were the chief requirements of life, when all that we need to make us happy is something to be enthusiastic about.
--Charles Kingsley... or maybe Albert Einstein... definitely somebody wiser than myself--
tls11823 is offline   Reply With Quote
Old 07-25-2011, 05:02 AM   #13
-R-
Human being with feelings
 
-R-'s Avatar
 
Join Date: Mar 2010
Posts: 305
Default

Excellent thread! Thank you very much. We'll take as much as you can give!
Have a nice day
-R- is offline   Reply With Quote
Old 07-25-2011, 05:48 AM   #14
nofish
Human being with feelings
 
nofish's Avatar
 
Join Date: Oct 2007
Location: home is where the heart is
Posts: 12,096
Default

Excellent stuff, thanks karbo.

Hey, the thread title made me smile (reference to the "why do your mixes sound like ass" thread ?!)

btw, your post about virtual memory recently was the first one I could follow 100% on this topic, so looking forward to reading this one.
nofish is offline   Reply With Quote
Old 07-25-2011, 08:55 AM   #15
Snap
Human being with feelings
 
Snap's Avatar
 
Join Date: Jul 2011
Posts: 850
Default

Thanks so much for sharing and all your effort. Hats off!!! Amazing job. Big thanks.
Snap is offline   Reply With Quote
Old 07-25-2011, 09:18 AM   #16
chip mcdonald
Human being with feelings
 
chip mcdonald's Avatar
 
Join Date: May 2006
Location: NA - North Augusta South Carolina
Posts: 4,294
Default

Thanks for the effort, as I immediately recognized the waveform profile of your DPC situation. I've got a Line 6 UX-2 on my office (low end) laptop, and I get a continual "gurgle" from what Windows system monitor reports as DPC's, regardless of activity in Reaper.

I had gotten to the point where I thought it probably had something to do with the USB connection; what did you mean about "switching to the X3" driver in Reaper? Are you also using a 1394 device for audio, or is there some way the 1394ohci.sys driver interacts with USB (even if one doesn't have a 1394 port, such as my laptop)?
__________________
]]] guitar lessons - www.chipmcdonald.com [[[
WEAR A FRAKKING MASK!!!!
chip mcdonald is offline   Reply With Quote
Old 07-25-2011, 09:19 AM   #17
ivansc
Human being with feelings
 
Join Date: Aug 2007
Location: Near Cambridge UK and Near Questembert, France
Posts: 22,754
Default

(holds out gruel bowl) PLease sir, can I have some more, sir?


Excellent stuff. I did a very basic tutorial on OS9/OSK years and years ago which was nowhere near as in-depth as this promises to be, and to my smug delight, got some very nice feedback.
You deserve far more for taking this on, mate.

Thank you - I await part 2 with interest...


P.S. And written in a way that pretty much anyone could follow, too!
ivansc is offline   Reply With Quote
Old 07-25-2011, 10:53 AM   #18
karbomusic
Human being with feelings
 
karbomusic's Avatar
 
Join Date: May 2009
Posts: 29,260
Default

Quote:
Originally Posted by chip mcdonald View Post
Thanks for the effort, as I immediately recognized the waveform profile of your DPC situation. I've got a Line 6 UX-2 on my office (low end) laptop, and I get a continual "gurgle" from what Windows system monitor reports as DPC's, regardless of activity in Reaper.

I had gotten to the point where I thought it probably had something to do with the USB connection; what did you mean about "switching to the X3" driver in Reaper? Are you also using a 1394 device for audio, or is there some way the 1394ohci.sys driver interacts with USB (even if one doesn't have a 1394 port, such as my laptop)?
Hi Chip,

Let me clarify a couple of things that I think will help. Typically when we speak of DPC usage via DPCLat.exe etc., it has to do with "rogue" drivers unrelated to the DAW. Enabled wireless NICs are notorious for this, for example, and you'll see high DPC when Reaper isn't even running.

In my test case, my DPCs were very low when Reaper wasn't running (50-100us). Only when I pressed play did the DPCs go into the red. After running XPerf, which allowed me to see just what might be causing the high DPCs, I found it was happening in the actual Firewire stack, which explained why it only happened when playing audio (I used my RME Fireface 800 for 99% of the testing). I then found a known issue in the new Windows 7 Firewire stack that was possibly responsible.

Soooo, I then assumed, based on what I learned above, that a soundcard using USB instead of Firewire might give different results. I thought that might help me confirm whether what I saw was real. I just happen to have a USB soundcard I never really use, the POD X3, which uses USB and NOT Firewire, so I switched to the POD as my ASIO device for a moment as a test. When I did so, the DPCs dropped. So, it wasn't really an issue with the Fireface itself or the POD, but with the 1394 OHCI driver in the Windows Firewire stack; the POD just helped me prove it. Once that was done, I went back to my FF800 to complete the testing.

As far as your DPCs being constantly high even when Reaper isn't running, I'd suspect my first explanation: some unrelated driver, such as the wireless network adapter. The only catch is that from what I can tell, the old RATT tool doesn't give the proper information to diagnose just which driver it is in Vista/Win7. The RATT tool is what used to be used in conjunction with DPCLat to find the actual driver causing the issue; in later versions of Windows, XPerf does the job the RATT tool used to do. Hopefully, I can put together a step-by-step on how to do this with XPerf soon. Hope that makes sense.
__________________
Music is what feelings sound like.
karbomusic is offline   Reply With Quote
Old 07-25-2011, 01:24 PM   #19
SiKo
Human being with feelings
 
SiKo's Avatar
 
Join Date: Aug 2008
Location: dusty hot place
Posts: 1,492
Default

Thanks km! Very nice article. Lots of detailed info, references, tools, good stuff.

Xperf link is extremely nice.

Thanks again!
__________________
... yOu aNd mE are ...
SiKo is offline   Reply With Quote
Old 07-25-2011, 03:02 PM   #20
Mercado_Negro
Moderator
 
Mercado_Negro's Avatar
 
Join Date: Aug 2007
Location: Caracas, Venezuela
Posts: 8,676
Default

Thanks karbo,

Bookmarked!
__________________
Pressure is what turns coal into diamonds - Michael a.k.a. Runaway
Mercado_Negro is offline   Reply With Quote
Old 07-26-2011, 12:09 AM   #21
steveo42
Human being with feelings
 
Join Date: Dec 2007
Posts: 385
Default

This is a really cool thread!!!
Thanks for posting!
steveo42 is offline   Reply With Quote
Old 07-26-2011, 12:56 AM   #22
Anomaly
Human being with feelings
 
Anomaly's Avatar
 
Join Date: Sep 2007
Posts: 642
Default

Karbo

Thank you for the detailed, technical report. I enjoyed reading it.
I'd like to encourage you to make similar test with other disk read modes as well.

I have always found that the default disk read mode (Asynchronous buffered) performs worse than Asynchronous unbuffered, which I have used since day one. I have to admit I don't understand why Asynchronous buffered is the default in Reaper. But at least for me, the unbuffered mode eliminates the slow responsiveness and red-bar flashing issues when the track count is high.

Your test also confirms what I found out concerning Firewire CPU usage. That was the one reason I got rid of my Firewire audio interface and moved back to internal.

Cheers

Last edited by Anomaly; 07-26-2011 at 05:01 AM.
Anomaly is offline   Reply With Quote
Old 07-26-2011, 01:07 AM   #23
Bassman1
Human being with feelings
 
Bassman1's Avatar
 
Join Date: Mar 2007
Location: UK
Posts: 472
Default

For the layman/musician, is there a brief summary of what this means and how it would help? REAPER seems to perform brilliantly as it is, but I'm always up for making it even better.

A lot of this technical stuff goes over my head.
__________________
Bought REAPER V1.5 and still going strong today with V5.
Thanks Justin & Co !
Bassman1 is offline   Reply With Quote
Old 07-26-2011, 01:15 AM   #24
flmason
Human being with feelings
 
Join Date: Nov 2009
Posts: 642
Default

On the subject of DPC's.

I have a Compaq (this one, LOL!) that had horrendous DPC numbers as delivered. It ended up being an enabled RAID array driver... on a laptop with one disk... disabled it, problem solved.

Just thought I'd toss it out there.

I managed to fire up and test Reaper on a Celeron M 1.6 GHz with 512MB RAM. So in and of itself, it's pretty lightweight on resources.
flmason is offline   Reply With Quote
Old 07-26-2011, 08:57 AM   #25
chip mcdonald
Human being with feelings
 
chip mcdonald's Avatar
 
Join Date: May 2006
Location: NA - North Augusta South Carolina
Posts: 4,294
Default

Quote:
Originally Posted by karbomusic View Post
Firewire, I switched to the POD as my ASIO device for a moment as a test.
Ahh, ok.

Quote:
As far as your constant DPCs being high even when Reaper isn't running, I'd suspect my first explanation. Some unrelated driver such as the wireless network adapter or something.
Yeah, unfortunately I've already taken it out. Since - if we were cardiologists - your "chart" appeared to have a similar S1/S2 timing signature to what I was seeing, I was hoping for an easy solution. Oh well.

Quote:
versions of Windows, XPerf is the tool that does the job the RATT tool used to do. Hopefully, I can get a step-by-step on how to do this with XPerf soon. Hope that makes sense.
Yeah, I'm a computer geek in reformation. I used to not be able to resist KNOWING what was going on and knew the whimsy of various performance tools etc., but I'm trying to let the Cloud stay on top of trends. Ever since svchost.exe became the catch-all for All Things Nebulous I've lost my fervor. I still hate to watch things shuffle around CPU in Performance Monitor - with nothing going on - without actually knowing why. But I don't *need* to know why, right? Right? Invisible Nano Gnomes Doing Minuscule Tasks inside my machine, that's all.



/ in an alternate universe I'm a rich Arduino developer
__________________
]]] guitar lessons - www.chipmcdonald.com [[[
WEAR A FRAKKING MASK!!!!
chip mcdonald is offline   Reply With Quote
Old 07-31-2011, 03:04 PM   #26
karbomusic
Human being with feelings
 
karbomusic's Avatar
 
Join Date: May 2009
Posts: 29,260
Default DAW Process Memory Limits (32 Bit)

I've been thinking about the next post in this series, and as I browse the forum I keep seeing posts and threads about memory and memory-related crashes. The one thing about running out of memory where Reaper is concerned is that there may be little or no indication that running out of process memory is the reason it crashed. If a user suspects it's memory, they may see Reaper only using 1.3GB, for example, and conclude it's not memory depletion because in a 32-bit process they should still have 700 MB or so available (2 GB - 1.3 GB). So, I thought I would fire up Reaper, attach some monitoring tools, and purposely take it down via memory exhaustion.

Setup

Windows 7 32Bit
4GB physical Memory
3GB visible to the OS
2GB available to Reaper (3GB switch not used)
No VSTs run in separate processes etc. so everything must fit inside that 2GB

Repro

Open an already large project and begin loading Addictive Drums instances + various kits until we run out of memory crashing the DAW.

Tools Used

VMMap - SysInternals - Used to view all allocations and memory usage.


VMMap is a great tool for digging into the memory a process is using. If you really want to know just what is using which memory, this is the tool to use. VMMap also requires no install; just unzip and run. The first step was to launch my largish project (~32 tracks + VSTs) and grab a memory profile with VMMap. This gives us a snapshot of memory before very much of it actually gets allocated. Let's look at that very first snapshot, and I'll give a quick overview of which sections matter to us, including some additional information later just for interest's sake:





First, the important stuff we care about. In the image above, notice the two free memory sections I'm pointing out in red: "Total Available" and "Largest Available Block". Reaper is loading peaks, VSTs etc., yet we still have almost all of the total memory available to Reaper (1.9 GB). However, the largest amount of memory Reaper can possibly allocate at one time is 1.4 GB. At this point that's not a problem. To get a better visual of this, take a look at the "Address Space Fragmentation" window. The white areas are free memory, and I have highlighted that 1.4 gig free block. If you look closely you will see a very thin speck or two of purple at the top right and left of the black square. That small area of allocated memory is the barrier causing our 1.9GB to be split into two non-contiguous pieces. The black square is our 1.4GB free area, and the white area above it is another 500ish MB free block. The various stripes of color at the top of either window show what memory is being used for what, color coded. The main window gives the amounts and descriptions of how the memory is being used, and the fragmentation window shows exactly how it is "laid out" in virtual memory.

Now that we have an idea of what VMMap does let’s look at a new snapshot after the project has completely loaded:



The light bulb should be starting to come on for everyone now: once Reaper has completed loading all the VSTs and other files it needs, our new snapshot shows the result, and our memory usage and free memory have changed quite a bit. We are at 1.2GB of total available memory for Reaper to use, and the largest single chunk it can allocate is 732MB. What is important to understand at this point is that even though we have 1.2 GB left, if some memory-intensive VST were added that needed more than 732MB of contiguous memory, Reaper is likely going to take a dirt nap because it doesn't have a contiguous block larger than 732MB. Usually when this happens Reaper is likely to just go POOF and show a crash dialog. Keep in mind that only the most intensive VSTs would be a problem right now.
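The distinction between total free memory and the largest contiguous block can be sketched with a toy model of an address space. The numbers below are hypothetical, chosen to mirror the first VMMap snapshot above, and `free_block_stats` is just an illustrative helper, not a real VMMap API:

```python
def free_block_stats(address_space):
    """address_space is a list of (size_mb, is_free) regions in address
    order. Returns (total_free, largest_contiguous_free), merging
    adjacent free regions the way VMMap's summary does."""
    total = largest = run = 0
    for size, is_free in address_space:
        if is_free:
            run += size          # extend the current free run
            total += size
            largest = max(largest, run)
        else:
            run = 0              # an allocation breaks contiguity
    return total, largest

# A 2048 MB address space where a tiny 1 MB allocation splits the
# free memory into a ~500 MB block and a 1400 MB block:
space = [(147, False), (500, True), (1, False), (1400, True)]
total, largest = free_block_stats(space)
print(total, largest)  # 1900 total free, but only 1400 contiguous
```

A single 1 MB allocation in the wrong place is enough to cap the largest possible allocation at 1400 MB even though 1900 MB is "free" - which is exactly why a VST asking for more than the largest block crashes the process despite plenty of apparent headroom.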

In this following snapshot, what I have done is continually add instances of Addictive Drums until I have used pretty much all the available memory that Reaper can possibly use. If you remember my earlier post about "committed" memory, also notice the committed amount at the top right. We're almost at our 2GB limit:




What a difference. We only have 77 MB free, period, and of that 77 MB the largest possible allocation Reaper can handle before going bye-bye is a measly 7.2 MB. If any allocation larger than 7.2 MB occurs from this moment forward without something being freed first, it's game over. In my test the VST added fine, but memory was so low that when I clicked play and/or clicked around various windows and features, we hit the limit and Reaper crashed. Also notice the address space window: that small white area is the 7.2 MB free block. Finally, here is the final snapshot right after I exceeded that 7.2 MB, with Reaper now crashing:



Take a look at our total available free and largest free block now (64k). Reaper is not coming back from this condition, and we don't really even have 64k. Notice the memory address where that supposed 64k is sitting: 0x0000000. This means we're done, and this is what actually caused the crash: trying to allocate something that doesn't exist. I should mention this is post-crash, so these numbers are not completely accurate at this point (hence the imagined 64k free). If we go back to the snapshot just before the crash we can probably find some tiny free blocks of memory, but they are so small nothing can actually use them. All of these would add up to our total of 77 MB, but none would be contiguous. See the video below as I zoom in and scroll, selecting them. Notice the bottom of the window, which shows the address and size of the free sections I'm clicking on. All the small white blocks are free, FYI:

You may want to watch the videos at full screen in order to see everything.

https://www.youtube.com/watch?v=1IWwO7n8gU8

===============
Memory Leaks
===============

If a VST (or whatever) allocates some memory, uses it, finishes with it but doesn't release it, then allocates new memory again, rinse/repeat over and over... that's called a memory leak. What I did above is NOT a memory leak, it's memory depletion. A memory leak would be more like taking a particular action in Reaper over and over, with memory increasing each time that action is taken until Reaper finally crashes. It's important to distinguish between the two, since depletion is normal if you don't have enough memory for the project, while a memory leak is usually a bug because the software isn't properly cleaning up after itself. You may not be able to use VMMap to isolate exactly what is causing the leak, but you can probably deduce it is happening by watching memory continually grow even though you are not doing anything that would increase memory usage, such as loading VSTs and samples.
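You obviously can't instrument Reaper's native plugins this way, but the leak-vs-depletion pattern itself is easy to illustrate with Python's standard-library tracemalloc. `leaky_action` and `well_behaved_action` are made-up stand-ins for a repeated plugin action, not anything from Reaper:

```python
import tracemalloc

_leaked = []  # simulates memory that is allocated but never released

def leaky_action():
    # Each call allocates ~1 MB and forgets to free it -- the
    # signature of a leak: usage grows with every repetition.
    _leaked.append(bytearray(1024 * 1024))

def well_behaved_action():
    # Allocates the same amount but releases it when done.
    scratch = bytearray(1024 * 1024)
    del scratch

tracemalloc.start()
base = tracemalloc.get_traced_memory()[0]
for _ in range(10):
    leaky_action()
grown = tracemalloc.get_traced_memory()[0] - base
print(f"after 10 leaky actions: ~{grown // (1024 * 1024)} MB retained")
```

The tell is the trend, not any single number: repeat the same action and watch whether usage returns to baseline (healthy) or ratchets upward every time (leak).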

=====================
Memory Fragmentation
=====================

Memory fragmentation is a condition where small blocks of memory are allocated and released in such a pattern that we end up with a large total amount of free memory, yet little or none of it is contiguous. In other words, we have lots of free memory, but no single block is big enough to service any reasonable request. This post demonstrates something similar, but fragmentation isn't really the issue here. If the ~77 MB I spoke of above were contiguous we would not have crashed as quickly; the demonstration shows how to determine lack of memory, with the small amount of fragmentation near the end coming into play. If it were a true fragmentation issue, we would be crashing much sooner with a much larger total free, if that makes sense. Some of it is semantics, but I hope you get the idea.
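A tiny first-fit allocator model shows what true fragmentation looks like: plenty of total free space, yet a modest request fails because no single hole is big enough. The sizes and the `first_fit` helper are purely illustrative:

```python
def first_fit(free_list, size):
    """free_list: list of (offset, length) free extents in address
    order. Returns the offset of the first extent that fits, or
    None if no single extent is big enough."""
    for off, length in free_list:
        if length >= size:
            return off
    return None

# After an interleaved allocate/free pattern, 50 units are free,
# but scattered across ten 5-unit holes:
fragmented = [(i * 10, 5) for i in range(10)]
total_free = sum(length for _, length in fragmented)
print(total_free)                # 50 units free in total...
print(first_fit(fragmented, 6))  # ...yet a 6-unit request fails: None
```

That is the fragmentation failure mode in miniature: the sum of free space says "fine", while any request larger than the biggest hole dies, long before total free memory is exhausted.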

===============
Extra Credit
===============

Before I conclude, I'll give a quick video tour of how the memory is used and mapped out. All the orange is heap memory: allocations etc. being used by VSTs and everything that gets loaded while Reaper is running. The purple sections are mapped images, such as all the VST DLLs themselves, Reaper.exe etc. loaded in memory, and the blue sections are what are called mapped files. A mapped file is a file that has literally been loaded and mapped into memory but doesn't execute per se. Reaper's TCP peak files, for example, are mapped files, but audio files are not, because they are streamed in. The peaks, however, are fully loaded, as you will see. You could, for example, calculate exactly how much memory the peak files in your project are using:

https://www.youtube.com/watch?v=Y2ng1KMO95Y
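For the curious, the mapped-file mechanism itself can be demonstrated with Python's standard mmap module. This is a generic sketch of file mapping (the stand-in file and its contents are invented), not Reaper's actual peak-file code:

```python
import mmap
import os
import tempfile

# Create a small stand-in for a peak file and map it into memory.
fd, path = tempfile.mkstemp()
with os.fdopen(fd, "wb") as f:
    f.write(b"\x00\x7f" * 1024)  # 2 KB of fake peak data

with open(path, "rb") as f:
    # Length 0 maps the entire file, read-only.
    with mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ) as m:
        # The file's bytes are now addressable like memory -- no
        # read() calls. This is what VMMap reports as a "Mapped File".
        size = len(m)
        first_pair = m[0:2]

os.remove(path)
print(size, first_pair)  # 2048 b'\x00\x7f'
```

Because the whole file is mapped, its full size counts against the process's address space, which is why a project's peak files all show up in the blue "Mapped File" total.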

===============
Summary
===============

In this post we took a look at how Reaper (or any process) is given 2GB of memory address space to consume in a 32 bit system as well as what happens when Reaper crashes due to memory exhaustion. As you can see, if you have questions as to whether your unexplained crash is due to running out of memory, you can use a tool such as VMMap to confirm it. One thing to remember about VMMap is you need to press F5 to refresh it and get the latest snapshot. So, if you want to troubleshoot memory in this fashion just launch VMMap after Reaper is running, select Reaper in the process list and refresh the VMMap view when you think you are close to the crash. Take a look at the largest available block, if it is small enough you can confirm the issue. If you wait until after it has crashed there is no process to poll and you'll lose the information. You can also save snapshots as you go and open them in VMMap later which is what I did.
__________________
Music is what feelings sound like.

Last edited by karbomusic; 07-31-2011 at 03:42 PM.
karbomusic is offline   Reply With Quote
Old 08-01-2011, 12:28 AM   #27
Snap
Human being with feelings
 
Snap's Avatar
 
Join Date: Jul 2011
Posts: 850
Default

Once again... Awesome info!!! Thanks so much!
Snap is offline   Reply With Quote
Old 02-11-2013, 08:25 PM   #28
Archimedes
Human being with feelings
 
Join Date: Aug 2011
Posts: 365
Default Bravo Zulu

Karbo - your analysis is outstanding - first rate engineering!

Thanks very much for taking the time, and providing excellent detail. The best science is science you can duplicate. Such is the case with your analyses.
Archimedes is offline   Reply With Quote
Old 02-12-2013, 12:17 AM   #29
danfuerth
Human being with feelings
 
Join Date: Mar 2012
Posts: 1,824
Default

One problem: there are no RAM limits that an OS can't see; that was solved back in the Pentium Pro days.

The issue has always been licensing. Microsoft licenses different RAM amounts that are supported on their client desktop OSes.

Even Apple was guilty of this before moving to 64-bit.

There are no RAM limits; the OS can see the extra RAM if you patch (modify) your kernel. The only limits are in applications that stick to 32-bit code instead of extending their addressing to 36 bits using AWE.

This problem was partially solved by Intel back in 1995.

When we had Windows XP 64-bit back in 2003, no one gave two shits about moving to 64-bit, due to the lazy-ass video card companies not having their cards ready for 64-bit and the issue of PAE and graphics cards crashing the OS all the time.

That is the reason why Microsoft disabled the PAE kernel in XP SP1. The PAE kernel was in the original XP.

I truly blame this failure to move to 64-bit back in 2003 on NVIDIA, the kings of crap installs.

They are the reason we took forever to move to 64-bit.

Remember the Vista fiasco? All the intel on that one was that they could not get their OEM video cards to work properly with Vista, which you know from the "Vista Compatible" logo crap.

Again, Microsoft needs to stop catering to these fools. With Windows 8 this has all changed, as Microsoft is doing what Apple has been doing for ages: you make, we test; if it's crap, you do not make for us anymore.

Other than that, EXCELLENT POST!!!
danfuerth is offline   Reply With Quote
Old 11-17-2014, 10:04 PM   #30
yep
Human being with feelings
 
Join Date: Aug 2006
Posts: 2,019
Default

Just saw this thread linked on the main board, and wanted to say what awesome work this is. Karbomusic did a fantastic job laying out a lot of the technical woowoo that can cause problems for audio machines, and I think hit a nice balance between technical data and theory, and practical application.

People don't typically design computer hardware, drivers, software, etc, with the idea that it will be used to capture or process 20 or 100 channels of audio with extreme low-latency processing over firewire or usb. It's not that it can't be done, or even that it is especially hard to do, it's just not usually on their list of primary design-goals. It's a specialty application.

I am reminded of one of the early design-requirements for WWII-era Jeeps, I think something like they had to be able to operate at a steady 4mph for hours on end, to keep pace with marching troops. This was a surprisingly rare and nontrivial requirement, because most car and truck engines would apparently overheat and burn out at such low speeds. Of course, every car ever made has to start from a stop, and return to one, so getting it to 4mph is easy. But cars are not typically designed to run at such low speeds for hours at a time, especially if they have engines powerful enough for towing, off-road hill-climbing, and evasive maneuvers.

So it is with DAWs: we are expecting a level of constant, fast, reliable, low-level data throughput that is completely outside the normal boundaries of office work, web development, home media consumption, or even high-performance gaming. Somebody typing an email who has her cursor freeze for a split second doesn't care. But if it happens during her guitar solo, the take is ruined. Even 60 or 144 FPS gaming is orders of magnitude less demanding than a DAW in terms of tolerance for things like data dropouts.
yep is offline   Reply With Quote
Old 11-17-2014, 10:29 PM   #31
JHughes
Banned
 
Join Date: Aug 2007
Location: Too close to Charlotte, NC
Posts: 3,554
Default

Thanks for the bump, should be stickied somewhere, or in Karbo's sig anyway.
JHughes is offline   Reply With Quote
Old 11-18-2014, 03:51 AM   #32
Sambo Rouge
Human being with feelings
 
Sambo Rouge's Avatar
 
Join Date: Sep 2010
Location: Hertfordshire, England
Posts: 1,965
Default

Yes, MAKE THIS A STICKY! (please)
Sambo Rouge is offline   Reply With Quote
Old 11-18-2014, 05:08 AM   #33
technogremlin
Human being with feelings
 
technogremlin's Avatar
 
Join Date: Mar 2008
Location: Netherlands
Posts: 2,629
Default

Quote:
Originally Posted by Sambo Rouge View Post
Yes, MAKE THIS A STICKY! (please)
+1
technogremlin is offline   Reply With Quote
Old 01-17-2018, 01:00 PM   #34
JHughes
Banned
 
Join Date: Aug 2007
Location: Too close to Charlotte, NC
Posts: 3,554
Default

A friendly bump.
JHughes is offline   Reply With Quote
Old 01-17-2018, 03:44 PM   #35
Philbo King
Human being with feelings
 
Philbo King's Avatar
 
Join Date: May 2017
Posts: 3,202
Default

Karbo - Some amazing forensic work! Nicely done.

+1 for making this a sticky.
Philbo King is online now   Reply With Quote
Old 01-17-2018, 03:53 PM   #36
karbomusic
Human being with feelings
 
karbomusic's Avatar
 
Join Date: May 2009
Posts: 29,260
Default

Thanks Philbo, I thought about putting in my sig but IIRC some of it is 32 bit related which is sort of out of date now. I'll go back and review above when I can (including looking to see if I can find the lost images) and see if there is enough relevant info for 2018.
__________________
Music is what feelings sound like.
karbomusic is offline   Reply With Quote
Old 01-17-2018, 08:02 PM   #37
cassembler
Human being with feelings
 
cassembler's Avatar
 
Join Date: Aug 2017
Location: Dallas, TX
Posts: 348
Default

Good thread, nicely done. A high-quality contribution

I suspect SSDs (SATA/PCI-E) render adjustment of page file size much less 'effective,' though I'd be curious to see the overall effect they have on the identified bottlenecks.

Given the proliferation of 'greedier' plugs (COUGH *acustica* COUGH), I'd venture CPU is the king bottleneck these days...
__________________
It helps if the hitter thinks you're a little crazy
- Nolan Ryan
cassembler is offline   Reply With Quote
Old 01-17-2018, 08:09 PM   #38
karbomusic
Human being with feelings
 
karbomusic's Avatar
 
Join Date: May 2009
Posts: 29,260
Default

Quote:
Originally Posted by cassembler View Post

Given the proliferation of 'greedier' plugs (COUGH *acustica* COUGH), I'd venture CPU is the king bottleneck these days...
Possibly more about the audio driver thread needing to be on a single core (meaning more cores don't make that component faster) and interrupts from other hardware while using lower buffer settings. You are probably already familiar with this video which touches on the general idea...

https://www.youtube.com/watch?v=GUsLLEkswzE

However, you make a great point since disk contention is nearly a thing of the past as far as DAWs are concerned or at least light years better than my days of ATA 133. Yay SSD.
__________________
Music is what feelings sound like.
karbomusic is offline   Reply With Quote
Old 01-17-2018, 08:27 PM   #39
Cableaddict
Human being with feelings
 
Join Date: Apr 2008
Posts: 1,910
Default

Karbo, there's some great stuff here (you da' MAN!!!)
- but I think this thread desperately needs a heavily-condensed summary section, so readers can go back for the dizzying, in-depth explanations afterwards.
---------------------------

Also, I think a lot of what's above doesn't really matter any longer with 64-bit systems and 16-32 GB of memory, and SSD boot drives vastly negate the need to worry about page file settings. (As you know, of course.)

It's not your fault that folks are bumping such an old thread, but maybe you could make a stripped-down version that only deals with modern 64-bit systems?

Also then we can add in some of the things I've recently been posting about (& we've been discussing.)
- CPU priority, turbo Boost, etc) .....

I'd also like to have a discussion about core allotment (Reaper itself, vs individual VSTi's, vs or in conjunction with other concurrently-running software such as Traktor or a ReWired application).

And a separate, in-depth discussion / analysis of Reaper's fairly arcane & mysterious audio & buffering settings sure wouldn't hurt. I mean a really deep technical explanation of what each setting does.


And finally, a careful and clear distinction needs to be drawn between the hunt for better low-latency performance vs the hunt for better overall computational power. (Two very different things, obviously.) Even your tweaks for more track count might negatively affect other areas of system performance, depending on the user's particular needs.

But regardless, if anyone can compile a serious, no bullshit Windows & DAW optimization program, it's probably you. Given all the bad information that can be found all over the internet, this would be a very welcome thing.

Last edited by Cableaddict; 01-17-2018 at 08:59 PM.
Cableaddict is offline   Reply With Quote
Old 01-17-2018, 08:40 PM   #40
hopi
Human being with feelings
 
hopi's Avatar
 
Join Date: Oct 2008
Location: Right Hear
Posts: 15,618
Default

thanks karbo... really well done and explained...

I do have a question about what you show in the advanced pref's for buffering

OK so you enlarged the read buffer size... to something more and in multiples of 1024... so the default is 256 x 1024

OK let's say we double that ... 512 x 1024

then what about the number of read buffers... the box just above that setting.... default is 3
...and IF I'm understanding how your changes to the buffer size improved a certain aspect of performance... I would imagine that more buffers would also help... no?
__________________
...should be fixed for the next build... http://tinyurl.com/cr7o7yl
https://soundcloud.com/hopikiva
hopi is offline   Reply With Quote