Difference in memory leaks using _crtBreakAlloc compared to UMDH


While trying to diagnose memory leaks with the {,,ucrtbased.dll}_crtBreakAlloc method, I found a memory leak that depends on timing in a multithreaded environment. Due to the multithreading, it is no longer predictable which allocation number the leak in question will get. Using _crtBreakAlloc in Visual Studio, I get a total of fewer than 20 memory leaks reported.
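For context, this is roughly how the break-on-allocation-number mechanism can be driven from code instead of through the {,,ucrtbased.dll}_crtBreakAlloc watch expression (a minimal sketch for a debug build; the allocation number 147 is a placeholder and, as described above, changes from run to run because of the multithreading):

    // Minimal sketch, assuming a debug build linked against the debug CRT (ucrtbased.dll).
    #define _CRTDBG_MAP_ALLOC
    #include <stdlib.h>
    #include <crtdbg.h>

    int main()
    {
        // Ask the debug CRT to dump all unfreed CRT allocations at process exit
        // (this is the report Visual Studio shows in the Output window).
        _CrtSetDbgFlag(_CrtSetDbgFlag(_CRTDBG_REPORT_FLAG) | _CRTDBG_LEAK_CHECK_DF);

        // Equivalent to setting {,,ucrtbased.dll}_crtBreakAlloc in the watch window:
        // break into the debugger when allocation number 147 is made.
        _CrtSetBreakAlloc(147);

        void* leaked = malloc(64); // intentionally never freed, for illustration
        (void)leaked;
        return 0;
    }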

Because these multithreading issues prevent me from drawing conclusions across different runs in Visual Studio, I decided to take a different approach using UMDH. I enabled the GFlags User Mode Stack Trace Database (UST), took one snapshot at the initial breakpoint in WinDbg, and took another snapshot just before process exit.
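For reference, the workflow looked roughly like this (MyApp.exe, the PID 1234, and the file names are placeholders; symbols must be resolvable to get readable call stacks):

    :: Enable the user-mode stack trace database for the target image (once, elevated).
    gflags /i MyApp.exe +ust

    :: Make sure UMDH can resolve symbols for readable call stacks.
    set _NT_SYMBOL_PATH=srv*C:\symbols*https://msdl.microsoft.com/download/symbols

    :: First snapshot, taken while the process sits at the initial breakpoint.
    umdh -p:1234 -f:snap1.log

    :: Second snapshot, taken just before process exit.
    umdh -p:1234 -f:snap2.log

    :: Compare the snapshots; the diff lists allocations present in the second
    :: snapshot but not in the first, i.e. the leak candidates.
    umdh snap1.log snap2.log -f:diff.log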

UMDH reports more than 1000 memory leaks, many of which have .NET frames on the call stack (clr!something+X), but there are also others (I call them "native").

I want to figure out which filter I could use in UmdhGui to reduce the results to roughly the same set of leaks that Visual Studio reports.

What kind of decision making do Visual Studio and _crtBreakAlloc apply so that they report fewer than 20 leaks?

Answer (user3967979):

_crtBreakAlloc and the entire _crtXXX family only track allocations made through the C runtime. UMDH, on the other hand, collects allocations made through the Win32 HeapAlloc. Modern C runtimes end up calling HeapAlloc internally, so a UMDH dump shows both: all C-runtime allocations and allocations made by direct calls to HeapAlloc.
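To illustrate the distinction with a minimal sketch (the buffer sizes are arbitrary): both of the allocations below end up in a UMDH diff, because the CRT allocates from a Win32 heap underneath, but only the first one gets a CRT allocation number that _crtBreakAlloc can target.

    #include <windows.h>
    #include <stdlib.h>

    int main()
    {
        // Goes through the C runtime: tracked by the CRT debug heap (gets an
        // allocation number) and, via the underlying HeapAlloc, also by UMDH.
        void* crtBlock = malloc(128);

        // Direct Win32 heap allocation: invisible to _crtBreakAlloc, but
        // recorded by UMDH as a "native" allocation.
        void* rawBlock = HeapAlloc(GetProcessHeap(), 0, 256);

        (void)crtBlock;
        (void)rawBlock;
        return 0; // both blocks are intentionally leaked, for illustration
    }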

As to your practical question, here is what I can suggest:

  1. You can filter out allocations made by the C runtime by module name: ucrtbase.dll (or ucrtbased.dll for debug builds).
  2. It will be much simpler if you manage to narrow the scope of the UMDH diff. Usually you know which activity in your application causes the leak; take the first snapshot right before initiating the activity and the second snapshot when the activity is over.