While trying to diagnose memory leaks with the `{,,ucrtbased.dll}_crtBreakAlloc` method, I found a memory leak that depends on timing in a multithreaded environment. Due to the multithreading, it is no longer predictable which allocation number the leak in question will get. Using `_crtBreakAlloc` in Visual Studio, I get a total of fewer than 20 memory leaks reported.
Because I can't correlate allocation numbers across different runs in Visual Studio with these multithreading issues, I decided to take a different approach using UMDH. I enabled the GFlags User Mode Stack Trace Database (UST), took a snapshot at the initial breakpoint in WinDbg, and took another snapshot just before process exit.
UMDH reports more than 1000 memory leaks, many of which have .NET frames in the call stack (`clr!something+X`), but also others (I call them "native").
I want to figure out which filter I could use in UmdhGui in order to reduce the results to roughly the same set of leaks.
What kind of decision-making does Visual Studio or `_crtBreakAlloc` apply so that it reports fewer than 20 leaks?
`_crtBreakAlloc` and the entire `_crtXXX` family track allocations made through the C runtime. UMDH, on the other hand, collects allocations made through the Win32 `HeapAlloc`. Modern C runtimes end up calling `HeapAlloc`, so in a UMDH dump you'll see both: all C-runtime allocations and allocations made by direct calls to `HeapAlloc`.

As to your practical question, here is what I can suggest:
- `ucrtbase.dll` (or `ucrtbased.dll` for debug builds)