Should I set ASP.NET application pool to auto-recycle?


I have a number of ASP.NET (4.0) web applications that appear to leak a small amount of memory during each request. It is such a small amount that, for most use cases, it will not grow into a problem for weeks or even months at a time. I generally try to be careful about closing any connections the application manages, avoiding state variables (or instance variables on my singletons), and so on.
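For context, this is the sort of cleanup I mean; a minimal sketch, with SqlConnection standing in for whatever connection type the apps actually use, and "Main" as a placeholder connection-string name:

    using System.Configuration;
    using System.Data.SqlClient;

    class CustomerRepository
    {
        public int CountCustomers()
        {
            // The using blocks guarantee Dispose/Close even if an exception
            // is thrown, so no connection or command outlives the request.
            string connStr = ConfigurationManager
                .ConnectionStrings["Main"].ConnectionString; // "Main" is a placeholder
            using (var conn = new SqlConnection(connStr))
            using (var cmd = new SqlCommand("SELECT COUNT(*) FROM Customers", conn))
            {
                conn.Open();
                return (int)cmd.ExecuteScalar();
            }
        }
    }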

My question is this: is this normal behavior for ASP.NET applications? I have turned off the default (IIS 7) behavior of recycling the app pool after 20 minutes of idle time. I did this because the application takes a few minutes to build its internal cache, and I want to avoid hurting the user experience (and making users wait for the application to start when they issue a request).
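For reference, a minimal sketch of how those recycling settings can be adjusted programmatically with the Microsoft.Web.Administration API; the pool name "MyAppPool" is a placeholder, and the same values can be set in IIS Manager or applicationHost.config instead:

    using System;
    using Microsoft.Web.Administration; // reference Microsoft.Web.Administration.dll

    class PoolConfig
    {
        static void Main()
        {
            // Requires administrative rights on the IIS machine.
            using (var serverManager = new ServerManager())
            {
                // "MyAppPool" is a placeholder; substitute your pool's name.
                var pool = serverManager.ApplicationPools["MyAppPool"];

                // TimeSpan.Zero disables the 20-minute idle shutdown.
                pool.ProcessModel.IdleTimeout = TimeSpan.Zero;

                // TimeSpan.Zero disables the scheduled periodic recycle
                // (the IIS default is 1740 minutes).
                pool.Recycling.PeriodicRestart.Time = TimeSpan.Zero;

                serverManager.CommitChanges();
            }
        }
    }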

I am aware this could be mitigated by serializing the cache or by speeding up the cache generation process, but my question is more about the principle of it: I personally consider relying on the IIS auto-recycle feature a band-aid approach. Am I wrong? Am I simply not seeing the garbage collector at work because the application's memory usage is not high enough relative to the available memory? Or should I dig deeper into the memory issues?
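One way to tell a genuine leak apart from a garbage collector that simply hasn't run yet is to force a full collection after each request and watch the trend. A minimal sketch (the module name and logging target are my own choices, and forcing a GC per request is for diagnostics only, not production):

    using System;
    using System.Diagnostics;
    using System.Web;

    // Diagnostic-only: forcing a full GC on every request is expensive.
    public class MemoryProbeModule : IHttpModule
    {
        public void Init(HttpApplication context)
        {
            context.EndRequest += delegate
            {
                // Force a full, blocking collection so we measure live objects,
                // not garbage the GC simply hasn't gotten around to yet.
                long liveBytes = GC.GetTotalMemory(true);

                // If this number climbs steadily over thousands of requests it
                // points to a real leak; if it plateaus, the GC was just idle.
                Trace.WriteLine(string.Format(
                    "Live managed heap after request: {0:N0} bytes", liveBytes));
            };
        }

        public void Dispose() { }
    }

The module would then be registered under <modules> in web.config so it runs for every request.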

Any insight would be appreciated.


1 Answer

Oded (accepted answer)

Unfortunately it is normal, though that is mostly due to shoddy application code rather than anything else.

IIS by default configures newly created application pools to recycle every 1740 minutes (29 hours) for this reason.

As you said, this is a band-aid. A well-written application that cleans up all of its resources (including dangling event handlers) shouldn't leak at all.
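To illustrate the dangling-event-handler point: a per-request object that subscribes to an event on a long-lived (e.g. static) publisher and never unsubscribes stays reachable through the publisher's delegate list, so the GC can never reclaim it. A contrived sketch (CacheManager and PageWorker are made-up names):

    using System;

    // Long-lived publisher: lives for the life of the app domain.
    static class CacheManager
    {
        public static event EventHandler CacheUpdated;

        public static void RaiseCacheUpdated()
        {
            var handler = CacheUpdated;
            if (handler != null) handler(null, EventArgs.Empty);
        }
    }

    // Created once per request.
    class PageWorker : IDisposable
    {
        private readonly byte[] _requestState = new byte[64 * 1024];

        public PageWorker()
        {
            // LEAK: the static event now holds a reference to this instance,
            // so it (and _requestState) survives every garbage collection.
            CacheManager.CacheUpdated += OnCacheUpdated;
        }

        private void OnCacheUpdated(object sender, EventArgs e) { /* ... */ }

        public void Dispose()
        {
            // FIX: unsubscribing removes the root; without this line,
            // each request permanently pins ~64 KB.
            CacheManager.CacheUpdated -= OnCacheUpdated;
        }
    }

A leak of this shape matches the symptom described in the question: a small, steady growth per request that only becomes visible after weeks of uptime.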

See this blog post about the subject.