Why does /optimize in a C# project generate more Code Analysis warnings than when it is disabled?

We have a project where we use StyleCop and Code Analysis to verify the structure of our code, and we have set "treat warnings as errors" for both mechanisms.

However, we discovered some strange behaviour that we cannot explain. We have a Debug and a Release configuration. With the Debug configuration we did not get a particular CA warning, while we did get it with the Release configuration. We looked at the differences between the two configurations and found that the Optimize checkbox was the reason the warning appears in Release but not in Debug.

We have the configuration below.

<PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'Debug|AnyCPU' ">
    <DebugSymbols>true</DebugSymbols>
    <DebugType>full</DebugType>
    <Optimize>false</Optimize>
    <OutputPath>bin\Debug\</OutputPath>
    <DefineConstants>DEBUG;TRACE</DefineConstants>
    <ErrorReport>prompt</ErrorReport>
    <WarningLevel>4</WarningLevel>
</PropertyGroup>
<PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'Release|AnyCPU' ">
    <DebugType>pdbonly</DebugType>
    <Optimize>true</Optimize>
    <OutputPath>bin\Release\</OutputPath>
    <DefineConstants>TRACE</DefineConstants>
    <ErrorReport>prompt</ErrorReport>
    <WarningLevel>4</WarningLevel>
</PropertyGroup>

When we set the Optimize value to true for Debug, we also get the CA warning; when we set it back to false, the warning is gone. This only happens for warning CA1806. Other CA warnings are correctly shown regardless of the Optimize value.

The code triggering this warning is shown below. This is just test code, but it simulates the real situation we had: a variable that is assigned a default value but is never used anywhere.

public CourseService(IOrtContext context)
{
    this.context = context;

    // Assigned a default value, but never read afterwards.
    var defaultDate = new DateTime(1900, 1, 1);
}

So, does anybody know why CA1806 is reported or not depending on whether Optimize is enabled?

There are 2 answers

Frédéric Hamidi (Best answer)

I believe that's because the optimizer completely elides the assignment to defaultDate.

In debug mode, a new DateTime is instantiated and assigned to the defaultDate local variable:

var defaultDate = new DateTime(1900, 1, 1);

The assignment is considered a "use" of the DateTime instance, so CA1806 is not raised, even though defaultDate is never used afterwards.

On the other hand, in release mode the optimizer elides the assignment (and the local variable):

/* var defaultDate = */ new DateTime(1900, 1, 1);

Therefore the DateTime instance is no longer considered "used", and CA1806 is raised.
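
If the instantiation really is dead code, the cleanest fix is simply to delete the line. If it must stay for some reason, CA1806 can be suppressed explicitly. A minimal sketch using the FxCop-style SuppressMessage attribute (the class layout and Justification text here are illustrative, not the original poster's code):

using System;
using System.Diagnostics.CodeAnalysis;

public class CourseService
{
    private readonly IOrtContext context;

    // Explicit suppression for the case where the dead assignment must stay.
    [SuppressMessage("Microsoft.Usage", "CA1806:DoNotIgnoreMethodResults",
        Justification = "Illustrative only; deleting the line is preferable.")]
    public CourseService(IOrtContext context)
    {
        this.context = context;
        var defaultDate = new DateTime(1900, 1, 1);
    }
}

Note that SuppressMessage attributes are only honored by Code Analysis when the CODE_ANALYSIS compilation symbol is defined.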

xanatos

Throwing rocks at random here... Simpler code, for debug: http://goo.gl/8TXmE9 and for release: http://goo.gl/XRBfQp.

In debug mode, the lifetime of local variables is "extended" to the end of the method so that they are easier to debug. In release mode this doesn't happen: variables are aggressively deallocated. You can see this if you compile a program in debug mode versus release mode and then try to debug it... in a release build some variables sometimes can't be accessed, because their lifetime has already ended.

This can be seen in the release IL code:

IL_0007: newobj instance void [mscorlib]System.DateTime::.ctor(int32, int32, int32)
IL_000c: pop

The result of new DateTime() is immediately popped off the stack, and CA1806 can easily detect that pattern.

In debug mode, the value is never discarded with a pop; it is kept in the local instead. An analysis to check whether that local is ever used afterwards would be quite complex, and probably isn't done.
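
For comparison, here is a minimal self-contained repro (my own sketch, not the original project) that can be compiled once with csc /optimize- and once with csc /optimize+ and then disassembled (for example with ildasm) to observe exactly this difference:

using System;

internal static class Program
{
    private static void Main()
    {
        // Assigned but never read afterwards. In a debug (/optimize-) build
        // the value is kept in the local, so there is no pop; in an
        // optimized (/optimize+) build the result is discarded with a pop,
        // which is the pattern CA1806 flags.
        var defaultDate = new DateTime(1900, 1, 1);

        Console.WriteLine("done");
    }
}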