I have working IL code:
.method public hidebysig static void Main(string[] args) cil managed
{
    .entrypoint
    .maxstack 1
    ldc.i4.s 10
    ldc.i4.s 5
    ldc.i4.s 15
    ldc.i4.s 5
    add
    call void [mscorlib]System.Console::WriteLine(int32)
    ret
}
I set .maxstack to 1 and pushed 4 values onto the evaluation stack. Why does it work?
The CLI spec is fairly opaque about the intended usage of the .maxstack directive. There are hints that the authors foresaw the need for it but didn't nail down exactly what the rules should be. We can glean some insight into how it is actually used from the jitter source included with the SSCLI20 distribution, in clr/src/fjit/fjit.cpp. Note the elbow room it leaves to avoid having to repeatedly re-allocate the operand stack data structure: allocating size + 4 slots is enough to explain your observation. That's not the only reason it works: it also matters which method was jitted before. If that method had a large .maxstack, then nothing goes wrong either, because the already-allocated buffer is big enough.
then nothing goes wrong either.Fjit.cpp is just a sample implementation, so no guarantee that this works the same way in the jitter you use. But otherwise enough to provide insight, the directive is there to help the jitter avoid having to solve the chicken-and-egg problem, having to allocate the data structure before jitting the code. It is an optimization.
It can bomb, though, so there's of course no point in intentionally lying about it.