Why can enum arrays be cast to two different IEnumerable<T> types?


I seem to have stumbled upon something unusual in C# that I don't fully understand. Say I have the following enum defined:

public enum Foo : short 
{
    // The values aren't really important
    A,
    B
}

If I retrieve an array of Foo through Enum.GetValues (or create one myself and hold it in a variable typed as Array or object), I'm able to successfully cast it to both IEnumerable<short> and IEnumerable<ushort>. For example:

// Enum.GetValues returns a System.Array, so all three casts compile.
Array values = Enum.GetValues( typeof( Foo ) );
// This cast succeeds as expected.
var asShorts = (IEnumerable<short>) values;
// This cast also succeeds, which wasn't expected.
var asUShorts = (IEnumerable<ushort>) values;
// This cast fails at run time with an InvalidCastException, as expected:
var asInts = (IEnumerable<int>) values;

This happens for other underlying types as well. Arrays of enums always seem to be castable to both the signed and unsigned versions of the underlying integral type.
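
For instance, here's the same experiment with a byte-backed enum (Baz is just a throwaway enum I made up to test this), and it behaves the same way:

public enum Baz : byte { X, Y }

// Enum.GetValues returns a System.Array, so all three casts compile.
Array bazValues = Enum.GetValues( typeof( Baz ) );
var asBytes  = (IEnumerable<byte>)  bazValues;  // succeeds
var asSBytes = (IEnumerable<sbyte>) bazValues;  // also succeeds
var asInt16s = (IEnumerable<short>) bazValues;  // throws InvalidCastException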

I'm at a loss to explain this behavior. Is it well-defined, or have I just stumbled onto a peculiar quirk of the language or the CLR?

1 Answer

Answer by Jon Skeet (accepted):

It's not just IEnumerable<T> - you can cast it to the array type as well, so long as you fool the compiler first:

public enum Foo : short 
{
    A, B
}

class Test
{
    static void Main()
    {
        Foo[] foo = new Foo[10];
        // The detour through object hides the array's type from the compiler;
        // the CLR then happily performs both casts at execution time.
        short[] shorts = (short[]) (object) foo;
        ushort[] ushorts = (ushort[]) (object) foo;
    }
}

The C# language doesn't guarantee that this conversion will work, but the CLR is happy to perform it. The reverse is true too - you could cast from a short[] to a Foo[]. Oh, and if you had another enum with the same underlying type, you could cast to that, too. Basically, the CLR knows that all of these types are just 16-bit integers - all bit patterns are valid values, even though they'll have different meanings - so it lets you treat one type as another. I don't think that's documented in the CLI specification - I couldn't find any reference to it, anyway.
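
To make that concrete, here's a rough sketch of those other directions (Bar is just a second short-backed enum invented for the example):

public enum Bar : short { X, Y }

class Test2
{
    static void Main()
    {
        short[] shorts = new short[10];

        // short[] -> Foo[]: the compiler would reject a direct cast,
        // but via object the CLR performs it without complaint.
        Foo[] foos = (Foo[]) (object) shorts;

        // Foo[] -> Bar[]: also fine, because to the CLR both are
        // just arrays of 16-bit integers.
        Bar[] bars = (Bar[]) (object) foos;
    }
}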

Just for the record, this can cause some interesting problems when the optimizations in LINQ's Cast<>() and ToList() are combined.

For example:

int[] foo = new int[10];
var list = foo.Cast<uint>().ToList();

You might expect this to throw an InvalidCastException (as it does if you start with a List<int> containing any values, for example), but instead you end up with:

Unhandled Exception: System.ArrayTypeMismatchException: Source array type cannot be assigned to destination array type.
   at System.Array.Copy(Array sourceArray, Int32 sourceIndex, Array destinationArray, Int32 destinationIndex, Int32 length, Boolean reliable)
   at System.SZArrayHelper.CopyTo[T](T[] array, Int32 index)
   at System.Collections.Generic.List`1..ctor(IEnumerable`1 collection)
   at System.Linq.Enumerable.ToList[TSource](IEnumerable`1 source)
   at Test.Main()

That's because the Cast<uint>() operation has checked whether the source is already an IEnumerable<uint> - and as far as the CLR is concerned, the int[] is - so it passes the array straight through to ToList(), which then tries to use Array.Copy, which does notice the difference. Ouch!
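
To see the pass-through for yourself, here's a rough sketch (the exact short-circuit inside Cast<>() is an implementation detail, so treat the ReferenceEquals result as illustrative rather than guaranteed):

using System;
using System.Collections.Generic;
using System.Linq;

class CastDemo
{
    static void Main()
    {
        int[] foo = new int[10];

        // As far as the CLR is concerned, the int[] already implements IEnumerable<uint>...
        Console.WriteLine((object) foo is IEnumerable<uint>);    // True

        // ...so Cast<uint>() can hand the original array straight back.
        IEnumerable<uint> cast = foo.Cast<uint>();
        Console.WriteLine(object.ReferenceEquals(cast, foo));    // True with the current optimization

        // Array.Copy is stricter about element types and refuses the copy,
        // which is what blows up inside List<T>'s constructor.
        uint[] target = new uint[10];
        Array.Copy(foo, target, foo.Length);                      // throws ArrayTypeMismatchException
    }
}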