C# implicitly typed arrays behind the scenes


Can someone explain how the compiler proceeds here to find the "common" type (double)?
I assume IConvertible plays a role here?

private static void Main(string[] args)
{
    var nrColl = new[] { 1, 2, 0.14, 'a',};
}

There are 2 answers

Sweeper (best answer)

See Finding the best common type of a set of expressions in the specification.

  • A new unfixed type variable X is introduced.
  • For each expression Ei an output type inference is performed from it to X.
  • X is fixed, if possible, and the resulting type is the best common type.
  • Otherwise inference fails.

Intuitively this inference is equivalent to calling a method void M<X>(X x₁ … X xₘ) with the Eᵢ as arguments and inferring X.
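The spec's intuition can be reproduced directly with a generic method (a sketch; M is a made-up name mirroring the spec's hypothetical method):

```csharp
using System;

public static class Program
{
    // Mirrors the spec's hypothetical void M<X>(X x1, …, X xm).
    static X M<X>(X x1, X x2, X x3, X x4) => x1;

    public static void Main()
    {
        // Inference for X follows the same rules as the array
        // initialiser: X is fixed to double.
        var result = M(1, 2, 0.14, 'a');
        Console.WriteLine(result.GetType()); // System.Double
    }
}
```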

If you follow the steps of how an output type inference is done, you will eventually see that in this case the type variable X will have the lower bounds int, double, and char.

Then, to fix X,

For each lower bound U of Xᵢ all types Uⱼ to which there is not an implicit conversion from U are removed from the candidate set.

If among the remaining candidate types Uⱼ there is a unique type V to which there is an implicit conversion from all the other candidate types, then Xᵢ is fixed to V.

We can try to do that manually.

Let's first examine int. int can be implicitly converted to double, so double is not removed. int cannot be implicitly converted to char, so char is removed. Next we examine double. double cannot be implicitly converted to int, so int is removed. Finally, char converts implicitly to double, so nothing else is removed. We are now left with only double, and so X is fixed to double.

Basically, if among the types of the expressions in the array initialiser there is exactly one type T to which all the other expressions can be implicitly converted, then T is the element type of the array.
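You can confirm the fixed type at runtime (a quick sketch using the array from the question):

```csharp
using System;

public static class Program
{
    public static void Main()
    {
        // The compiler fixes the element type to double, since
        // int and char both convert implicitly to double.
        var nrColl = new[] { 1, 2, 0.14, 'a' };
        Console.WriteLine(nrColl.GetType()); // System.Double[]
        Console.WriteLine(nrColl[3]);        // 97 — 'a' converted to double
    }
}
```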

Lukas

This happens because every value you give it here is implicitly convertible to double. If you change the char ('a') to a string ("a"), you'll notice that it no longer compiles, because there is no implicit conversion from string to double. A char in .NET is essentially a 16-bit unsigned integer (like UInt16) with some additional methods and different overrides (ToString, for example), and the language defines an implicit conversion from char to double.
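The char → double conversion can be seen in isolation (a minimal sketch):

```csharp
using System;

public static class Program
{
    public static void Main()
    {
        char c = 'a';
        double d = c;          // implicit conversion, no cast needed
        Console.WriteLine(d);  // 97 — the UTF-16 code unit of 'a'
        // double e = "a";     // would not compile: no implicit string → double
    }
}
```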

Here is an example:

public static void Main()
{
    var bar = new[] { 1, 2, 0.14, 'a', new Foo(42)};
}

public class Foo
{
    public Foo(int value)
    {
        _value = value;
    }

    private int _value;

    public static implicit operator double(Foo foo)
    {
        return foo._value;
    }
}

Interestingly, this also works when there is no direct implicit operator to double, as long as the user-defined conversion can be followed by a built-in conversion: below, Foo converts implicitly to int, and int converts implicitly to double via a standard numeric conversion.

public static void Main()
{
    var bar = new[] { 1, 2, 0.14, 'a', new Foo(42)};
}

public class Foo
{
    public Foo(int value)
    {
        _value = value;
    }

    private int _value;

    public static implicit operator int(Foo foo)
    {
        return foo._value;
    }
}
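A self-contained check of that example (same Foo as above, with the outputs one would expect):

```csharp
using System;

public class Foo
{
    private readonly int _value;
    public Foo(int value) => _value = value;

    // Foo → int is user-defined; int → double is a built-in
    // standard conversion, so Foo → double is still implicit.
    public static implicit operator int(Foo foo) => foo._value;
}

public static class Program
{
    public static void Main()
    {
        var bar = new[] { 1, 2, 0.14, 'a', new Foo(42) };
        Console.WriteLine(bar.GetType()); // System.Double[]
        Console.WriteLine(bar[4]);        // 42
    }
}
```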

But this 'conversion chaining' only works when the remaining steps are built-in (standard) conversions: C# applies at most one user-defined conversion per implicit conversion, so a chain of two user-defined operators (Foo → Bar → double below) is never considered.

This doesn't work:

public static void Main()
{
    var bar = new[] { 1, 2, 0.14, 'a', new Foo(42)};
}

public class Bar
{
    public Bar(int value)
    {
        Value = value;
    }

    public int Value { get; set; }
    public static implicit operator double(Bar bar) => bar.Value;
}

public class Foo
{
    public Foo(int value)
    {
        _value = value;
    }

    private int _value;

    public static implicit operator Bar(Foo foo)
    {
        return new Bar(foo._value);
    }
}
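One way to make that last case compile (a sketch reusing the Bar and Foo classes above): apply the user-defined Foo → Bar conversion explicitly, so array type inference only needs the single user-defined Bar → double conversion:

```csharp
using System;

public class Bar
{
    public int Value { get; }
    public Bar(int value) => Value = value;
    public static implicit operator double(Bar bar) => bar.Value;
}

public class Foo
{
    private readonly int _value;
    public Foo(int value) => _value = value;
    public static implicit operator Bar(Foo foo) => new Bar(foo._value);
}

public static class Program
{
    public static void Main()
    {
        // The explicit (Bar) cast performs the Foo → Bar conversion up
        // front; inference then only needs Bar → double, which is a
        // single user-defined conversion and therefore allowed.
        var bar = new[] { 1, 2, 0.14, 'a', (Bar)new Foo(42) };
        Console.WriteLine(bar.GetType()); // System.Double[]
        Console.WriteLine(bar[4]);        // 42
    }
}
```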