In an earlier question I asked whether there is some way in C# to implement matrix dimensionality as a type-level property. That is, I wanted to be able to declare a generic family of classes of the form Matrix<M, N>, where M and N are natural numbers. The ultimate goal was to have dimensionality checks at compile time, so that, for example, trying to multiply a 3x3 matrix by a 2x3 matrix yields a compile-time error, and so that one can write functions that accept only square matrices (Matrix<M, M>).
Unfortunately, this does not seem to be possible in C#: the problem is that M and N in the generic type Matrix<M, N> have to be types, not integers. Some other languages, however, have mechanisms that get around this restriction:
1. Some functional languages, such as Haskell, make it possible to define type-level naturals. If this were possible in C#, then it should also be possible to define Matrix<M, N> with M and N being type-level naturals. I don't know much about F#, but this answer seems to indicate that this is possible in F# as well.
2. In C++ one can use templates with non-type template parameters. If I understand correctly, classes for the specific sizes that are actually used in the code are then generated at compile time.
3. Haskell has something called indexed type families, which seem to capture exactly the concept I am after.
Now, since C# and F# are both .NET languages, I am primarily interested in point 1 above. Would it perhaps be possible to implement the types that I need in F# and then provide them to my application as a library? Ideally, I would define only the type-level naturals in F# and then use them in C# to define Matrix<M, N>.
I would do this using an interface with a static member. In F#, you can define type-level natural numbers like this:
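A minimal sketch, assuming .NET 7 or later so that interfaces may declare static abstract members (F# 7 / C# 11); the namespace TypeNats and the names INat, N1, N2, N3 are illustrative:

```fsharp
namespace TypeNats

/// Interface whose static abstract member carries a natural number
/// at the type level (requires F# 7 / .NET 7 or later).
[<Interface>]
type INat =
    static abstract Value: int

/// A few concrete naturals; more can be added in the same way.
type N1() =
    interface INat with
        static member Value = 1

type N2() =
    interface INat with
        static member Value = 2

type N3() =
    interface INat with
        static member Value = 3
```

F# may warn that interfaces with static abstract members are an advanced feature; the warning is advisory and can be suppressed.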
Then in C#, you can define a matrix like this:
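Here is a sketch of the C# side, assuming the F# project above is compiled and referenced as a library (the member name Value and the namespace TypeNats come from that sketch):

```csharp
using TypeNats; // assumed F# library defining INat, N1, N2, N3

// The dimensions are type parameters constrained to INat; the actual
// sizes are read through the static interface member, while dimension
// mismatches are rejected at compile time.
public class Matrix<M, N>
    where M : INat
    where N : INat
{
    private readonly double[,] values = new double[M.Value, N.Value];

    public int Rows => M.Value;
    public int Cols => N.Value;

    public double this[int i, int j]
    {
        get => values[i, j];
        set => values[i, j] = value;
    }

    // A Matrix<M, N> can only be multiplied by a Matrix<N, P>, so the
    // inner dimensions are checked by the compiler.
    public Matrix<M, P> Multiply<P>(Matrix<N, P> other) where P : INat
    {
        var result = new Matrix<M, P>();
        for (int i = 0; i < M.Value; i++)
            for (int j = 0; j < P.Value; j++)
                for (int k = 0; k < N.Value; k++)
                    result[i, j] += this[i, k] * other[k, j];
        return result;
    }
}
```

Since the dimensions are ordinary type parameters, a function that should accept only square matrices can simply take a Matrix<M, M>.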
Example usage:
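(The values are illustrative; the commented-out call at the end shows the compile-time dimension check.)

```csharp
using System;
using TypeNats;

var a = new Matrix<N2, N3>(); // 2x3
var b = new Matrix<N3, N2>(); // 3x2

a[0, 0] = 1; a[0, 1] = 2; a[0, 2] = 3;
a[1, 0] = 4; a[1, 1] = 5; a[1, 2] = 6;

b[0, 0] = 7;  b[0, 1] = 8;
b[1, 0] = 9;  b[1, 1] = 10;
b[2, 0] = 11; b[2, 1] = 12;

var c = a.Multiply(b); // inferred as Matrix<N2, N2>

Console.WriteLine($"{c[0, 0]} {c[0, 1]}");
Console.WriteLine($"{c[1, 0]} {c[1, 1]}");

// Does not compile: a is a Matrix<N2, N3>, so the argument must be
// a Matrix<N3, P> for some P.
// var bad = a.Multiply(a);
```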
With the illustrative values above, the output is:
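```
58 64
139 154
```

Uncommenting the last line instead produces a compile-time error, which is exactly the dimensionality check asked for in the question.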