Scala type inference for both a generic type and its type parameter - why doesn't it work?

If I were to name the single most annoying thing about Scala, it would be that, for the following code:

trait G[+T]
class H[+T] extends G[T]

def f[A <: G[X], X <: Int](g: A) = g

val g: H[Int] = ???
f(g)

the compiler infers the type arguments of the last call as f[H[Int], Nothing] and then complains at me about its own stupidity.

Knowing Scala, however, it probably knows better than me. What's the reason behind it? Since both G and H are covariant in T, H[Int] <: G[X] holds exactly when Int <: X, so from g: H[Int] the compiler has everything it needs to conclude X = Int. This one shortcoming made me design everything around avoiding having to specify type arguments explicitly - it may look like nothing here, but when names are of 'real' length, pretty much every method is generic, and many work with two generic types, it turns out that most of the code is type declarations.
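
The constraint itself is clearly satisfiable, since spelling the type arguments out by hand compiles (a minimal check):

f[H[Int], Int](g) // fine once both type arguments are explicit

which is exactly the boilerplate I'm trying to avoid.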

EDIT: the case above was solved below by Noah, but what about when the derived class is not of the same kind as the base class, as below?

trait G[+X]
class H[+X, Y] extends G[X]
class F extends G[Int]

def f[A <: G[X], X <: Int](g: A) = g

val h: H[Int, String] = ???
val g: F = ???
f(g) // X is again inferred as Nothing
f(h) // same problem
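
As before, the constraints themselves are fine, because explicit type arguments typecheck:

f[F, Int](g)
f[H[Int, String], Int](h)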
1 Answer

Accepted answer, by Noah:

If you make A a type constructor, A[_], I think you can get the Scala compiler to agree with you instead of just inferring everything as Nothing:

def f[A[_] <: G[_], X <: Int](g: A[X]) = g
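
With the definitions from your question this now infers as expected (a quick check):

val g: H[Int] = new H[Int]
f(g) // compiles: A is inferred as H, X as Int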

As a side note, I usually take a look at the Scalaz source whenever I have a type problem, as they have usually encountered it and solved it as well as possible.

UPDATE

The method I provided above still works with the additional constraints given:

trait G[+X]
class H[+X, Y] extends G[X]
class F extends G[Int]
class I extends G[String]

def f[A[_] <: G[_], X <: Int](g: A[X]) = g

val h: H[Int, String] = new H[Int, String]
val g: F = new F
val i: I = new I

f(g) // works
f(h) // works
f(i) // should fail and does fail: I <: G[String], and String is not <: Int
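
If you also want to keep the precise argument type A (as far as I can tell, for an argument of type F the version above has to pick a G-shaped A[X] such as G[Int], losing the exact type), one possible alternative is to let A be inferred exactly and recover X through a generalized type constraint. Just a sketch (f2 is a hypothetical name, not part of the code above):

def f2[A, X <: Int](g: A)(implicit ev: A <:< G[X]): A = g

f2(new F)              // A = F, X = Int
f2(new H[Int, String]) // A = H[Int, String], X = Int
// f2(new I)           // rejected: I <:< G[X] forces X = String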