Pattern Matching chooses the wrong case if used with ClassTag


What is the connection between variance and ClassTags or TypeTags?

I have two types, T1 and T2, that are used as type parameters.

case class T1()
case class T2()

I have an abstract class with an invariant type parameter and one subclass. If I want to check the type of the type parameter, it only works when there is no type test in the pattern, only in the guard. If a type test is present, the match always chooses the first case.

The type check is necessary because in my real code I want to call a different function for each type: there are separate functions for In[T1] and In[T2].
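For context, the intended dispatch might look roughly like the sketch below. The names handleT1 and handleT2 are hypothetical, and the In/VIn definitions from further down are repeated so the sketch is self-contained:

```scala
import scala.reflect.{ClassTag, classTag}

case class T1()
case class T2()

abstract class In[T]
case class VIn[T]() extends In[T]

// Hypothetical per-type handlers (names are illustrative, not from the question).
def handleT1(v: In[T1]): String = "handled T1"
def handleT2(v: In[T2]): String = "handled T2"

// The intended dispatch: choose a handler based on the type parameter.
def dispatch[T: ClassTag](v: In[T]): String =
  if (classTag[T] == classTag[T1]) handleT1(v.asInstanceOf[In[T1]])
  else handleT2(v.asInstanceOf[In[T2]])
```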

import scala.reflect.{ClassTag, classTag}

abstract class In[T]
case class VIn[T]() extends In[T]

def test1[T: ClassTag](v: In[T]) = v match {
  case x: VIn[T1 @unchecked] if classTag[T] == classTag[T1] => "T1"
  case y: VIn[T2 @unchecked] if classTag[T] == classTag[T2] => "T2"
}
test1(VIn[T1]())  //T1
test1(VIn[T2]())  //T1 !!!

def test2[T: ClassTag](v: In[T]) = v match {
  case x if classTag[T] == classTag[T1] => "T1"
  case y if classTag[T] == classTag[T2] => "T2"
}
test2(VIn[T1]())  //T1
test2(VIn[T2]())  //T2

While playing around with the List type that's used in many examples, I realized that if the type parameter is made covariant, both tests work.

abstract class Co[+T]
case class VCo[T]() extends Co[T]

def test1[T: ClassTag](v: Co[T]) = v match {
  case x: VCo[T1 @unchecked] if classTag[T] == classTag[T1] => "T1"
  case y: VCo[T2 @unchecked] if classTag[T] == classTag[T2] => "T2"
}
test1(VCo[T1]())  // T1
test1(VCo[T2]())  // T2

def test2[T: ClassTag](v: Co[T]) = v match {
  case x if classTag[T] == classTag[T1] => "T1"
  case y if classTag[T] == classTag[T2] => "T2"
}
test2(VCo[T1]())  // T1
test2(VCo[T2]())  // T2

Why does the first test fail for invariant types? There is no compiler warning or runtime error; the match just picks the first case, even though its guard is obviously false, as test2 demonstrates.

Answer by Jatin (accepted):

I definitely think it is a compiler bug.

def test1[T: ClassTag](v: In[T]) = {
  val t1 = classTag[T] == classTag[T1]
  val t2 = classTag[T] == classTag[T2]

  println(v match {
    case x: VIn[T1 @unchecked] if t1 => "T1"
    case y: VIn[T2 @unchecked] if t2 => "T2"
  })

  v match {
    case x: In[T1] if classTag[T] == classTag[T1] => "T1"
    case y: In[T2] if classTag[T] == classTag[T2] => "T2"
  }
}

Compiling with -Xprint:typer shows the following (note the evidence parameter):

def test1[T](v: In[T])(implicit evidence$1: scala.reflect.ClassTag[T]): String = {
....

val t1: Boolean = scala.reflect.`package`.classTag[T](evidence$1).==(
  scala.reflect.`package`.classTag[T1]((ClassTag.apply[T1](classOf[T1]): scala.reflect.ClassTag[T1])))  // classTag[T] == classTag[T1]

val t2: Boolean = scala.reflect.`package`.classTag[T](evidence$1).==(
  scala.reflect.`package`.classTag[T2]((ClassTag.apply[T2](classOf[T2]): scala.reflect.ClassTag[T2])))  // classTag[T] == classTag[T2]

The guards of the pattern match, however, compile to:

scala.reflect.`package`.classTag[T](evidence$1).==(scala.reflect.`package`.classTag[T1](evidence$1))

scala.reflect.`package`.classTag[T](evidence$1).==(scala.reflect.`package`.classTag[T2](evidence$1))

The compiler is passing evidence$1 to the implicit argument of both classTag[T1] and classTag[T2], so each guard actually compares classTag[T] to itself and is always true. As a workaround, test2 (as you suggested) or pre-computed guard values seem to work.
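To make the workaround concrete, here is a self-contained sketch (definitions repeated so it runs on its own) that matches on the ClassTag value directly, so there is no erased type pattern for the pattern matcher to mistranslate:

```scala
import scala.reflect.{ClassTag, classTag}

case class T1()
case class T2()

abstract class In[T]
case class VIn[T]() extends In[T]

// Match on the ClassTag value itself: no unchecked type pattern,
// so the guards compare the real tags. Like test2, this is
// non-exhaustive and throws a MatchError for any other T.
def test3[T: ClassTag](v: In[T]): String = classTag[T] match {
  case t if t == classTag[T1] => "T1"
  case t if t == classTag[T2] => "T2"
}
```

With this variant, test3(VIn[T1]()) yields "T1" and test3(VIn[T2]()) yields "T2", matching the behavior of test2.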