I have defined a trait and some classes in a Spark program. They work fine when packaged into a jar, but they fail when executed in the Spark REPL.
trait Builder {
  trait Layer {
    def layerSize: Int
  }

  abstract class Layer1 extends Layer

  class Layer2(val layerName: String) extends Layer {
    override def layerSize: Int = 10

    def addInput(from: Layer): String = {
      ""
    }
  }

  object Layer2 {
    def apply(name: String): Layer2 = {
      new Layer2(name)
    }
  }

  val FEATURES: Layer = new Layer1 {
    override val layerSize = 10
  }
}
class aBuilder extends Builder
Then I run the following code:
val builder = new aBuilder
val test = builder.Layer2("")
test.addInput(builder.FEATURES)
It fails with:
<console>:32: error: type mismatch;
found : builder.Layer
required: builder.Layer
test.addInput(builder.FEATURES)
OK, the following seems to work with Spark 1.6.0 and the official binaries (Scala 2.10). Nevertheless, I'm not sure whether it's a variant of this issue: SPARK-5149.
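In case it is that REPL wrapping issue, here is a sketch of a workaround that usually sidesteps it (an assumption on my part, not something I have verified against your exact setup): enter all of the definitions and the calls in a single :paste block, so the REPL compiles them as one unit and builder.Layer can only refer to one type.

scala> :paste
// Entering paste mode (ctrl-D to finish)

trait Builder {
  trait Layer {
    def layerSize: Int
  }

  abstract class Layer1 extends Layer

  class Layer2(val layerName: String) extends Layer {
    override def layerSize: Int = 10
    def addInput(from: Layer): String = ""
  }

  object Layer2 {
    def apply(name: String): Layer2 = {
      new Layer2(name)
    }
  }

  val FEATURES: Layer = new Layer1 {
    override val layerSize = 10
  }
}

class aBuilder extends Builder

// Everything here is part of the same compilation unit,
// so builder.Layer is a single path-dependent type.
val builder = new aBuilder
val test = builder.Layer2("")
test.addInput(builder.FEATURES)

Compiling the definitions into a jar and adding it to the shell with --jars avoids the REPL wrapping entirely, which would also explain why the packaged version works for you.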