I followed this guide: https://docs.databricks.com/notebooks/package-cells.html
On Community Edition, latest release, Spark 3.x:
A.1. Created the package with an object, as per the example:

package x.y.z

object Utils {
  val aNumber = 5 // works!
  def functionThatWillWork(a: Int): Int = a + 1
}

A.2. Ran this in the same notebook, in a different cell, without a cluster restart. No issues; runs fine:

import x.y.z.Utils
Utils.functionThatWillWork(Utils.aNumber)
B.1. Ran this in a different notebook, without a cluster restart. Error:
import x.y.z.Utils
Utils.functionThatWillWork(Utils.aNumber)
C.1. Restarted the cluster and ran the import again. Error:
import x.y.z.Utils
Utils.functionThatWillWork(Utils.aNumber)
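For what it's worth, the same object compiles and runs fine as plain Scala outside Databricks, which suggests the code itself is not the problem. A minimal standalone sketch (package line dropped for brevity; the `Main` entry point is my addition, not part of the notebook code):

```scala
object Utils {
  val aNumber = 5
  def functionThatWillWork(a: Int): Int = a + 1
}

object Main {
  def main(args: Array[String]): Unit = {
    // 5 + 1 = 6
    println(Utils.functionThatWillWork(Utils.aNumber))
  }
}
```

So the failure only appears when the package cell is referenced from a different notebook, or after a cluster restart.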
Question
Is this an issue with Community Edition? I do not think so, but I cannot place the cause. These observations appear to contradict the official docs.