With default settings, an app that contains multiple Scala/Java files and is launched via spark-submit ends up with a ClassNotFoundException.
Steps to reproduce:
1) Start a Spark cluster (local is fine, e.g. $SPARK_HOME/sbin/start-all.sh)
2) Create a project with 2 files:
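For illustration, a minimal two-file project could look like the following (file and object names are hypothetical; any app where code in one file uses a class defined in another should do — IntWrapper is the class mentioned in the investigation notes below):

```scala
// IntWrapper.scala — a simple class defined in its own file
class IntWrapper(val value: Int)

// SparkApp.scala — main entry point in a second file (hypothetical name)
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.h2o.H2OContext

object SparkApp {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("repro"))
    val h2oContext = H2OContext.getOrCreate(sc)
    // Referencing a class from the other file inside a closure is enough
    // to exercise the interpreter class loader
    val rdd = sc.parallelize(1 to 10).map(i => new IntWrapper(i).value)
    println(rdd.sum())
  }
}
```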
3) Package it (for sbt projects, sbt package will do)
4) Submit the app:
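The submit command might look like this (class name, master URL, and jar path are hypothetical):

```shell
$SPARK_HOME/bin/spark-submit \
  --class SparkApp \
  --master spark://localhost:7077 \
  target/scala-2.10/repro_2.10-1.0.jar
```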
Workaround: simply set "spark.ext.h2o.repl.enabled" to "false" on the SparkConf.
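A minimal sketch of the workaround, assuming the app builds its own SparkConf:

```scala
import org.apache.spark.SparkConf

val conf = new SparkConf()
  .setAppName("repro")
  .set("spark.ext.h2o.repl.enabled", "false") // disable the H2O REPL support
```

The same property can also be passed on the command line, e.g. spark-submit --conf spark.ext.h2o.repl.enabled=false.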
Possible fixes:
1) Simply document the workaround (do we even need repl.enabled set to true when doing spark-submit?)
2) Detect the appropriate default "spark.ext.h2o.repl.enabled" value automatically
3) Fix the classpath issue itself
Tried this only with SW 1.6.3 and Spark 1.6.1; other versions still need to be checked.
When H2OIMain#initialize runs, this branch is taken:
The comment there would suggest this is not the intended path?
When trying to load the IntWrapper class via InterpreterClassLoader#loadClass, this branch is taken and throws a ClassNotFoundException:
Changing it to Class.forName(name) works, but I'm not sure that's the right solution.
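For illustration, the change amounts to something like this (a hypothetical sketch, not the actual Sparkling Water source):

```scala
// Hypothetical sketch of the loadClass fallback: when the interpreter's
// own lookup fails, delegate to Class.forName, which resolves the class
// via the caller's class loader and can therefore see classes from the
// application jar submitted via spark-submit.
override def loadClass(name: String): Class[_] = {
  try {
    super.loadClass(name)
  } catch {
    case _: ClassNotFoundException => Class.forName(name)
  }
}
```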