Hi James,
Another Spark behavior I ran into: the Spark SQL context (Hive, in fact) requires some metastore. If I understand the Spark documentation correctly, it needs a `${SPARK_HOME}/conf/hive-site.xml` file that configures the location of the Hive metastore, e.g.:
<configuration>
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:derby:;databaseName=/<file-location>/metastore_db;create=true</value>
    <description>JDBC connect string for a JDBC metastore</description>
  </property>
</configuration>
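
For reference, a minimal sketch (my own example, assuming Spark 1.x built with Hive support; the object and app names are made up) of how the metastore actually gets touched once hive-site.xml is in ${SPARK_HOME}/conf/:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

object MetastoreCheck {
  def main(args: Array[String]): Unit = {
    // hive-site.xml (with the Derby ConnectionURL above) must be on the
    // classpath, which is the case when it sits in ${SPARK_HOME}/conf/.
    val conf = new SparkConf().setAppName("metastore-check").setMaster("local[*]")
    val sc = new SparkContext(conf)
    val sqlContext = new HiveContext(sc)

    // Any Hive-backed statement forces Spark to connect to the metastore;
    // with the Derby URL above it creates metastore_db at the configured path.
    sqlContext.sql("SHOW TABLES").collect().foreach(println)

    sc.stop()
  }
}

Without hive-site.xml, the HiveContext falls back to creating a local Derby metastore_db in the current working directory, which is why the explicit databaseName path matters.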
Michal