Update FAQ with information about hive metastore location

Description

Hi James,

another Spark behavior: the Spark SQL context (in fact, Hive) requires a metastore.
If I understand the Spark documentation correctly, it requires a `${SPARK_HOME}/conf/hive-site.xml` configuration
which sets the location of the Hive metastore.

<configuration>
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:derby:;databaseName=/<file-location>/metastore_db;create=true</value>
    <description>JDBC connect string for a JDBC metastore</description>
  </property>
</configuration>
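
A minimal sketch of how a Hive-backed SQL context would pick this configuration up, assuming Spark 1.3+ with Hive support on the classpath (the object and app names below are illustrative, not from this ticket):

// Sketch: HiveContext reads ${SPARK_HOME}/conf/hive-site.xml and creates/uses
// the Derby metastore at the databaseName location configured above.
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

object MetastoreCheck {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("metastore-check"))
    val sqlContext = new HiveContext(sc)          // picks up conf/hive-site.xml
    sqlContext.sql("SHOW TABLES").show()          // first query initializes the metastore
    sc.stop()
  }
}

With no hive-site.xml present, the metastore_db directory is created in the working directory from which the application is launched, which is why documenting the location in the FAQ is useful.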

Michal

Environment

None

Status

Assignee

Jakub Hava

Reporter

Michal Malohlava

Labels

None

Release Priority

None

CustomerVisible

No

testcase 1

None

testcase 2

None

testcase 3

None

h2ostream link

None

Affected Spark version

None

AffectedContact

None

AffectedCustomers

None

AffectedPilots

None

AffectedOpenSource

None

Support Assessment

None

Customer Request Type

None

Support ticket URL

None

End date

None

Baseline start date

None

Baseline end date

None

Task progress

None

Task mode

None

Components

Fix versions

Priority

Major