
In PySparkling, getOrCreate(spark) still incorrectly complains that a SparkSession should be used

Description

The issue is that PySparkling still uses the SparkContext internally, passing it to the Java/Scala backend, which is where the warning is printed, even when the user supplies a SparkSession.
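
For illustration, a minimal sketch of the problematic flow. The JVM class path and the internal call shown here are assumptions about how PySparkling reaches the backend, not the exact internal code:

    from pyspark.sql import SparkSession
    from pysparkling import H2OContext

    spark = SparkSession.builder.appName("sparkling-water").getOrCreate()

    # The user passes a SparkSession, as the warning itself recommends:
    hc = H2OContext.getOrCreate(spark)

    # Internally, however, PySparkling does roughly the equivalent of the
    # following (hypothetical sketch): it extracts the SparkContext and hands
    # that to the JVM backend, so the backend prints the deprecation warning
    # even though the caller supplied a SparkSession.
    sc = spark.sparkContext
    jhc = spark._jvm.org.apache.spark.h2o.H2OContext.getOrCreate(sc._jsc.sc())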

This can be fixed by passing the SparkSession instead of the SparkContext through the PySparkling internals.
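
A hedged sketch of the fix under the same assumptions (the helper name and the Scala class path are hypothetical): resolve PySpark's handle to the underlying Java SparkSession and call the SparkSession-based getOrCreate overload, so the backend never sees a bare SparkContext:

    def _create_backend_h2o_context(spark):
        # Hypothetical helper: spark._jsparkSession is PySpark's handle to the
        # underlying Java SparkSession; forwarding it selects the
        # SparkSession-based overload and avoids the deprecation warning.
        jspark = spark._jsparkSession
        return spark._jvm.org.apache.spark.h2o.H2OContext.getOrCreate(jspark)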

Environment

None

Status

Assignee

Jakub Hava

Reporter

Jakub Hava

Labels

None

Release Priority

None

CustomerVisible

No

testcase 1

None

testcase 2

None

testcase 3

None

h2ostream link

None

Affected Spark version

None

AffectedContact

None

AffectedCustomers

None

AffectedPilots

None

AffectedOpenSource

None

Support Assessment

None

Customer Request Type

None

Support ticket URL

None

End date

None

Baseline start date

None

Baseline end date

None

Task progress

None

Task mode

None

Fix versions

Priority

Major