Revisit PySparkling Water Initializer - adding sparkling-water.jar to classpath has non-deterministic behavior

Description

Problem: when using PySparkling, I can see that the user-defined jars are added to the executor in a varying order relative to `sparkling_water_assembly.jar`.

The code is here: https://github.com/h2oai/sparkling-water/blob/master/py/src/ai/h2o/sparkling/Initializer.py#L144-L160
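One possible direction, sketched below, is to make the driver-side ordering explicit by listing the assembly jar first in `spark.jars` before the SparkContext is created. This is not the actual Initializer code; the function name and jar path are hypothetical, and as noted further down, the executor may still add downloaded jars to its classloader in an undefined order.

```python
# Hypothetical sketch: put the Sparkling Water assembly jar at the front of
# spark.jars before the SparkContext starts. Names and paths are illustrative.
from pyspark import SparkConf

def prepend_assembly_jar(conf: SparkConf, assembly_jar_path: str) -> SparkConf:
    """Place the assembly jar at the front of the spark.jars list."""
    existing = conf.get("spark.jars", "")
    jars = [j for j in existing.split(",") if j and j != assembly_jar_path]
    conf.set("spark.jars", ",".join([assembly_jar_path] + jars))
    return conf

conf = SparkConf().setAppName("pysparkling-app")
conf = prepend_assembly_jar(conf, "/path/to/sparkling_water_assembly.jar")
# Caveat: a deterministic spark.jars order on the driver does not guarantee
# the order in which the executor adds the jars to its classloader (see the
# Spark Executor link below).
```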

Test scenario:

  • Spark 2.4.0, Sparkling Water 3.26.6-2.4, mojo2-runtime 2.1.8-SNAPSHOT (d79b102494b31c55307a20eed3e975c8404ea43f)

  • CMD line:

In a successful run, the executor log shows the jars added in the following order:

In failing runs, the order is different:

Note: Spark adds the jars to the executor classloader in an undefined order:
https://github.com/apache/spark/blob/5a512e86e94593bc004a35101ad6497e20c13e0a/core/src/main/scala/org/apache/spark/executor/Executor.scala#L818-L826
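Because of that undefined order, one hedged workaround (not a confirmed fix) is to pin the assembly jar ahead of any `--jars` entries via `spark.executor.extraClassPath`. The jar path below is an assumption and must already be present at that location on every executor node; this setting does not change how the executor adds downloaded jars, it only gives the assembly jar classpath precedence.

```python
# Workaround sketch under stated assumptions: the assembly jar is already
# available on each executor at ASSEMBLY_JAR (e.g. pre-installed).
from pyspark import SparkConf
from pyspark.sql import SparkSession

ASSEMBLY_JAR = "/opt/sparkling-water/sparkling_water_assembly.jar"  # assumed location

conf = (SparkConf()
        .setAppName("pysparkling-classpath-pinning")
        # Put the assembly jar ahead of jars added later via --jars / addJar.
        .set("spark.executor.extraClassPath", ASSEMBLY_JAR))

spark = SparkSession.builder.config(conf=conf).getOrCreate()
```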

Assignee

Unassigned

Reporter

Michal Malohlava

Labels

None

CustomerVisible

No

testcase 1

None

testcase 2

None

testcase 3

None

h2ostream link

None

Affected Spark version

None

AffectedContact

None

AffectedCustomers

None

AffectedPilots

None

AffectedOpenSource

None

Support Assessment

None

Customer Request Type

None

Support ticket URL

None

End date

None

Baseline start date

None

Baseline end date

None

Task progress

None

Task mode

None

Priority

Major