Details
Description
For example, if we store the LDAP login conf on a `wasb` data lake and configure it via `H2OConf.set_login_conf("wasb://///")`, this address is passed to H2O, which then tries to read the file but fails because this data source is not supported.
On the other hand, Spark can read from these paths, so we can add the file using `sc.addFile` and then read it using `SparkFiles.get`. This downloads the file to the Spark driver, and we can then access it on the local filesystem.
This can happen behind the `set_login_conf` call so the user does not need to worry about these details (see the sketch below).
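A minimal sketch of the idea in PySpark, assuming an active `SparkSession` named `spark`; the helper name `_resolve_login_conf` and its call site inside `set_login_conf` are hypothetical, not the actual Sparkling Water internals:

```python
from pyspark import SparkFiles
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()
sc = spark.sparkContext


def _resolve_login_conf(path):
    """Hypothetical helper: turn a remote login-conf path into a local one."""
    # Spark understands wasb:// and other Hadoop-compatible filesystems
    # that H2O cannot read directly, so let Spark fetch the file.
    sc.addFile(path)
    file_name = path.rstrip("/").split("/")[-1]
    # SparkFiles.get returns the local path of the copy downloaded by addFile,
    # which can then be handed to H2O as a plain local file.
    return SparkFiles.get(file_name)
```

With something like this in place, `set_login_conf` could accept any Spark-readable URI and transparently pass the resolved local path on to H2O.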