pyspark.sql.SparkSession.addArtifacts

SparkSession.addArtifacts(*path, pyfile=False, archive=False, file=False)

Add artifact(s) to the client session. Currently only local files are supported.

New in version 3.5.0.

Parameters
*path : tuple of str

URIs of the artifacts to add.

pyfile : bool

Whether to add the artifacts as Python dependencies such as .py, .egg, .zip or .jar files. These pyfiles are inserted directly into the Python search path when Python functions are executed on executors; see the sketches below.

archive : bool

Whether to add the artifacts as archives such as .zip, .jar, .tar.gz, .tgz, or .tar files. Archives are unpacked automatically on the executor side.

file : bool

Whether to add the artifacts as files to be downloaded to every node with this Spark job. The path passed can only be a local file for now.
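
A minimal usage sketch of the three flags, assuming a Spark Connect server is reachable at the address below; the connection string and artifact paths are hypothetical:

>>> from pyspark.sql import SparkSession
>>> spark = SparkSession.builder.remote("sc://localhost:15002").getOrCreate()
>>> spark.addArtifacts("/tmp/my_helpers.py", pyfile=True)     # made importable by executor-side Python code
>>> spark.addArtifacts("/tmp/lookup_data.zip", archive=True)  # unpacked automatically on the executors
>>> spark.addArtifacts("/tmp/app_config.json", file=True)     # downloaded to every node with the job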

Notes

This API is dedicated to the Spark Connect client only. Calling it on a regular SparkSession raises an exception.
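
Because pyfiles are inserted into the executor-side Python path, a module added with pyfile=True can be imported from a UDF. A hedged sketch, assuming the hypothetical my_helpers.py above defines a scale() function and the Connect session from the previous example:

>>> from pyspark.sql.functions import udf
>>> @udf("double")
... def apply_scale(x):
...     import my_helpers              # resolvable on executors once the pyfile artifact is added
...     return my_helpers.scale(x)
>>> spark.range(3).select(apply_scale("id")).show()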