Custom Spark Jobs
Run a custom Spark job.
Fusion identifies all JARs in the blob store and appends them to the Spark environment. Make sure to upload the JAR file to the blob store with resourceType=spark:jar so your custom Spark job can see it.
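A hypothetical upload via Fusion's blob API might look like the following sketch; the host, port, credentials, and blob name are placeholders, so check the API reference for your Fusion version before using it:

```shell
# Placeholders: adjust host, port, credentials, and JAR name for your deployment.
FUSION_HOST="https://localhost:6764"

# resourceType=spark:jar marks the blob so Fusion adds it to the Spark environment.
curl -u admin:password-here -X PUT \
  -H "Content-Type: application/java-archive" \
  --data-binary @my-spark-job.jar \
  "$FUSION_HOST/api/blobs/my-spark-job.jar?resourceType=spark:jar"
```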
Use this job when you want to run a custom JAR on Spark.
The ID for this Spark job. Used in the API to reference this job. Allowed characters: a-z, A-Z, dash (-) and underscore (_). Maximum length: 63 characters.
<= 63 characters
Match pattern: [a-zA-Z][_\-a-zA-Z0-9]*[a-zA-Z0-9]?
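The ID constraint above can be checked with a standard regular expression. The snippet below is only an illustration of the pattern and length limit, not part of Fusion's API; the class and method names are hypothetical:

```java
public class JobIdCheck {
    // Same pattern as the spec: starts with a letter, followed by letters,
    // digits, dashes, or underscores.
    static final String ID_PATTERN = "[a-zA-Z][_\\-a-zA-Z0-9]*[a-zA-Z0-9]?";

    // Valid IDs match the pattern and are at most 63 characters long.
    static boolean isValidId(String id) {
        return !id.isEmpty() && id.length() <= 63 && id.matches(ID_PATTERN);
    }
}
```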
Spark configuration settings.
List of objects with these attributes:
  key (required): Parameter Name, type: string
  value: Parameter Value, type: string
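For example, the settings could be expressed as a list of key/value objects like the fragment below; the surrounding property name and the example settings are illustrative assumptions, not taken from this page:

```json
"sparkConfig": [
  { "key": "spark.executor.memory", "value": "2g" },
  { "key": "spark.executor.cores", "value": "2" }
]
```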
Fully-qualified name of the Java/Scala class to invoke
Additional options to pass to the application when running this job.
Use this text field to override the default behaviour, which is to run className.main(args).
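Since the default behaviour is to invoke className.main(args) with the additional options as the argument array, a minimal entry point can be sketched as below. The class name and option parsing are illustrative only; a real job would also create a SparkSession, which is omitted as a comment so the sketch stays self-contained:

```java
import java.util.HashMap;
import java.util.Map;

/** Hypothetical custom Spark job entry point; names are illustrative. */
public class MyCustomJob {
    /** Parse "--key value" style options passed via the job's additional options. */
    static Map<String, String> parseArgs(String[] args) {
        Map<String, String> opts = new HashMap<>();
        for (int i = 0; i + 1 < args.length; i += 2) {
            if (args[i].startsWith("--")) {
                opts.put(args[i].substring(2), args[i + 1]);
            }
        }
        return opts;
    }

    public static void main(String[] args) {
        Map<String, String> opts = parseArgs(args);
        // A real job would create a SparkSession here and run its logic;
        // that requires Spark on the classpath, so it is omitted from this sketch.
        System.out.println("options: " + opts);
    }
}
```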
Default: custom_spark_scala_job
Allowed values: custom_spark_scala_job