Legacy Product

Fusion 5.10

    Custom Spark Jobs

    Run a custom Spark job.

    Fusion identifies all JARs in the blob store and appends them to the Spark environment. Make sure to upload the JAR file to the blob store with resourceType=spark:jar so your custom Spark job will see it.
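    As a sketch, the upload might use Fusion's blob-store REST API. The host, port, credentials, and JAR name below are hypothetical placeholders; the command is echoed rather than executed so the example stays self-contained.

```shell
# Hypothetical values -- replace with your Fusion host, credentials, and JAR.
FUSION_HOST="localhost:6764"
JAR="my-spark-job.jar"

# Print the upload command (dry run); remove the echo to actually run it.
# resourceType=spark:jar tells Fusion to add this JAR to the Spark environment.
echo curl -u admin:password -X PUT --data-binary @"$JAR" \
  -H "Content-Type: application/java-archive" \
  "http://$FUSION_HOST/api/blobs/$JAR?resourceType=spark:jar"
```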

    Use this job when you want to run a custom JAR on Spark.

    id - string (required)

    The ID for this Spark job. Used in the API to reference this job. Allowed characters: a-z, A-Z, digits 0-9 (not as the first character), dash (-), and underscore (_). Maximum length: 63 characters.

    <= 63 characters

    Match pattern: [a-zA-Z][_\-a-zA-Z0-9]*[a-zA-Z0-9]?

    sparkConfig - array[object]

    Spark configuration settings.

    Object attributes:

    key - string (required)
        Display name: Parameter Name
    value - string
        Display name: Parameter Value

    klassName - string (required)

    Fully-qualified name of the Java/Scala class to invoke.

    submitArgs - array[string]

    Additional options to pass to the application when running this job.
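    For example, a minimal entry-point class might look like the sketch below. The class name is hypothetical (its fully-qualified name, including any package, is what goes in klassName), and a real job would also build a SparkSession, omitted here to keep the example self-contained. Fusion invokes the class's main() method with the strings from submitArgs.

```java
// Hypothetical entry point; its fully-qualified name goes in klassName.
// Package declaration omitted for brevity; a real class would have one.
public class MyCustomSparkJob {
    public static void main(String[] args) {
        // A real job would create a SparkSession here and run its logic.
        System.out.println("arg count: " + args.length);
        for (String arg : args) {
            System.out.println("received: " + arg);
        }
    }
}
```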

    script - string

    Use this text field if you want to override the default behaviour, which is to run klassName.main(args).

    type - string (required)

    Default: custom_spark_scala_job

    Allowed values: custom_spark_scala_job
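    Putting the fields together, a complete job definition might look like the following sketch. The ID, class name, arguments, and Spark settings are hypothetical examples, not values taken from this reference.

```json
{
  "id": "my-custom-spark-job",
  "type": "custom_spark_scala_job",
  "klassName": "com.example.spark.MyCustomSparkJob",
  "submitArgs": ["--input", "/data/in", "--output", "/data/out"],
  "sparkConfig": [
    { "key": "spark.executor.memory", "value": "2g" },
    { "key": "spark.executor.cores", "value": "2" }
  ]
}
```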