
Class JobArgs

Inheritance
System.Object
InputArgs
ResourceArgs
JobArgs
Inherited Members
ResourceArgs.Empty
System.Object.Equals(System.Object)
System.Object.Equals(System.Object, System.Object)
System.Object.GetHashCode()
System.Object.GetType()
System.Object.MemberwiseClone()
System.Object.ReferenceEquals(System.Object, System.Object)
System.Object.ToString()
Namespace: Pulumi.Gcp.Dataproc
Assembly: Pulumi.Gcp.dll
Syntax
public sealed class JobArgs : ResourceArgs

Constructors

JobArgs()

Declaration
public JobArgs()
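
As a sketch of typical usage (the cluster name, region, and jar path below are hypothetical, and the input types are assumed to live in the Pulumi.Gcp.Dataproc.Inputs namespace as in recent SDK versions), a JobArgs is built with the parameterless constructor plus an object initializer and passed to a Pulumi.Gcp.Dataproc.Job resource:

```csharp
using Pulumi;
using Pulumi.Gcp.Dataproc;
using Pulumi.Gcp.Dataproc.Inputs;

class JobStack : Stack
{
    public JobStack()
    {
        // Hypothetical values; substitute your own cluster and jar.
        var sparkJob = new Job("spark-job", new JobArgs
        {
            Region = "us-central1",
            ForceDelete = true,
            Placement = new JobPlacementArgs
            {
                ClusterName = "my-cluster",
            },
            SparkConfig = new JobSparkConfigArgs
            {
                MainClass = "org.apache.spark.examples.SparkPi",
                JarFileUris = { "file:///usr/lib/spark/examples/jars/spark-examples.jar" },
                Args = { "1000" },
            },
        });
    }
}
```

Plain values convert implicitly to their Input<T> wrappers, so string and bool literals can be assigned directly.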

Properties

ForceDelete

By default, only inactive jobs can be deleted within Dataproc. Setting this to true and calling destroy ensures the job is first cancelled before the delete is issued.

Declaration
public Input<bool> ForceDelete { get; set; }
Property Value
Type Description
Input<System.Boolean>
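
A minimal sketch of setting this property (a bool literal converts implicitly to Input<bool>):

```csharp
var args = new JobArgs
{
    // A running job will be cancelled before deletion when the stack is destroyed.
    ForceDelete = true,
};
```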

HadoopConfig

Declaration
public Input<JobHadoopConfigArgs> HadoopConfig { get; set; }
Property Value
Type Description
Input<JobHadoopConfigArgs>

HiveConfig

Declaration
public Input<JobHiveConfigArgs> HiveConfig { get; set; }
Property Value
Type Description
Input<JobHiveConfigArgs>

Labels

The map of labels (key/value pairs) to add to the job.

Declaration
public InputMap<string> Labels { get; set; }
Property Value
Type Description
InputMap<System.String>
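
A minimal sketch of assigning labels (the keys and values here are arbitrary examples):

```csharp
var args = new JobArgs
{
    // InputMap<string> supports dictionary-style collection initialization.
    Labels =
    {
        { "env", "dev" },
        { "team", "data" },
    },
};
```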

PigConfig

Declaration
public Input<JobPigConfigArgs> PigConfig { get; set; }
Property Value
Type Description
Input<JobPigConfigArgs>

Placement

Declaration
public Input<JobPlacementArgs> Placement { get; set; }
Property Value
Type Description
Input<JobPlacementArgs>

Project

The project in which the cluster can be found and against which the job is subsequently run. If it is not provided, the provider project is used.

Declaration
public Input<string> Project { get; set; }
Property Value
Type Description
Input<System.String>

PysparkConfig

Declaration
public Input<JobPysparkConfigArgs> PysparkConfig { get; set; }
Property Value
Type Description
Input<JobPysparkConfigArgs>

Reference

Declaration
public Input<JobReferenceArgs> Reference { get; set; }
Property Value
Type Description
Input<JobReferenceArgs>

Region

The Cloud Dataproc region. This determines which clusters are available for the job to be submitted to. If not specified, defaults to global.

Declaration
public Input<string> Region { get; set; }
Property Value
Type Description
Input<System.String>
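
A minimal sketch of targeting a specific region instead of the default (the region name is an example):

```csharp
var args = new JobArgs
{
    // Submit to clusters in us-central1 rather than the default "global" region.
    Region = "us-central1",
};
```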

Scheduling

Optional. Job scheduling configuration.

Declaration
public Input<JobSchedulingArgs> Scheduling { get; set; }
Property Value
Type Description
Input<JobSchedulingArgs>

SparkConfig

Declaration
public Input<JobSparkConfigArgs> SparkConfig { get; set; }
Property Value
Type Description
Input<JobSparkConfigArgs>

SparksqlConfig

Declaration
public Input<JobSparksqlConfigArgs> SparksqlConfig { get; set; }
Property Value
Type Description
Input<JobSparksqlConfigArgs>
Copyright 2016-2020, Pulumi Corporation.