Job

Creates a job on Dataflow, which is an implementation of Apache Beam running on Google Compute Engine. For more information see the official documentation for Beam and Dataflow.
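As a brief orientation, the following sketch launches a templated Dataflow job with the TypeScript SDK. The bucket paths, template location, and parameter names are placeholders, not values defined on this page.

import * as gcp from "@pulumi/gcp";

// Launch a Dataflow job from a job template stored in GCS.
// All gs:// paths below are placeholders.
const bigDataJob = new gcp.dataflow.Job("big-data-job", {
    templateGcsPath: "gs://my-bucket/templates/template_file",
    tempGcsLocation: "gs://my-bucket/tmp_dir",
    parameters: {
        inputFile: "gs://my-bucket/input/data.txt",
        output: "gs://my-bucket/output/results",
    },
});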

Note on “destroy” / “apply”

There are many types of Dataflow jobs. Some run continuously, reading new data from a source such as a GCS bucket and producing output as it arrives; others process a fixed amount of data and then terminate. Any job can fail while running due to programming errors or other issues. In this way, Dataflow jobs differ from most other Google resources.

The Dataflow resource is considered ‘existing’ while it is in a nonterminal state. If it reaches a terminal state (e.g. ‘FAILED’, ‘COMPLETE’, ‘CANCELLED’), it will be recreated on the next ‘apply’. This is as expected for jobs which run continuously, but may surprise users who use this resource for other kinds of Dataflow jobs.

A Dataflow job which is ‘destroyed’ may be either “cancelled” or “drained”. If cancelled, the job terminates immediately: any data already written remains where it is, but no further data is processed. If drained, no new data will enter the pipeline, but data already in the pipeline is processed to completion. The default is to cancel; if on_delete is set to "drain" in the configuration, pulumi destroy may take a long time to complete while the pipeline drains.
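For example, a streaming job that should finish processing in-flight data before being torn down could be declared as in this minimal sketch (the gs:// paths are placeholders):

import * as gcp from "@pulumi/gcp";

// "drain" stops new data from entering the pipeline but lets in-flight
// data finish processing before the job is removed on pulumi destroy.
const streamingJob = new gcp.dataflow.Job("streaming-job", {
    templateGcsPath: "gs://my-bucket/templates/streaming_template",
    tempGcsLocation: "gs://my-bucket/tmp_dir",
    onDelete: "drain",
});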

Create a Job Resource

new Job(name: string, args: JobArgs, opts?: CustomResourceOptions);
def Job(resource_name, opts=None, additional_experiments=None, ip_configuration=None, labels=None, machine_type=None, max_workers=None, name=None, network=None, on_delete=None, parameters=None, project=None, region=None, service_account_email=None, subnetwork=None, temp_gcs_location=None, template_gcs_path=None, zone=None, __props__=None)
func NewJob(ctx *Context, name string, args JobArgs, opts ...ResourceOption) (*Job, error)
public Job(string name, JobArgs args, CustomResourceOptions? opts = null)
name string
The unique name of the resource.
args JobArgs
The arguments to resource properties.
opts CustomResourceOptions
Bag of options to control resource's behavior.
resource_name str
The unique name of the resource.
opts ResourceOptions
A bag of options that control this resource's behavior.
ctx Context
Context object for the current deployment.
name string
The unique name of the resource.
args JobArgs
The arguments to resource properties.
opts ResourceOption
Bag of options to control resource's behavior.
name string
The unique name of the resource.
args JobArgs
The arguments to resource properties.
opts CustomResourceOptions
Bag of options to control resource's behavior.

Job Resource Properties

To learn more about resource properties and how to use them, see Inputs and Outputs in the Programming Model docs.

Inputs

The Job resource accepts the following input properties:

TempGcsLocation string

A writeable location on GCS for the Dataflow job to dump its temporary data.

TemplateGcsPath string

The GCS path to the Dataflow job template.

AdditionalExperiments List<string>

List of experiments that should be used by the job. An example value is ["enable_stackdriver_agent_metrics"].

IpConfiguration string

The configuration for VM IPs. Options are "WORKER_IP_PUBLIC" or "WORKER_IP_PRIVATE".

Labels Dictionary<string, object>

User labels to be specified for the job. Keys and values should follow the restrictions specified in the labeling restrictions page. NOTE: Google-provided Dataflow templates often provide default labels that begin with goog-dataflow-provided. Unless explicitly set in config, these labels will be ignored to prevent diffs on re-apply.

MachineType string

The machine type to use for the job.

MaxWorkers int

The number of workers permitted to work on the job. More workers may improve processing speed at additional cost.

Name string

A unique name for the resource, required by Dataflow.

Network string

The network to which VMs will be assigned. If it is not provided, “default” will be used.

OnDelete string

One of “drain” or “cancel”. Specifies behavior of deletion during pulumi destroy. See above note.

Parameters Dictionary<string, object>

Key/Value pairs to be passed to the Dataflow job (as used in the template).

Project string

The project in which the resource belongs. If it is not provided, the provider project is used.

Region string

The region in which the created job should run.

ServiceAccountEmail string

The Service Account email used to create the job.

Subnetwork string

The subnetwork to which VMs will be assigned. Should be of the form “regions/REGION/subnetworks/SUBNETWORK”.

Zone string

The zone in which the created job should run. If it is not provided, the provider zone is used.

TempGcsLocation string

A writeable location on GCS for the Dataflow job to dump its temporary data.

TemplateGcsPath string

The GCS path to the Dataflow job template.

AdditionalExperiments []string

List of experiments that should be used by the job. An example value is ["enable_stackdriver_agent_metrics"].

IpConfiguration string

The configuration for VM IPs. Options are "WORKER_IP_PUBLIC" or "WORKER_IP_PRIVATE".

Labels map[string]interface{}

User labels to be specified for the job. Keys and values should follow the restrictions specified in the labeling restrictions page. NOTE: Google-provided Dataflow templates often provide default labels that begin with goog-dataflow-provided. Unless explicitly set in config, these labels will be ignored to prevent diffs on re-apply.

MachineType string

The machine type to use for the job.

MaxWorkers int

The number of workers permitted to work on the job. More workers may improve processing speed at additional cost.

Name string

A unique name for the resource, required by Dataflow.

Network string

The network to which VMs will be assigned. If it is not provided, “default” will be used.

OnDelete string

One of “drain” or “cancel”. Specifies behavior of deletion during pulumi destroy. See above note.

Parameters map[string]interface{}

Key/Value pairs to be passed to the Dataflow job (as used in the template).

Project string

The project in which the resource belongs. If it is not provided, the provider project is used.

Region string

The region in which the created job should run.

ServiceAccountEmail string

The Service Account email used to create the job.

Subnetwork string

The subnetwork to which VMs will be assigned. Should be of the form “regions/REGION/subnetworks/SUBNETWORK”.

Zone string

The zone in which the created job should run. If it is not provided, the provider zone is used.

tempGcsLocation string

A writeable location on GCS for the Dataflow job to dump its temporary data.

templateGcsPath string

The GCS path to the Dataflow job template.

additionalExperiments string[]

List of experiments that should be used by the job. An example value is ["enable_stackdriver_agent_metrics"].

ipConfiguration string

The configuration for VM IPs. Options are "WORKER_IP_PUBLIC" or "WORKER_IP_PRIVATE".

labels {[key: string]: any}

User labels to be specified for the job. Keys and values should follow the restrictions specified in the labeling restrictions page. NOTE: Google-provided Dataflow templates often provide default labels that begin with goog-dataflow-provided. Unless explicitly set in config, these labels will be ignored to prevent diffs on re-apply.

machineType string

The machine type to use for the job.

maxWorkers number

The number of workers permitted to work on the job. More workers may improve processing speed at additional cost.

name string

A unique name for the resource, required by Dataflow.

network string

The network to which VMs will be assigned. If it is not provided, “default” will be used.

onDelete string

One of “drain” or “cancel”. Specifies behavior of deletion during pulumi destroy. See above note.

parameters {[key: string]: any}

Key/Value pairs to be passed to the Dataflow job (as used in the template).

project string

The project in which the resource belongs. If it is not provided, the provider project is used.

region string

The region in which the created job should run.

serviceAccountEmail string

The Service Account email used to create the job.

subnetwork string

The subnetwork to which VMs will be assigned. Should be of the form “regions/REGION/subnetworks/SUBNETWORK”.

zone string

The zone in which the created job should run. If it is not provided, the provider zone is used.

temp_gcs_location str

A writeable location on GCS for the Dataflow job to dump its temporary data.

template_gcs_path str

The GCS path to the Dataflow job template.

additional_experiments List[str]

List of experiments that should be used by the job. An example value is ["enable_stackdriver_agent_metrics"].

ip_configuration str

The configuration for VM IPs. Options are "WORKER_IP_PUBLIC" or "WORKER_IP_PRIVATE".

labels Dict[str, Any]

User labels to be specified for the job. Keys and values should follow the restrictions specified in the labeling restrictions page. NOTE: Google-provided Dataflow templates often provide default labels that begin with goog-dataflow-provided. Unless explicitly set in config, these labels will be ignored to prevent diffs on re-apply.

machine_type str

The machine type to use for the job.

max_workers float

The number of workers permitted to work on the job. More workers may improve processing speed at additional cost.

name str

A unique name for the resource, required by Dataflow.

network str

The network to which VMs will be assigned. If it is not provided, “default” will be used.

on_delete str

One of “drain” or “cancel”. Specifies behavior of deletion during pulumi destroy. See above note.

parameters Dict[str, Any]

Key/Value pairs to be passed to the Dataflow job (as used in the template).

project str

The project in which the resource belongs. If it is not provided, the provider project is used.

region str

The region in which the created job should run.

service_account_email str

The Service Account email used to create the job.

subnetwork str

The subnetwork to which VMs will be assigned. Should be of the form “regions/REGION/subnetworks/SUBNETWORK”.

zone str

The zone in which the created job should run. If it is not provided, the provider zone is used.

Outputs

All input properties are implicitly available as output properties. Additionally, the Job resource produces the following output properties:

Id string
The provider-assigned unique ID for this managed resource.
JobId string

The unique ID of this job.

State string

The current state of the resource, selected from the JobState enum

Type string

The type of this job, selected from the JobType enum

Id string
The provider-assigned unique ID for this managed resource.
JobId string

The unique ID of this job.

State string

The current state of the resource, selected from the JobState enum

Type string

The type of this job, selected from the JobType enum

id string
The provider-assigned unique ID for this managed resource.
jobId string

The unique ID of this job.

state string

The current state of the resource, selected from the JobState enum

type string

The type of this job, selected from the JobType enum

id str
The provider-assigned unique ID for this managed resource.
job_id str

The unique ID of this job.

state str

The current state of the resource, selected from the JobState enum

type str

The type of this job, selected from the JobType enum
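For example, the output properties listed above can be exported as stack outputs from a TypeScript program. The job declaration below is a placeholder sketch:

import * as gcp from "@pulumi/gcp";

const job = new gcp.dataflow.Job("example-job", {
    templateGcsPath: "gs://my-bucket/templates/template_file", // placeholder
    tempGcsLocation: "gs://my-bucket/tmp_dir",                 // placeholder
});

// Input properties echo back as outputs; jobId, state, and type are
// computed by the provider once the job exists.
export const jobId = job.jobId;
export const jobState = job.state; // e.g. "JOB_STATE_RUNNING"
export const jobType = job.type;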

Look up an Existing Job Resource

Get an existing Job resource’s state with the given name, ID, and optional extra properties used to qualify the lookup.

public static get(name: string, id: Input<ID>, state?: JobState, opts?: CustomResourceOptions): Job
static get(resource_name, id, opts=None, additional_experiments=None, ip_configuration=None, job_id=None, labels=None, machine_type=None, max_workers=None, name=None, network=None, on_delete=None, parameters=None, project=None, region=None, service_account_email=None, state=None, subnetwork=None, temp_gcs_location=None, template_gcs_path=None, type=None, zone=None, __props__=None)
func GetJob(ctx *Context, name string, id IDInput, state *JobState, opts ...ResourceOption) (*Job, error)
public static Job Get(string name, Input<string> id, JobState? state, CustomResourceOptions? opts = null)
name
The unique name of the resulting resource.
id
The unique provider ID of the resource to lookup.
state
Any extra arguments used during the lookup.
opts
A bag of options that control this resource's behavior.
resource_name
The unique name of the resulting resource.
id
The unique provider ID of the resource to lookup.
name
The unique name of the resulting resource.
id
The unique provider ID of the resource to lookup.
state
Any extra arguments used during the lookup.
opts
A bag of options that control this resource's behavior.
name
The unique name of the resulting resource.
id
The unique provider ID of the resource to lookup.
state
Any extra arguments used during the lookup.
opts
A bag of options that control this resource's behavior.
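As a sketch, adopting a reference to a job that already exists (for example, one created outside this stack) might look like this in TypeScript; the ID string below is a placeholder:

import * as gcp from "@pulumi/gcp";

// Look up an existing Dataflow job by its provider ID (placeholder value).
const existing = gcp.dataflow.Job.get("existing-job",
    "2020-01-01_00_00_00-1234567890123456789");

// The looked-up resource exposes the same outputs as a created one.
export const existingState = existing.state;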

The following state arguments are supported:

AdditionalExperiments List<string>

List of experiments that should be used by the job. An example value is ["enable_stackdriver_agent_metrics"].

IpConfiguration string

The configuration for VM IPs. Options are "WORKER_IP_PUBLIC" or "WORKER_IP_PRIVATE".

JobId string

The unique ID of this job.

Labels Dictionary<string, object>

User labels to be specified for the job. Keys and values should follow the restrictions specified in the labeling restrictions page. NOTE: Google-provided Dataflow templates often provide default labels that begin with goog-dataflow-provided. Unless explicitly set in config, these labels will be ignored to prevent diffs on re-apply.

MachineType string

The machine type to use for the job.

MaxWorkers int

The number of workers permitted to work on the job. More workers may improve processing speed at additional cost.

Name string

A unique name for the resource, required by Dataflow.

Network string

The network to which VMs will be assigned. If it is not provided, “default” will be used.

OnDelete string

One of “drain” or “cancel”. Specifies behavior of deletion during pulumi destroy. See above note.

Parameters Dictionary<string, object>

Key/Value pairs to be passed to the Dataflow job (as used in the template).

Project string

The project in which the resource belongs. If it is not provided, the provider project is used.

Region string

The region in which the created job should run.

ServiceAccountEmail string

The Service Account email used to create the job.

State string

The current state of the resource, selected from the JobState enum

Subnetwork string

The subnetwork to which VMs will be assigned. Should be of the form “regions/REGION/subnetworks/SUBNETWORK”.

TempGcsLocation string

A writeable location on GCS for the Dataflow job to dump its temporary data.

TemplateGcsPath string

The GCS path to the Dataflow job template.

Type string

The type of this job, selected from the JobType enum

Zone string

The zone in which the created job should run. If it is not provided, the provider zone is used.

AdditionalExperiments []string

List of experiments that should be used by the job. An example value is ["enable_stackdriver_agent_metrics"].

IpConfiguration string

The configuration for VM IPs. Options are "WORKER_IP_PUBLIC" or "WORKER_IP_PRIVATE".

JobId string

The unique ID of this job.

Labels map[string]interface{}

User labels to be specified for the job. Keys and values should follow the restrictions specified in the labeling restrictions page. NOTE: Google-provided Dataflow templates often provide default labels that begin with goog-dataflow-provided. Unless explicitly set in config, these labels will be ignored to prevent diffs on re-apply.

MachineType string

The machine type to use for the job.

MaxWorkers int

The number of workers permitted to work on the job. More workers may improve processing speed at additional cost.

Name string

A unique name for the resource, required by Dataflow.

Network string

The network to which VMs will be assigned. If it is not provided, “default” will be used.

OnDelete string

One of “drain” or “cancel”. Specifies behavior of deletion during pulumi destroy. See above note.

Parameters map[string]interface{}

Key/Value pairs to be passed to the Dataflow job (as used in the template).

Project string

The project in which the resource belongs. If it is not provided, the provider project is used.

Region string

The region in which the created job should run.

ServiceAccountEmail string

The Service Account email used to create the job.

State string

The current state of the resource, selected from the JobState enum

Subnetwork string

The subnetwork to which VMs will be assigned. Should be of the form “regions/REGION/subnetworks/SUBNETWORK”.

TempGcsLocation string

A writeable location on GCS for the Dataflow job to dump its temporary data.

TemplateGcsPath string

The GCS path to the Dataflow job template.

Type string

The type of this job, selected from the JobType enum

Zone string

The zone in which the created job should run. If it is not provided, the provider zone is used.

additionalExperiments string[]

List of experiments that should be used by the job. An example value is ["enable_stackdriver_agent_metrics"].

ipConfiguration string

The configuration for VM IPs. Options are "WORKER_IP_PUBLIC" or "WORKER_IP_PRIVATE".

jobId string

The unique ID of this job.

labels {[key: string]: any}

User labels to be specified for the job. Keys and values should follow the restrictions specified in the labeling restrictions page. NOTE: Google-provided Dataflow templates often provide default labels that begin with goog-dataflow-provided. Unless explicitly set in config, these labels will be ignored to prevent diffs on re-apply.

machineType string

The machine type to use for the job.

maxWorkers number

The number of workers permitted to work on the job. More workers may improve processing speed at additional cost.

name string

A unique name for the resource, required by Dataflow.

network string

The network to which VMs will be assigned. If it is not provided, “default” will be used.

onDelete string

One of “drain” or “cancel”. Specifies behavior of deletion during pulumi destroy. See above note.

parameters {[key: string]: any}

Key/Value pairs to be passed to the Dataflow job (as used in the template).

project string

The project in which the resource belongs. If it is not provided, the provider project is used.

region string

The region in which the created job should run.

serviceAccountEmail string

The Service Account email used to create the job.

state string

The current state of the resource, selected from the JobState enum

subnetwork string

The subnetwork to which VMs will be assigned. Should be of the form “regions/REGION/subnetworks/SUBNETWORK”.

tempGcsLocation string

A writeable location on GCS for the Dataflow job to dump its temporary data.

templateGcsPath string

The GCS path to the Dataflow job template.

type string

The type of this job, selected from the JobType enum

zone string

The zone in which the created job should run. If it is not provided, the provider zone is used.

additional_experiments List[str]

List of experiments that should be used by the job. An example value is ["enable_stackdriver_agent_metrics"].

ip_configuration str

The configuration for VM IPs. Options are "WORKER_IP_PUBLIC" or "WORKER_IP_PRIVATE".

job_id str

The unique ID of this job.

labels Dict[str, Any]

User labels to be specified for the job. Keys and values should follow the restrictions specified in the labeling restrictions page. NOTE: Google-provided Dataflow templates often provide default labels that begin with goog-dataflow-provided. Unless explicitly set in config, these labels will be ignored to prevent diffs on re-apply.

machine_type str

The machine type to use for the job.

max_workers float

The number of workers permitted to work on the job. More workers may improve processing speed at additional cost.

name str

A unique name for the resource, required by Dataflow.

network str

The network to which VMs will be assigned. If it is not provided, “default” will be used.

on_delete str

One of “drain” or “cancel”. Specifies behavior of deletion during pulumi destroy. See above note.

parameters Dict[str, Any]

Key/Value pairs to be passed to the Dataflow job (as used in the template).

project str

The project in which the resource belongs. If it is not provided, the provider project is used.

region str

The region in which the created job should run.

service_account_email str

The Service Account email used to create the job.

state str

The current state of the resource, selected from the JobState enum

subnetwork str

The subnetwork to which VMs will be assigned. Should be of the form “regions/REGION/subnetworks/SUBNETWORK”.

temp_gcs_location str

A writeable location on GCS for the Dataflow job to dump its temporary data.

template_gcs_path str

The GCS path to the Dataflow job template.

type str

The type of this job, selected from the JobType enum

zone str

The zone in which the created job should run. If it is not provided, the provider zone is used.

Package Details

Repository
https://github.com/pulumi/pulumi-gcp
License
Apache-2.0
Notes
This Pulumi package is based on the google-beta Terraform Provider.