Job
Creates a job on Dataflow, which is an implementation of Apache Beam running on Google Compute Engine. For more information, see the official documentation for Beam and Dataflow.
Note on “destroy” / “apply”
There are many types of Dataflow jobs. Some run constantly, reading new data from a source such as a GCS bucket and emitting output continuously; others process a fixed amount of data and then terminate. Any job can fail while running due to programming errors or other issues. In this way, Dataflow jobs differ from most other Google resources.
The Dataflow resource is considered ‘existing’ while it is in a nonterminal state. If it reaches a terminal state (e.g. ‘FAILED’, ‘COMPLETE’, ‘CANCELLED’), it will be recreated on the next ‘apply’. This is expected for jobs that run continuously, but may surprise users who use this resource for other kinds of Dataflow jobs.
A Dataflow job which is ‘destroyed’ may be either “cancelled” or “drained”. If cancelled, the job terminates immediately: any data already written remains where it is, but no further data is processed. If drained, no new data enters the pipeline, but data already in the pipeline finishes processing. The default is “cancel”; if you set on_delete to "drain" in the configuration, pulumi destroy may take a long time to complete while the pipeline drains.
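For example, a minimal TypeScript sketch of a templated job that drains rather than cancels on destroy (the bucket and template paths are placeholders):

```typescript
import * as gcp from "@pulumi/gcp";

// A templated Dataflow job; with onDelete set to "drain", `pulumi destroy`
// waits for in-flight data to finish processing before the job stops.
const bigDataJob = new gcp.dataflow.Job("big-data-job", {
    templateGcsPath: "gs://my-bucket/templates/template_file", // placeholder
    tempGcsLocation: "gs://my-bucket/tmp_dir",                 // placeholder
    onDelete: "drain",
});
```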
Create a Job Resource
TypeScript
```typescript
new Job(name: string, args: JobArgs, opts?: CustomResourceOptions);
```
Python
```python
def Job(resource_name, opts=None, additional_experiments=None, ip_configuration=None, labels=None, machine_type=None, max_workers=None, name=None, network=None, on_delete=None, parameters=None, project=None, region=None, service_account_email=None, subnetwork=None, temp_gcs_location=None, template_gcs_path=None, zone=None, __props__=None)
```
Go
```go
func NewJob(ctx *pulumi.Context, name string, args *JobArgs, opts ...pulumi.ResourceOption) (*Job, error)
```
C#
```csharp
public Job(string name, JobArgs args, CustomResourceOptions? opts = null)
```
TypeScript
- name string
- The unique name of the resource.
- args JobArgs
- The arguments to resource properties.
- opts CustomResourceOptions
- Bag of options to control resource's behavior.
Python
- resource_name str
- The unique name of the resource.
- opts ResourceOptions
- A bag of options that control this resource's behavior.
Go
- ctx Context
- Context object for the current deployment.
- name string
- The unique name of the resource.
- args JobArgs
- The arguments to resource properties.
- opts ResourceOption
- Bag of options to control resource's behavior.
C#
- name string
- The unique name of the resource.
- args JobArgs
- The arguments to resource properties.
- opts CustomResourceOptions
- Bag of options to control resource's behavior.
Job Resource Properties
To learn more about resource properties and how to use them, see Inputs and Outputs in the Programming Model docs.
Inputs
The Job resource accepts the following input properties (a usage sketch follows the list):
C#
- TempGcsLocation string
A writeable location on GCS for the Dataflow job to dump its temporary data.
- TemplateGcsPath string
The GCS path to the Dataflow job template.
- AdditionalExperiments List<string>
List of experiments that should be used by the job. An example value is ["enable_stackdriver_agent_metrics"].
- IpConfiguration string
The configuration for VM IPs. Options are "WORKER_IP_PUBLIC" or "WORKER_IP_PRIVATE".
- Labels Dictionary<string, object>
User labels to be specified for the job. Keys and values should follow the restrictions specified in the labeling restrictions page. NOTE: Google-provided Dataflow templates often provide default labels that begin with goog-dataflow-provided. Unless explicitly set in config, these labels will be ignored to prevent diffs on re-apply.
- MachineType string
The machine type to use for the job.
- MaxWorkers int
The number of workers permitted to work on the job. More workers may improve processing speed at additional cost.
- Name string
A unique name for the resource, required by Dataflow.
- Network string
The network to which VMs will be assigned. If it is not provided, “default” will be used.
- OnDelete string
One of “drain” or “cancel”. Specifies behavior of deletion during pulumi destroy. See above note.
- Parameters Dictionary<string, object>
Key/value pairs to be passed to the Dataflow job (as used in the template).
- Project string
The project in which the resource belongs. If it is not provided, the provider project is used.
- Region string
The region in which the created job should run.
- ServiceAccountEmail string
The Service Account email used to create the job.
- Subnetwork string
The subnetwork to which VMs will be assigned. Should be of the form “regions/REGION/subnetworks/SUBNETWORK”.
- Zone string
The zone in which the created job should run. If it is not provided, the provider zone is used.
Go
- TempGcsLocation string
A writeable location on GCS for the Dataflow job to dump its temporary data.
- TemplateGcsPath string
The GCS path to the Dataflow job template.
- AdditionalExperiments []string
List of experiments that should be used by the job. An example value is ["enable_stackdriver_agent_metrics"].
- IpConfiguration string
The configuration for VM IPs. Options are "WORKER_IP_PUBLIC" or "WORKER_IP_PRIVATE".
- Labels map[string]interface{}
User labels to be specified for the job. Keys and values should follow the restrictions specified in the labeling restrictions page. NOTE: Google-provided Dataflow templates often provide default labels that begin with goog-dataflow-provided. Unless explicitly set in config, these labels will be ignored to prevent diffs on re-apply.
- MachineType string
The machine type to use for the job.
- MaxWorkers int
The number of workers permitted to work on the job. More workers may improve processing speed at additional cost.
- Name string
A unique name for the resource, required by Dataflow.
- Network string
The network to which VMs will be assigned. If it is not provided, “default” will be used.
- OnDelete string
One of “drain” or “cancel”. Specifies behavior of deletion during pulumi destroy. See above note.
- Parameters map[string]interface{}
Key/value pairs to be passed to the Dataflow job (as used in the template).
- Project string
The project in which the resource belongs. If it is not provided, the provider project is used.
- Region string
The region in which the created job should run.
- ServiceAccountEmail string
The Service Account email used to create the job.
- Subnetwork string
The subnetwork to which VMs will be assigned. Should be of the form “regions/REGION/subnetworks/SUBNETWORK”.
- Zone string
The zone in which the created job should run. If it is not provided, the provider zone is used.
TypeScript
- tempGcsLocation string
A writeable location on GCS for the Dataflow job to dump its temporary data.
- templateGcsPath string
The GCS path to the Dataflow job template.
- additionalExperiments string[]
List of experiments that should be used by the job. An example value is ["enable_stackdriver_agent_metrics"].
- ipConfiguration string
The configuration for VM IPs. Options are "WORKER_IP_PUBLIC" or "WORKER_IP_PRIVATE".
- labels {[key: string]: any}
User labels to be specified for the job. Keys and values should follow the restrictions specified in the labeling restrictions page. NOTE: Google-provided Dataflow templates often provide default labels that begin with goog-dataflow-provided. Unless explicitly set in config, these labels will be ignored to prevent diffs on re-apply.
- machineType string
The machine type to use for the job.
- maxWorkers number
The number of workers permitted to work on the job. More workers may improve processing speed at additional cost.
- name string
A unique name for the resource, required by Dataflow.
- network string
The network to which VMs will be assigned. If it is not provided, “default” will be used.
- onDelete string
One of “drain” or “cancel”. Specifies behavior of deletion during pulumi destroy. See above note.
- parameters {[key: string]: any}
Key/value pairs to be passed to the Dataflow job (as used in the template).
- project string
The project in which the resource belongs. If it is not provided, the provider project is used.
- region string
The region in which the created job should run.
- serviceAccountEmail string
The Service Account email used to create the job.
- subnetwork string
The subnetwork to which VMs will be assigned. Should be of the form “regions/REGION/subnetworks/SUBNETWORK”.
- zone string
The zone in which the created job should run. If it is not provided, the provider zone is used.
Python
- temp_gcs_location str
A writeable location on GCS for the Dataflow job to dump its temporary data.
- template_gcs_path str
The GCS path to the Dataflow job template.
- additional_experiments List[str]
List of experiments that should be used by the job. An example value is ["enable_stackdriver_agent_metrics"].
- ip_configuration str
The configuration for VM IPs. Options are "WORKER_IP_PUBLIC" or "WORKER_IP_PRIVATE".
- labels Dict[str, Any]
User labels to be specified for the job. Keys and values should follow the restrictions specified in the labeling restrictions page. NOTE: Google-provided Dataflow templates often provide default labels that begin with goog-dataflow-provided. Unless explicitly set in config, these labels will be ignored to prevent diffs on re-apply.
- machine_type str
The machine type to use for the job.
- max_workers float
The number of workers permitted to work on the job. More workers may improve processing speed at additional cost.
- name str
A unique name for the resource, required by Dataflow.
- network str
The network to which VMs will be assigned. If it is not provided, “default” will be used.
- on_delete str
One of “drain” or “cancel”. Specifies behavior of deletion during pulumi destroy. See above note.
- parameters Dict[str, Any]
Key/value pairs to be passed to the Dataflow job (as used in the template).
- project str
The project in which the resource belongs. If it is not provided, the provider project is used.
- region str
The region in which the created job should run.
- service_account_email str
The Service Account email used to create the job.
- subnetwork str
The subnetwork to which VMs will be assigned. Should be of the form “regions/REGION/subnetworks/SUBNETWORK”.
- zone str
The zone in which the created job should run. If it is not provided, the provider zone is used.
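As a sketch of how these inputs fit together, the following runs the Google-provided Word Count template; the output bucket and temp location are placeholders, and the parameter names (inputFile, output) are those this particular template expects:

```typescript
import * as gcp from "@pulumi/gcp";

// Run the Google-provided Word Count template with explicit worker settings.
const wordCount = new gcp.dataflow.Job("word-count", {
    templateGcsPath: "gs://dataflow-templates/latest/Word_Count",
    tempGcsLocation: "gs://my-bucket/tmp_dir", // placeholder bucket
    parameters: {
        inputFile: "gs://dataflow-samples/shakespeare/kinglear.txt",
        output: "gs://my-bucket/output/wordcount", // placeholder bucket
    },
    maxWorkers: 2,
    machineType: "n1-standard-1",
    labels: { env: "dev" },
    region: "us-central1",
});
```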
Outputs
All input properties are implicitly available as output properties. Additionally, the Job resource produces the following output properties (a short export sketch follows the list):
C#
- Id string
The provider-assigned unique ID for this managed resource.
- JobId string
The unique ID of this job.
- State string
The current state of the resource, selected from the JobState enum.
- Type string
The type of this job, selected from the JobType enum.
Go
- Id string
The provider-assigned unique ID for this managed resource.
- JobId string
The unique ID of this job.
- State string
The current state of the resource, selected from the JobState enum.
- Type string
The type of this job, selected from the JobType enum.
TypeScript
- id string
The provider-assigned unique ID for this managed resource.
- jobId string
The unique ID of this job.
- state string
The current state of the resource, selected from the JobState enum.
- type string
The type of this job, selected from the JobType enum.
Python
- id str
The provider-assigned unique ID for this managed resource.
- job_id str
The unique ID of this job.
- state str
The current state of the resource, selected from the JobState enum.
- type str
The type of this job, selected from the JobType enum.
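Outputs can be read like any other resource property. A small self-contained sketch (paths are placeholders):

```typescript
import * as gcp from "@pulumi/gcp";

const job = new gcp.dataflow.Job("example", {
    templateGcsPath: "gs://my-bucket/templates/template_file", // placeholder
    tempGcsLocation: "gs://my-bucket/tmp_dir",                 // placeholder
});

// Export the Dataflow-assigned job ID and current state for inspection.
export const jobId = job.jobId;
export const jobState = job.state; // e.g. "JOB_STATE_RUNNING"
```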
Look up an Existing Job Resource
Get an existing Job resource’s state with the given name, ID, and optional extra properties used to qualify the lookup.
TypeScript
```typescript
public static get(name: string, id: Input<ID>, state?: JobState, opts?: CustomResourceOptions): Job
```
Python
```python
static get(resource_name, id, opts=None, additional_experiments=None, ip_configuration=None, job_id=None, labels=None, machine_type=None, max_workers=None, name=None, network=None, on_delete=None, parameters=None, project=None, region=None, service_account_email=None, state=None, subnetwork=None, temp_gcs_location=None, template_gcs_path=None, type=None, zone=None, __props__=None)
```
Go
```go
func GetJob(ctx *pulumi.Context, name string, id pulumi.IDInput, state *JobState, opts ...pulumi.ResourceOption) (*Job, error)
```
C#
```csharp
public static Job Get(string name, Input<string> id, JobState? state, CustomResourceOptions? opts = null)
```
TypeScript
- name
- The unique name of the resulting resource.
- id
- The unique provider ID of the resource to lookup.
- state
- Any extra arguments used during the lookup.
- opts
- A bag of options that control this resource's behavior.
Python
- resource_name
- The unique name of the resulting resource.
- id
- The unique provider ID of the resource to lookup.
Go
- name
- The unique name of the resulting resource.
- id
- The unique provider ID of the resource to lookup.
- state
- Any extra arguments used during the lookup.
- opts
- A bag of options that control this resource's behavior.
C#
- name
- The unique name of the resulting resource.
- id
- The unique provider ID of the resource to lookup.
- state
- Any extra arguments used during the lookup.
- opts
- A bag of options that control this resource's behavior.
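For instance, a TypeScript sketch of adopting an existing job by its provider ID ("my-job-id" is a placeholder):

```typescript
import * as gcp from "@pulumi/gcp";

// Look up an existing Dataflow job by ID; no new job is created.
const existing = gcp.dataflow.Job.get("existing-job", "my-job-id");

export const existingJobState = existing.state;
```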
The following state arguments are supported:
C#
- AdditionalExperiments List<string>
List of experiments that should be used by the job. An example value is ["enable_stackdriver_agent_metrics"].
- IpConfiguration string
The configuration for VM IPs. Options are "WORKER_IP_PUBLIC" or "WORKER_IP_PRIVATE".
- JobId string
The unique ID of this job.
- Labels Dictionary<string, object>
User labels to be specified for the job. Keys and values should follow the restrictions specified in the labeling restrictions page. NOTE: Google-provided Dataflow templates often provide default labels that begin with goog-dataflow-provided. Unless explicitly set in config, these labels will be ignored to prevent diffs on re-apply.
- MachineType string
The machine type to use for the job.
- MaxWorkers int
The number of workers permitted to work on the job. More workers may improve processing speed at additional cost.
- Name string
A unique name for the resource, required by Dataflow.
- Network string
The network to which VMs will be assigned. If it is not provided, “default” will be used.
- OnDelete string
One of “drain” or “cancel”. Specifies behavior of deletion during pulumi destroy. See above note.
- Parameters Dictionary<string, object>
Key/value pairs to be passed to the Dataflow job (as used in the template).
- Project string
The project in which the resource belongs. If it is not provided, the provider project is used.
- Region string
The region in which the created job should run.
- ServiceAccountEmail string
The Service Account email used to create the job.
- State string
The current state of the resource, selected from the JobState enum.
- Subnetwork string
The subnetwork to which VMs will be assigned. Should be of the form “regions/REGION/subnetworks/SUBNETWORK”.
- TempGcsLocation string
A writeable location on GCS for the Dataflow job to dump its temporary data.
- TemplateGcsPath string
The GCS path to the Dataflow job template.
- Type string
The type of this job, selected from the JobType enum.
- Zone string
The zone in which the created job should run. If it is not provided, the provider zone is used.
Go
- AdditionalExperiments []string
List of experiments that should be used by the job. An example value is ["enable_stackdriver_agent_metrics"].
- IpConfiguration string
The configuration for VM IPs. Options are "WORKER_IP_PUBLIC" or "WORKER_IP_PRIVATE".
- JobId string
The unique ID of this job.
- Labels map[string]interface{}
User labels to be specified for the job. Keys and values should follow the restrictions specified in the labeling restrictions page. NOTE: Google-provided Dataflow templates often provide default labels that begin with goog-dataflow-provided. Unless explicitly set in config, these labels will be ignored to prevent diffs on re-apply.
- MachineType string
The machine type to use for the job.
- MaxWorkers int
The number of workers permitted to work on the job. More workers may improve processing speed at additional cost.
- Name string
A unique name for the resource, required by Dataflow.
- Network string
The network to which VMs will be assigned. If it is not provided, “default” will be used.
- OnDelete string
One of “drain” or “cancel”. Specifies behavior of deletion during pulumi destroy. See above note.
- Parameters map[string]interface{}
Key/value pairs to be passed to the Dataflow job (as used in the template).
- Project string
The project in which the resource belongs. If it is not provided, the provider project is used.
- Region string
The region in which the created job should run.
- ServiceAccountEmail string
The Service Account email used to create the job.
- State string
The current state of the resource, selected from the JobState enum.
- Subnetwork string
The subnetwork to which VMs will be assigned. Should be of the form “regions/REGION/subnetworks/SUBNETWORK”.
- TempGcsLocation string
A writeable location on GCS for the Dataflow job to dump its temporary data.
- TemplateGcsPath string
The GCS path to the Dataflow job template.
- Type string
The type of this job, selected from the JobType enum.
- Zone string
The zone in which the created job should run. If it is not provided, the provider zone is used.
TypeScript
- additionalExperiments string[]
List of experiments that should be used by the job. An example value is ["enable_stackdriver_agent_metrics"].
- ipConfiguration string
The configuration for VM IPs. Options are "WORKER_IP_PUBLIC" or "WORKER_IP_PRIVATE".
- jobId string
The unique ID of this job.
- labels {[key: string]: any}
User labels to be specified for the job. Keys and values should follow the restrictions specified in the labeling restrictions page. NOTE: Google-provided Dataflow templates often provide default labels that begin with goog-dataflow-provided. Unless explicitly set in config, these labels will be ignored to prevent diffs on re-apply.
- machineType string
The machine type to use for the job.
- maxWorkers number
The number of workers permitted to work on the job. More workers may improve processing speed at additional cost.
- name string
A unique name for the resource, required by Dataflow.
- network string
The network to which VMs will be assigned. If it is not provided, “default” will be used.
- onDelete string
One of “drain” or “cancel”. Specifies behavior of deletion during pulumi destroy. See above note.
- parameters {[key: string]: any}
Key/value pairs to be passed to the Dataflow job (as used in the template).
- project string
The project in which the resource belongs. If it is not provided, the provider project is used.
- region string
The region in which the created job should run.
- serviceAccountEmail string
The Service Account email used to create the job.
- state string
The current state of the resource, selected from the JobState enum.
- subnetwork string
The subnetwork to which VMs will be assigned. Should be of the form “regions/REGION/subnetworks/SUBNETWORK”.
- tempGcsLocation string
A writeable location on GCS for the Dataflow job to dump its temporary data.
- templateGcsPath string
The GCS path to the Dataflow job template.
- type string
The type of this job, selected from the JobType enum.
- zone string
The zone in which the created job should run. If it is not provided, the provider zone is used.
Python
- additional_experiments List[str]
List of experiments that should be used by the job. An example value is ["enable_stackdriver_agent_metrics"].
- ip_configuration str
The configuration for VM IPs. Options are "WORKER_IP_PUBLIC" or "WORKER_IP_PRIVATE".
- job_id str
The unique ID of this job.
- labels Dict[str, Any]
User labels to be specified for the job. Keys and values should follow the restrictions specified in the labeling restrictions page. NOTE: Google-provided Dataflow templates often provide default labels that begin with goog-dataflow-provided. Unless explicitly set in config, these labels will be ignored to prevent diffs on re-apply.
- machine_type str
The machine type to use for the job.
- max_workers float
The number of workers permitted to work on the job. More workers may improve processing speed at additional cost.
- name str
A unique name for the resource, required by Dataflow.
- network str
The network to which VMs will be assigned. If it is not provided, “default” will be used.
- on_delete str
One of “drain” or “cancel”. Specifies behavior of deletion during pulumi destroy. See above note.
- parameters Dict[str, Any]
Key/value pairs to be passed to the Dataflow job (as used in the template).
- project str
The project in which the resource belongs. If it is not provided, the provider project is used.
- region str
The region in which the created job should run.
- service_account_email str
The Service Account email used to create the job.
- state str
The current state of the resource, selected from the JobState enum.
- subnetwork str
The subnetwork to which VMs will be assigned. Should be of the form “regions/REGION/subnetworks/SUBNETWORK”.
- temp_gcs_location str
A writeable location on GCS for the Dataflow job to dump its temporary data.
- template_gcs_path str
The GCS path to the Dataflow job template.
- type str
The type of this job, selected from the JobType enum.
- zone str
The zone in which the created job should run. If it is not provided, the provider zone is used.
Package Details
- Repository
- https://github.com/pulumi/pulumi-gcp
- License
- Apache-2.0
- Notes
- This Pulumi package is based on the google-beta Terraform Provider.