ProjectSink
Manages a project-level logging sink. For more information see the official documentation, Exporting Logs, and the API.
Note: You must have granted the “Logs Configuration Writer” IAM role (roles/logging.configWriter) to the credentials used with this provider.
Note: You must enable the Cloud Resource Manager API.
Create a ProjectSink Resource
new ProjectSink(name: string, args: ProjectSinkArgs, opts?: CustomResourceOptions);
def ProjectSink(resource_name, opts=None, bigquery_options=None, destination=None, filter=None, name=None, project=None, unique_writer_identity=None, __props__=None);
func NewProjectSink(ctx *Context, name string, args ProjectSinkArgs, opts ...ResourceOption) (*ProjectSink, error)
public ProjectSink(string name, ProjectSinkArgs args, CustomResourceOptions? opts = null)
- name string
- The unique name of the resource.
- args ProjectSinkArgs
- The arguments to resource properties.
- opts CustomResourceOptions
- Bag of options to control resource's behavior.
- resource_name str
- The unique name of the resource.
- opts ResourceOptions
- A bag of options that control this resource's behavior.
- ctx Context
- Context object for the current deployment.
- name string
- The unique name of the resource.
- args ProjectSinkArgs
- The arguments to resource properties.
- opts ResourceOption
- Bag of options to control resource's behavior.
- name string
- The unique name of the resource.
- args ProjectSinkArgs
- The arguments to resource properties.
- opts CustomResourceOptions
- Bag of options to control resource's behavior.
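As a minimal sketch, a sink can be created like any other Pulumi resource. The sink name, filter, and bucket below are placeholders, not values from this page:

```typescript
import * as gcp from "@pulumi/gcp";

// Export WARNING-and-above Compute Engine log entries to a
// (pre-existing, hypothetical) Cloud Storage bucket.
const mySink = new gcp.logging.ProjectSink("my-sink", {
    filter: "resource.type = gce_instance AND severity >= WARNING",
    destination: "storage.googleapis.com/my-log-bucket",
    // Create a dedicated service account for this sink.
    uniqueWriterIdentity: true,
});
```

With `uniqueWriterIdentity: true`, the resulting service account still needs to be granted write access to the bucket separately.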
ProjectSink Resource Properties
To learn more about resource properties and how to use them, see Inputs and Outputs in the Programming Model docs.
Inputs
The ProjectSink resource accepts the following input properties:
- Destination string
The destination of the sink (or, in other words, where logs are written to). Can be a Cloud Storage bucket, a PubSub topic, or a BigQuery dataset. Examples:
storage.googleapis.com/[GCS_BUCKET]
bigquery.googleapis.com/projects/[PROJECT_ID]/datasets/[DATASET]
pubsub.googleapis.com/projects/[PROJECT_ID]/topics/[TOPIC_ID]
The writer associated with the sink must have access to write to the above resource.
- BigqueryOptions ProjectSinkBigqueryOptionsArgs
Options that affect sinks exporting data to BigQuery. Structure documented below.
- Filter string
The filter to apply when exporting logs. Only log entries that match the filter are exported. See Advanced Log Filters for information on how to write a filter.
- Name string
The name of the logging sink.
- Project string
The ID of the project to create the sink in. If omitted, the project associated with the provider is used.
- UniqueWriterIdentity bool
Whether or not to create a unique identity associated with this sink. If false (the default), then the writer_identity used is serviceAccount:cloud-logs@system.gserviceaccount.com. If true, then a unique service account is created and used for this sink. If you wish to publish logs across projects, you must set unique_writer_identity to true.
- Destination string
The destination of the sink (or, in other words, where logs are written to). Can be a Cloud Storage bucket, a PubSub topic, or a BigQuery dataset. Examples:
storage.googleapis.com/[GCS_BUCKET]
bigquery.googleapis.com/projects/[PROJECT_ID]/datasets/[DATASET]
pubsub.googleapis.com/projects/[PROJECT_ID]/topics/[TOPIC_ID]
The writer associated with the sink must have access to write to the above resource.
- BigqueryOptions ProjectSinkBigqueryOptions
Options that affect sinks exporting data to BigQuery. Structure documented below.
- Filter string
The filter to apply when exporting logs. Only log entries that match the filter are exported. See Advanced Log Filters for information on how to write a filter.
- Name string
The name of the logging sink.
- Project string
The ID of the project to create the sink in. If omitted, the project associated with the provider is used.
- UniqueWriterIdentity bool
Whether or not to create a unique identity associated with this sink. If false (the default), then the writer_identity used is serviceAccount:cloud-logs@system.gserviceaccount.com. If true, then a unique service account is created and used for this sink. If you wish to publish logs across projects, you must set unique_writer_identity to true.
- destination string
The destination of the sink (or, in other words, where logs are written to). Can be a Cloud Storage bucket, a PubSub topic, or a BigQuery dataset. Examples:
storage.googleapis.com/[GCS_BUCKET]
bigquery.googleapis.com/projects/[PROJECT_ID]/datasets/[DATASET]
pubsub.googleapis.com/projects/[PROJECT_ID]/topics/[TOPIC_ID]
The writer associated with the sink must have access to write to the above resource.
- bigqueryOptions ProjectSinkBigqueryOptions
Options that affect sinks exporting data to BigQuery. Structure documented below.
- filter string
The filter to apply when exporting logs. Only log entries that match the filter are exported. See Advanced Log Filters for information on how to write a filter.
- name string
The name of the logging sink.
- project string
The ID of the project to create the sink in. If omitted, the project associated with the provider is used.
- uniqueWriterIdentity boolean
Whether or not to create a unique identity associated with this sink. If false (the default), then the writer_identity used is serviceAccount:cloud-logs@system.gserviceaccount.com. If true, then a unique service account is created and used for this sink. If you wish to publish logs across projects, you must set unique_writer_identity to true.
- destination str
The destination of the sink (or, in other words, where logs are written to). Can be a Cloud Storage bucket, a PubSub topic, or a BigQuery dataset. Examples:
storage.googleapis.com/[GCS_BUCKET]
bigquery.googleapis.com/projects/[PROJECT_ID]/datasets/[DATASET]
pubsub.googleapis.com/projects/[PROJECT_ID]/topics/[TOPIC_ID]
The writer associated with the sink must have access to write to the above resource.
- bigquery_options Dict[ProjectSinkBigqueryOptions]
Options that affect sinks exporting data to BigQuery. Structure documented below.
- filter str
The filter to apply when exporting logs. Only log entries that match the filter are exported. See Advanced Log Filters for information on how to write a filter.
- name str
The name of the logging sink.
- project str
The ID of the project to create the sink in. If omitted, the project associated with the provider is used.
- unique_writer_identity bool
Whether or not to create a unique identity associated with this sink. If false (the default), then the writer_identity used is serviceAccount:cloud-logs@system.gserviceaccount.com. If true, then a unique service account is created and used for this sink. If you wish to publish logs across projects, you must set unique_writer_identity to true.
Outputs
All input properties are implicitly available as output properties. Additionally, the ProjectSink resource produces the following output properties:
- Id string
- The provider-assigned unique ID for this managed resource.
- WriterIdentity string
The identity associated with this sink. This identity must be granted write access to the configured destination.
- Id string
- The provider-assigned unique ID for this managed resource.
- WriterIdentity string
The identity associated with this sink. This identity must be granted write access to the configured destination.
- id string
- The provider-assigned unique ID for this managed resource.
- writerIdentity string
The identity associated with this sink. This identity must be granted write access to the configured destination.
- id str
- The provider-assigned unique ID for this managed resource.
- writer_identity str
The identity associated with this sink. This identity must be granted write access to the configured destination.
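Because the writer identity needs explicit write access to the destination, a common pattern is to feed the sink's writerIdentity output into an IAM grant. A sketch with a Cloud Storage destination (all resource names below are placeholders):

```typescript
import * as pulumi from "@pulumi/pulumi";
import * as gcp from "@pulumi/gcp";

// A hypothetical destination bucket and a sink that writes to it.
const logBucket = new gcp.storage.Bucket("log-bucket", { location: "US" });
const mySink = new gcp.logging.ProjectSink("my-sink", {
    destination: pulumi.interpolate`storage.googleapis.com/${logBucket.name}`,
    uniqueWriterIdentity: true,
});

// Grant the sink's writer identity permission to create objects
// in the destination bucket.
const writerGrant = new gcp.storage.BucketIAMMember("log-writer", {
    bucket: logBucket.name,
    role: "roles/storage.objectCreator",
    member: mySink.writerIdentity,
});
```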
Look up an Existing ProjectSink Resource
Get an existing ProjectSink resource’s state with the given name, ID, and optional extra properties used to qualify the lookup.
public static get(name: string, id: Input<ID>, state?: ProjectSinkState, opts?: CustomResourceOptions): ProjectSink
static get(resource_name, id, opts=None, bigquery_options=None, destination=None, filter=None, name=None, project=None, unique_writer_identity=None, writer_identity=None, __props__=None);
func GetProjectSink(ctx *Context, name string, id IDInput, state *ProjectSinkState, opts ...ResourceOption) (*ProjectSink, error)
public static ProjectSink Get(string name, Input<string> id, ProjectSinkState? state, CustomResourceOptions? opts = null)
- name
- The unique name of the resulting resource.
- id
- The unique provider ID of the resource to lookup.
- state
- Any extra arguments used during the lookup.
- opts
- A bag of options that control this resource's behavior.
- resource_name
- The unique name of the resulting resource.
- id
- The unique provider ID of the resource to lookup.
- name
- The unique name of the resulting resource.
- id
- The unique provider ID of the resource to lookup.
- state
- Any extra arguments used during the lookup.
- opts
- A bag of options that control this resource's behavior.
- name
- The unique name of the resulting resource.
- id
- The unique provider ID of the resource to lookup.
- state
- Any extra arguments used during the lookup.
- opts
- A bag of options that control this resource's behavior.
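As a sketch, an existing sink can be brought into a program via the static get method. The project and sink names here are placeholders, assuming a sink named my-sink already exists in my-project:

```typescript
import * as gcp from "@pulumi/gcp";

// Look up an existing sink by its provider ID. This reads state only;
// it does not create or modify the sink.
const existing = gcp.logging.ProjectSink.get(
    "imported-sink",
    "projects/my-project/sinks/my-sink",
);
```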
The following state arguments are supported:
- BigqueryOptions ProjectSinkBigqueryOptionsArgs
Options that affect sinks exporting data to BigQuery. Structure documented below.
- Destination string
The destination of the sink (or, in other words, where logs are written to). Can be a Cloud Storage bucket, a PubSub topic, or a BigQuery dataset. Examples:
storage.googleapis.com/[GCS_BUCKET]
bigquery.googleapis.com/projects/[PROJECT_ID]/datasets/[DATASET]
pubsub.googleapis.com/projects/[PROJECT_ID]/topics/[TOPIC_ID]
The writer associated with the sink must have access to write to the above resource.
- Filter string
The filter to apply when exporting logs. Only log entries that match the filter are exported. See Advanced Log Filters for information on how to write a filter.
- Name string
The name of the logging sink.
- Project string
The ID of the project to create the sink in. If omitted, the project associated with the provider is used.
- UniqueWriterIdentity bool
Whether or not to create a unique identity associated with this sink. If false (the default), then the writer_identity used is serviceAccount:cloud-logs@system.gserviceaccount.com. If true, then a unique service account is created and used for this sink. If you wish to publish logs across projects, you must set unique_writer_identity to true.
- WriterIdentity string
The identity associated with this sink. This identity must be granted write access to the configured destination.
- BigqueryOptions ProjectSinkBigqueryOptions
Options that affect sinks exporting data to BigQuery. Structure documented below.
- Destination string
The destination of the sink (or, in other words, where logs are written to). Can be a Cloud Storage bucket, a PubSub topic, or a BigQuery dataset. Examples:
storage.googleapis.com/[GCS_BUCKET]
bigquery.googleapis.com/projects/[PROJECT_ID]/datasets/[DATASET]
pubsub.googleapis.com/projects/[PROJECT_ID]/topics/[TOPIC_ID]
The writer associated with the sink must have access to write to the above resource.
- Filter string
The filter to apply when exporting logs. Only log entries that match the filter are exported. See Advanced Log Filters for information on how to write a filter.
- Name string
The name of the logging sink.
- Project string
The ID of the project to create the sink in. If omitted, the project associated with the provider is used.
- UniqueWriterIdentity bool
Whether or not to create a unique identity associated with this sink. If false (the default), then the writer_identity used is serviceAccount:cloud-logs@system.gserviceaccount.com. If true, then a unique service account is created and used for this sink. If you wish to publish logs across projects, you must set unique_writer_identity to true.
- WriterIdentity string
The identity associated with this sink. This identity must be granted write access to the configured destination.
- bigqueryOptions ProjectSinkBigqueryOptions
Options that affect sinks exporting data to BigQuery. Structure documented below.
- destination string
The destination of the sink (or, in other words, where logs are written to). Can be a Cloud Storage bucket, a PubSub topic, or a BigQuery dataset. Examples:
storage.googleapis.com/[GCS_BUCKET]
bigquery.googleapis.com/projects/[PROJECT_ID]/datasets/[DATASET]
pubsub.googleapis.com/projects/[PROJECT_ID]/topics/[TOPIC_ID]
The writer associated with the sink must have access to write to the above resource.
- filter string
The filter to apply when exporting logs. Only log entries that match the filter are exported. See Advanced Log Filters for information on how to write a filter.
- name string
The name of the logging sink.
- project string
The ID of the project to create the sink in. If omitted, the project associated with the provider is used.
- uniqueWriterIdentity boolean
Whether or not to create a unique identity associated with this sink. If false (the default), then the writer_identity used is serviceAccount:cloud-logs@system.gserviceaccount.com. If true, then a unique service account is created and used for this sink. If you wish to publish logs across projects, you must set unique_writer_identity to true.
- writerIdentity string
The identity associated with this sink. This identity must be granted write access to the configured destination.
- bigquery_options Dict[ProjectSinkBigqueryOptions]
Options that affect sinks exporting data to BigQuery. Structure documented below.
- destination str
The destination of the sink (or, in other words, where logs are written to). Can be a Cloud Storage bucket, a PubSub topic, or a BigQuery dataset. Examples:
storage.googleapis.com/[GCS_BUCKET]
bigquery.googleapis.com/projects/[PROJECT_ID]/datasets/[DATASET]
pubsub.googleapis.com/projects/[PROJECT_ID]/topics/[TOPIC_ID]
The writer associated with the sink must have access to write to the above resource.
- filter str
The filter to apply when exporting logs. Only log entries that match the filter are exported. See Advanced Log Filters for information on how to write a filter.
- name str
The name of the logging sink.
- project str
The ID of the project to create the sink in. If omitted, the project associated with the provider is used.
- unique_writer_identity bool
Whether or not to create a unique identity associated with this sink. If false (the default), then the writer_identity used is serviceAccount:cloud-logs@system.gserviceaccount.com. If true, then a unique service account is created and used for this sink. If you wish to publish logs across projects, you must set unique_writer_identity to true.
- writer_identity str
The identity associated with this sink. This identity must be granted write access to the configured destination.
Supporting Types
ProjectSinkBigqueryOptions
- UsePartitionedTables bool
Whether to use BigQuery’s partition tables. By default, Logging creates dated tables based on the log entries’ timestamps, e.g. syslog_20170523. With partitioned tables the date suffix is no longer present and special query syntax has to be used instead. In both cases, tables are sharded based on UTC timezone.
- UsePartitionedTables bool
Whether to use BigQuery’s partition tables. By default, Logging creates dated tables based on the log entries’ timestamps, e.g. syslog_20170523. With partitioned tables the date suffix is no longer present and special query syntax has to be used instead. In both cases, tables are sharded based on UTC timezone.
- usePartitionedTables boolean
Whether to use BigQuery’s partition tables. By default, Logging creates dated tables based on the log entries’ timestamps, e.g. syslog_20170523. With partitioned tables the date suffix is no longer present and special query syntax has to be used instead. In both cases, tables are sharded based on UTC timezone.
- use_partitioned_tables bool
Whether to use BigQuery’s partition tables. By default, Logging creates dated tables based on the log entries’ timestamps, e.g. syslog_20170523. With partitioned tables the date suffix is no longer present and special query syntax has to be used instead. In both cases, tables are sharded based on UTC timezone.
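As a sketch, the options above attach to a BigQuery-destined sink through the bigqueryOptions argument. The project and dataset names are placeholders, assuming the dataset already exists:

```typescript
import * as gcp from "@pulumi/gcp";

// A sink exporting ERROR-and-above entries to a (hypothetical) BigQuery
// dataset, using partitioned tables instead of dated table suffixes.
const bqSink = new gcp.logging.ProjectSink("bq-sink", {
    destination: "bigquery.googleapis.com/projects/my-project/datasets/my_dataset",
    filter: "severity >= ERROR",
    uniqueWriterIdentity: true,
    bigqueryOptions: {
        usePartitionedTables: true,
    },
});
```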
Package Details
- Repository
- https://github.com/pulumi/pulumi-gcp
- License
- Apache-2.0
- Notes
- This Pulumi package is based on the google-beta Terraform Provider.