FolderSink
Manages a folder-level logging sink. For more information see the official documentation and Exporting Logs in the API.
Note that you must have the “Logs Configuration Writer” IAM role (roles/logging.configWriter)
granted to the credentials used with this provider.
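Example Usage
As an illustration only, the following TypeScript sketch creates a folder-level sink that exports matching log entries to a Cloud Storage bucket. The folder ID, bucket name, and filter are placeholder values, not part of this reference.

    import * as pulumi from "@pulumi/pulumi";
    import * as gcp from "@pulumi/gcp";

    // Bucket that will receive the exported log entries (name and location are placeholders).
    const logBucket = new gcp.storage.Bucket("folder-logging-bucket", { location: "US" });

    // Folder-level sink; the folder may be given as "folders/[FOLDER_ID]" or just "[FOLDER_ID]".
    const mySink = new gcp.logging.FolderSink("my-sink", {
        folder: "folders/123456789",
        destination: pulumi.interpolate`storage.googleapis.com/${logBucket.name}`,
        filter: "resource.type = gce_instance AND severity >= WARNING",
    });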
Create a FolderSink Resource
new FolderSink(name: string, args: FolderSinkArgs, opts?: CustomResourceOptions);
def FolderSink(resource_name, opts=None, bigquery_options=None, destination=None, filter=None, folder=None, include_children=None, name=None, __props__=None);
func NewFolderSink(ctx *Context, name string, args FolderSinkArgs, opts ...ResourceOption) (*FolderSink, error)
public FolderSink(string name, FolderSinkArgs args, CustomResourceOptions? opts = null)
- name string
- The unique name of the resource.
- args FolderSinkArgs
- The arguments to resource properties.
- opts CustomResourceOptions
- Bag of options to control resource's behavior.
- resource_name str
- The unique name of the resource.
- opts ResourceOptions
- A bag of options that control this resource's behavior.
- ctx Context
- Context object for the current deployment.
- name string
- The unique name of the resource.
- args FolderSinkArgs
- The arguments to resource properties.
- opts ResourceOption
- Bag of options to control resource's behavior.
- name string
- The unique name of the resource.
- args FolderSinkArgs
- The arguments to resource properties.
- opts CustomResourceOptions
- Bag of options to control resource's behavior.
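For illustration, resource options are passed as the final constructor argument. A minimal TypeScript sketch (placeholder folder and destination) that additionally protects the sink from accidental deletion:

    import * as gcp from "@pulumi/gcp";

    // `protect: true` is a CustomResourceOptions field; it makes Pulumi refuse to delete the sink.
    const protectedSink = new gcp.logging.FolderSink("protected-sink", {
        folder: "folders/123456789",
        destination: "storage.googleapis.com/my-log-bucket",
    }, { protect: true });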
FolderSink Resource Properties
To learn more about resource properties and how to use them, see Inputs and Outputs in the Programming Model docs.
Inputs
The FolderSink resource accepts the following input properties:
- Destination string
The destination of the sink (or, in other words, where logs are written to). Can be a Cloud Storage bucket, a PubSub topic, or a BigQuery dataset. Examples: “storage.googleapis.com/[GCS_BUCKET]” “bigquery.googleapis.com/projects/[PROJECT_ID]/datasets/[DATASET]” “pubsub.googleapis.com/projects/[PROJECT_ID]/topics/[TOPIC_ID]” The writer associated with the sink must have access to write to the above resource.
- Folder string
The folder to be exported to the sink. Note that either [FOLDER_ID] or “folders/[FOLDER_ID]” is accepted.
- BigqueryOptions FolderSinkBigqueryOptionsArgs
Options that affect sinks exporting data to BigQuery. Structure documented below.
- Filter string
The filter to apply when exporting logs. Only log entries that match the filter are exported. See Advanced Log Filters for information on how to write a filter.
- IncludeChildren bool
Whether or not to include children folders in the sink export. If true, logs associated with child projects are also exported; otherwise only logs relating to the provided folder are included.
- Name string
The name of the logging sink.
- Destination string
The destination of the sink (or, in other words, where logs are written to). Can be a Cloud Storage bucket, a PubSub topic, or a BigQuery dataset. Examples: “storage.googleapis.com/[GCS_BUCKET]” “bigquery.googleapis.com/projects/[PROJECT_ID]/datasets/[DATASET]” “pubsub.googleapis.com/projects/[PROJECT_ID]/topics/[TOPIC_ID]” The writer associated with the sink must have access to write to the above resource.
- Folder string
The folder to be exported to the sink. Note that either [FOLDER_ID] or “folders/[FOLDER_ID]” is accepted.
- BigqueryOptions FolderSinkBigqueryOptions
Options that affect sinks exporting data to BigQuery. Structure documented below.
- Filter string
The filter to apply when exporting logs. Only log entries that match the filter are exported. See Advanced Log Filters for information on how to write a filter.
- IncludeChildren bool
Whether or not to include children folders in the sink export. If true, logs associated with child projects are also exported; otherwise only logs relating to the provided folder are included.
- Name string
The name of the logging sink.
- destination string
The destination of the sink (or, in other words, where logs are written to). Can be a Cloud Storage bucket, a PubSub topic, or a BigQuery dataset. Examples: “storage.googleapis.com/[GCS_BUCKET]” “bigquery.googleapis.com/projects/[PROJECT_ID]/datasets/[DATASET]” “pubsub.googleapis.com/projects/[PROJECT_ID]/topics/[TOPIC_ID]” The writer associated with the sink must have access to write to the above resource.
- folder string
The folder to be exported to the sink. Note that either [FOLDER_ID] or “folders/[FOLDER_ID]” is accepted.
- bigqueryOptions FolderSinkBigqueryOptions
Options that affect sinks exporting data to BigQuery. Structure documented below.
- filter string
The filter to apply when exporting logs. Only log entries that match the filter are exported. See Advanced Log Filters for information on how to write a filter.
- includeChildren boolean
Whether or not to include children folders in the sink export. If true, logs associated with child projects are also exported; otherwise only logs relating to the provided folder are included.
- name string
The name of the logging sink.
- destination str
The destination of the sink (or, in other words, where logs are written to). Can be a Cloud Storage bucket, a PubSub topic, or a BigQuery dataset. Examples: “storage.googleapis.com/[GCS_BUCKET]” “bigquery.googleapis.com/projects/[PROJECT_ID]/datasets/[DATASET]” “pubsub.googleapis.com/projects/[PROJECT_ID]/topics/[TOPIC_ID]” The writer associated with the sink must have access to write to the above resource.
- folder str
The folder to be exported to the sink. Note that either [FOLDER_ID] or “folders/[FOLDER_ID]” is accepted.
- bigquery_options Dict[FolderSinkBigqueryOptions]
Options that affect sinks exporting data to BigQuery. Structure documented below.
- filter str
The filter to apply when exporting logs. Only log entries that match the filter are exported. See Advanced Log Filters for information on how to write a filter.
- include_children bool
Whether or not to include children folders in the sink export. If true, logs associated with child projects are also exported; otherwise only logs relating to the provided folder are included.
- name str
The name of the logging sink.
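As a sketch of how these inputs combine (TypeScript; the folder ID and topic are hypothetical), the sink below exports error-level logs from a folder and all of its children to a Pub/Sub topic:

    import * as pulumi from "@pulumi/pulumi";
    import * as gcp from "@pulumi/gcp";

    const logTopic = new gcp.pubsub.Topic("folder-logs");

    const childrenSink = new gcp.logging.FolderSink("children-sink", {
        folder: "folders/123456789",
        // The topic's id has the form projects/[PROJECT_ID]/topics/[TOPIC_ID].
        destination: pulumi.interpolate`pubsub.googleapis.com/${logTopic.id}`,
        filter: "severity >= ERROR",
        includeChildren: true, // also export logs from child projects and folders
    });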
Outputs
All input properties are implicitly available as output properties. Additionally, the FolderSink resource produces the following output properties:
- Id string
- The provider-assigned unique ID for this managed resource.
- WriterIdentity string
The identity associated with this sink. This identity must be granted write access to the configured destination.
- Id string
- The provider-assigned unique ID for this managed resource.
- WriterIdentity string
The identity associated with this sink. This identity must be granted write access to the configured destination.
- id string
- The provider-assigned unique ID for this managed resource.
- writerIdentity string
The identity associated with this sink. This identity must be granted write access to the configured destination.
- id str
- The provider-assigned unique ID for this managed resource.
- writer_identity str
The identity associated with this sink. This identity must be granted write access to the configured destination.
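The writer identity is typically fed into an IAM grant so the sink can actually write to its destination. A minimal TypeScript sketch, assuming a Cloud Storage destination and placeholder names:

    import * as pulumi from "@pulumi/pulumi";
    import * as gcp from "@pulumi/gcp";

    const logBucket = new gcp.storage.Bucket("folder-logging-bucket", { location: "US" });

    const mySink = new gcp.logging.FolderSink("my-sink", {
        folder: "folders/123456789", // placeholder folder ID
        destination: pulumi.interpolate`storage.googleapis.com/${logBucket.name}`,
    });

    // Grant the sink's writer identity permission to create objects in the bucket.
    // writerIdentity is already in the "serviceAccount:..." form that IAM expects.
    const sinkWriter = new gcp.storage.BucketIAMMember("sink-writer", {
        bucket: logBucket.name,
        role: "roles/storage.objectCreator",
        member: mySink.writerIdentity,
    });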
Look up an Existing FolderSink Resource
Get an existing FolderSink resource’s state with the given name, ID, and optional extra properties used to qualify the lookup.
public static get(name: string, id: Input<ID>, state?: FolderSinkState, opts?: CustomResourceOptions): FolderSink
static get(resource_name, id, opts=None, bigquery_options=None, destination=None, filter=None, folder=None, include_children=None, name=None, writer_identity=None, __props__=None);
func GetFolderSink(ctx *Context, name string, id IDInput, state *FolderSinkState, opts ...ResourceOption) (*FolderSink, error)
public static FolderSink Get(string name, Input<string> id, FolderSinkState? state, CustomResourceOptions? opts = null)
- name
- The unique name of the resulting resource.
- id
- The unique provider ID of the resource to lookup.
- state
- Any extra arguments used during the lookup.
- opts
- A bag of options that control this resource's behavior.
- resource_name
- The unique name of the resulting resource.
- id
- The unique provider ID of the resource to lookup.
- name
- The unique name of the resulting resource.
- id
- The unique provider ID of the resource to lookup.
- state
- Any extra arguments used during the lookup.
- opts
- A bag of options that control this resource's behavior.
- name
- The unique name of the resulting resource.
- id
- The unique provider ID of the resource to lookup.
- state
- Any extra arguments used during the lookup.
- opts
- A bag of options that control this resource's behavior.
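For illustration, a TypeScript lookup of an existing sink; the ID below is a hypothetical value in the folders/[FOLDER_ID]/sinks/[SINK_NAME] form used by folder sinks:

    import * as gcp from "@pulumi/gcp";

    // Adopts the existing sink's state instead of creating a new resource.
    const existingSink = gcp.logging.FolderSink.get(
        "existing-sink",
        "folders/123456789/sinks/my-sink", // placeholder provider ID
    );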
The following state arguments are supported:
- BigqueryOptions FolderSinkBigqueryOptionsArgs
Options that affect sinks exporting data to BigQuery. Structure documented below.
- Destination string
The destination of the sink (or, in other words, where logs are written to). Can be a Cloud Storage bucket, a PubSub topic, or a BigQuery dataset. Examples: “storage.googleapis.com/[GCS_BUCKET]” “bigquery.googleapis.com/projects/[PROJECT_ID]/datasets/[DATASET]” “pubsub.googleapis.com/projects/[PROJECT_ID]/topics/[TOPIC_ID]” The writer associated with the sink must have access to write to the above resource.
- Filter string
The filter to apply when exporting logs. Only log entries that match the filter are exported. See Advanced Log Filters for information on how to write a filter.
- Folder string
The folder to be exported to the sink. Note that either [FOLDER_ID] or “folders/[FOLDER_ID]” is accepted.
- IncludeChildren bool
Whether or not to include children folders in the sink export. If true, logs associated with child projects are also exported; otherwise only logs relating to the provided folder are included.
- Name string
The name of the logging sink.
- WriterIdentity string
The identity associated with this sink. This identity must be granted write access to the configured destination.
- BigqueryOptions FolderSinkBigqueryOptions
Options that affect sinks exporting data to BigQuery. Structure documented below.
- Destination string
The destination of the sink (or, in other words, where logs are written to). Can be a Cloud Storage bucket, a PubSub topic, or a BigQuery dataset. Examples: “storage.googleapis.com/[GCS_BUCKET]” “bigquery.googleapis.com/projects/[PROJECT_ID]/datasets/[DATASET]” “pubsub.googleapis.com/projects/[PROJECT_ID]/topics/[TOPIC_ID]” The writer associated with the sink must have access to write to the above resource.
- Filter string
The filter to apply when exporting logs. Only log entries that match the filter are exported. See Advanced Log Filters for information on how to write a filter.
- Folder string
The folder to be exported to the sink. Note that either [FOLDER_ID] or “folders/[FOLDER_ID]” is accepted.
- IncludeChildren bool
Whether or not to include children folders in the sink export. If true, logs associated with child projects are also exported; otherwise only logs relating to the provided folder are included.
- Name string
The name of the logging sink.
- WriterIdentity string
The identity associated with this sink. This identity must be granted write access to the configured destination.
- bigqueryOptions FolderSinkBigqueryOptions
Options that affect sinks exporting data to BigQuery. Structure documented below.
- destination string
The destination of the sink (or, in other words, where logs are written to). Can be a Cloud Storage bucket, a PubSub topic, or a BigQuery dataset. Examples: “storage.googleapis.com/[GCS_BUCKET]” “bigquery.googleapis.com/projects/[PROJECT_ID]/datasets/[DATASET]” “pubsub.googleapis.com/projects/[PROJECT_ID]/topics/[TOPIC_ID]” The writer associated with the sink must have access to write to the above resource.
- filter string
The filter to apply when exporting logs. Only log entries that match the filter are exported. See Advanced Log Filters for information on how to write a filter.
- folder string
The folder to be exported to the sink. Note that either [FOLDER_ID] or “folders/[FOLDER_ID]” is accepted.
- includeChildren boolean
Whether or not to include children folders in the sink export. If true, logs associated with child projects are also exported; otherwise only logs relating to the provided folder are included.
- name string
The name of the logging sink.
- writerIdentity string
The identity associated with this sink. This identity must be granted write access to the configured destination.
- bigquery_options Dict[FolderSinkBigqueryOptions]
Options that affect sinks exporting data to BigQuery. Structure documented below.
- destination str
The destination of the sink (or, in other words, where logs are written to). Can be a Cloud Storage bucket, a PubSub topic, or a BigQuery dataset. Examples: “storage.googleapis.com/[GCS_BUCKET]” “bigquery.googleapis.com/projects/[PROJECT_ID]/datasets/[DATASET]” “pubsub.googleapis.com/projects/[PROJECT_ID]/topics/[TOPIC_ID]” The writer associated with the sink must have access to write to the above resource.
- filter str
The filter to apply when exporting logs. Only log entries that match the filter are exported. See Advanced Log Filters for information on how to write a filter.
- folder str
The folder to be exported to the sink. Note that either [FOLDER_ID] or “folders/[FOLDER_ID]” is accepted.
- include_children bool
Whether or not to include children folders in the sink export. If true, logs associated with child projects are also exported; otherwise only logs relating to the provided folder are included.
- name str
The name of the logging sink.
- writer_identity str
The identity associated with this sink. This identity must be granted write access to the configured destination.
Supporting Types
FolderSinkBigqueryOptions
- UsePartitionedTables bool
Whether to use BigQuery’s partition tables. By default, Logging creates dated tables based on the log entries’ timestamps, e.g. syslog_20170523. With partitioned tables the date suffix is no longer present and special query syntax has to be used instead. In both cases, tables are sharded based on UTC timezone.
- UsePartitionedTables bool
Whether to use BigQuery’s partition tables. By default, Logging creates dated tables based on the log entries’ timestamps, e.g. syslog_20170523. With partitioned tables the date suffix is no longer present and special query syntax has to be used instead. In both cases, tables are sharded based on UTC timezone.
- usePartitionedTables boolean
Whether to use BigQuery’s partition tables. By default, Logging creates dated tables based on the log entries’ timestamps, e.g. syslog_20170523. With partitioned tables the date suffix is no longer present and special query syntax has to be used instead. In both cases, tables are sharded based on UTC timezone.
- usePartitionedTables bool
Whether to use BigQuery’s partition tables. By default, Logging creates dated tables based on the log entries’ timestamps, e.g. syslog_20170523. With partitioned tables the date suffix is no longer present and special query syntax has to be used instead. In both cases, tables are sharded based on UTC timezone.
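As a sketch (TypeScript; the dataset and folder IDs are placeholders), routing folder logs into partitioned BigQuery tables:

    import * as pulumi from "@pulumi/pulumi";
    import * as gcp from "@pulumi/gcp";

    const logDataset = new gcp.bigquery.Dataset("log-dataset", {
        datasetId: "folder_logs", // placeholder dataset ID
    });

    const bqSink = new gcp.logging.FolderSink("bq-sink", {
        folder: "folders/123456789", // placeholder folder ID
        destination: pulumi.interpolate`bigquery.googleapis.com/projects/${logDataset.project}/datasets/${logDataset.datasetId}`,
        bigqueryOptions: {
            usePartitionedTables: true, // write to partitioned tables instead of dated tables
        },
    });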
Package Details
- Repository
- https://github.com/pulumi/pulumi-gcp
- License
- Apache-2.0
- Notes
- This Pulumi package is based on the google-beta Terraform Provider.