ProjectSink

Manages a project-level logging sink. For more information, see the official documentation and Exporting Logs in the API.

Note: You must have granted the “Logs Configuration Writer” IAM role (roles/logging.configWriter) to the credentials used with this provider.

Note: You must enable the Cloud Resource Manager API.
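
As a sketch, the Cloud Resource Manager API can be enabled from the same Pulumi program with gcp.projects.Service; the project ID below is a placeholder assumption.

import * as gcp from "@pulumi/gcp";

// Hypothetical sketch: enable the Cloud Resource Manager API on the target project.
// "my-project-id" is a placeholder; substitute your own project ID.
const crmApi = new gcp.projects.Service("cloud-resource-manager", {
    project: "my-project-id",
    service: "cloudresourcemanager.googleapis.com",
});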

Create a ProjectSink Resource

new ProjectSink(name: string, args: ProjectSinkArgs, opts?: CustomResourceOptions);
def ProjectSink(resource_name, opts=None, bigquery_options=None, destination=None, filter=None, name=None, project=None, unique_writer_identity=None, __props__=None);
func NewProjectSink(ctx *Context, name string, args ProjectSinkArgs, opts ...ResourceOption) (*ProjectSink, error)
public ProjectSink(string name, ProjectSinkArgs args, CustomResourceOptions? opts = null)
name string
The unique name of the resource.
args ProjectSinkArgs
The arguments to resource properties.
opts CustomResourceOptions
Bag of options to control resource's behavior.
resource_name str
The unique name of the resource.
opts ResourceOptions
A bag of options that control this resource's behavior.
ctx Context
Context object for the current deployment.
name string
The unique name of the resource.
args ProjectSinkArgs
The arguments to resource properties.
opts ResourceOption
Bag of options to control resource's behavior.
name string
The unique name of the resource.
args ProjectSinkArgs
The arguments to resource properties.
opts CustomResourceOptions
Bag of options to control resource's behavior.
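
For illustration, a minimal TypeScript sketch that creates a sink exporting Compute Engine errors to a new Cloud Storage bucket. The resource names, bucket location, and filter are placeholder assumptions, not values required by the resource.

import * as pulumi from "@pulumi/pulumi";
import * as gcp from "@pulumi/gcp";

// A Cloud Storage bucket to receive the exported log entries.
const logBucket = new gcp.storage.Bucket("log-bucket", { location: "US" });

// The project-level sink. uniqueWriterIdentity creates a dedicated service
// account whose email is exported as `writerIdentity`.
const sink = new gcp.logging.ProjectSink("my-sink", {
    destination: pulumi.interpolate`storage.googleapis.com/${logBucket.name}`,
    filter: "resource.type = gce_instance AND severity >= ERROR",
    uniqueWriterIdentity: true,
});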

ProjectSink Resource Properties

To learn more about resource properties and how to use them, see Inputs and Outputs in the Programming Model docs.

Inputs

The ProjectSink resource accepts the following input properties:

Destination string

The destination of the sink (or, in other words, where logs are written to). Can be a Cloud Storage bucket, a PubSub topic, or a BigQuery dataset. Examples:

storage.googleapis.com/[GCS_BUCKET]
bigquery.googleapis.com/projects/[PROJECT_ID]/datasets/[DATASET]
pubsub.googleapis.com/projects/[PROJECT_ID]/topics/[TOPIC_ID]

The writer associated with the sink must have access to write to the above resource.

BigqueryOptions ProjectSinkBigqueryOptionsArgs

Options that affect sinks exporting data to BigQuery. Structure documented below.

Filter string

The filter to apply when exporting logs. Only log entries that match the filter are exported. See Advanced Log Filters for information on how to write a filter.

Name string

The name of the logging sink.

Project string

The ID of the project to create the sink in. If omitted, the project associated with the provider is used.

UniqueWriterIdentity bool

Whether or not to create a unique identity associated with this sink. If false (the default), then the writer_identity used is serviceAccount:cloud-logs@system.gserviceaccount.com. If true, then a unique service account is created and used for this sink. If you wish to publish logs across projects, you must set unique_writer_identity to true.

Destination string

The destination of the sink (or, in other words, where logs are written to). Can be a Cloud Storage bucket, a PubSub topic, or a BigQuery dataset. Examples:

storage.googleapis.com/[GCS_BUCKET]
bigquery.googleapis.com/projects/[PROJECT_ID]/datasets/[DATASET]
pubsub.googleapis.com/projects/[PROJECT_ID]/topics/[TOPIC_ID]

The writer associated with the sink must have access to write to the above resource.

BigqueryOptions ProjectSinkBigqueryOptions

Options that affect sinks exporting data to BigQuery. Structure documented below.

Filter string

The filter to apply when exporting logs. Only log entries that match the filter are exported. See Advanced Log Filters for information on how to write a filter.

Name string

The name of the logging sink.

Project string

The ID of the project to create the sink in. If omitted, the project associated with the provider is used.

UniqueWriterIdentity bool

Whether or not to create a unique identity associated with this sink. If false (the default), then the writer_identity used is serviceAccount:cloud-logs@system.gserviceaccount.com. If true, then a unique service account is created and used for this sink. If you wish to publish logs across projects, you must set unique_writer_identity to true.

destination string

The destination of the sink (or, in other words, where logs are written to). Can be a Cloud Storage bucket, a PubSub topic, or a BigQuery dataset. Examples:

storage.googleapis.com/[GCS_BUCKET]
bigquery.googleapis.com/projects/[PROJECT_ID]/datasets/[DATASET]
pubsub.googleapis.com/projects/[PROJECT_ID]/topics/[TOPIC_ID]

The writer associated with the sink must have access to write to the above resource.

bigqueryOptions ProjectSinkBigqueryOptions

Options that affect sinks exporting data to BigQuery. Structure documented below.

filter string

The filter to apply when exporting logs. Only log entries that match the filter are exported. See Advanced Log Filters for information on how to write a filter.

name string

The name of the logging sink.

project string

The ID of the project to create the sink in. If omitted, the project associated with the provider is used.

uniqueWriterIdentity boolean

Whether or not to create a unique identity associated with this sink. If false (the default), then the writer_identity used is serviceAccount:cloud-logs@system.gserviceaccount.com. If true, then a unique service account is created and used for this sink. If you wish to publish logs across projects, you must set unique_writer_identity to true.

destination str

The destination of the sink (or, in other words, where logs are written to). Can be a Cloud Storage bucket, a PubSub topic, or a BigQuery dataset. Examples:

storage.googleapis.com/[GCS_BUCKET]
bigquery.googleapis.com/projects/[PROJECT_ID]/datasets/[DATASET]
pubsub.googleapis.com/projects/[PROJECT_ID]/topics/[TOPIC_ID]

The writer associated with the sink must have access to write to the above resource.

bigquery_options Dict[ProjectSinkBigqueryOptions]

Options that affect sinks exporting data to BigQuery. Structure documented below.

filter str

The filter to apply when exporting logs. Only log entries that match the filter are exported. See Advanced Log Filters for information on how to write a filter.

name str

The name of the logging sink.

project str

The ID of the project to create the sink in. If omitted, the project associated with the provider is used.

unique_writer_identity bool

Whether or not to create a unique identity associated with this sink. If false (the default), then the writer_identity used is serviceAccount:cloud-logs@system.gserviceaccount.com. If true, then a unique service account is created and used for this sink. If you wish to publish logs across projects, you must set unique_writer_identity to true.

Outputs

All input properties are implicitly available as output properties. Additionally, the ProjectSink resource produces the following output properties:

Id string
The provider-assigned unique ID for this managed resource.
WriterIdentity string

The identity associated with this sink. This identity must be granted write access to the configured destination.

Id string
The provider-assigned unique ID for this managed resource.
WriterIdentity string

The identity associated with this sink. This identity must be granted write access to the configured destination.

id string
The provider-assigned unique ID for this managed resource.
writerIdentity string

The identity associated with this sink. This identity must be granted write access to the configured destination.

id str
The provider-assigned unique ID for this managed resource.
writer_identity str

The identity associated with this sink. This identity must be granted write access to the configured destination.
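
Because the writer identity must be granted write access to the destination, a common follow-up step (sketched here for a Cloud Storage destination, continuing the earlier sketch's `sink` and `logBucket` resources, which are assumptions) is to bind it an object-creation role on the bucket.

import * as gcp from "@pulumi/gcp";

// Grant the sink's writer identity permission to create objects in the
// destination bucket. `sink` and `logBucket` come from the earlier sketch.
const sinkWriter = new gcp.storage.BucketIAMMember("sink-writer", {
    bucket: logBucket.name,
    role: "roles/storage.objectCreator",
    member: sink.writerIdentity,
});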

Look up an Existing ProjectSink Resource

Get an existing ProjectSink resource’s state with the given name, ID, and optional extra properties used to qualify the lookup.

public static get(name: string, id: Input<ID>, state?: ProjectSinkState, opts?: CustomResourceOptions): ProjectSink
static get(resource_name, id, opts=None, bigquery_options=None, destination=None, filter=None, name=None, project=None, unique_writer_identity=None, writer_identity=None, __props__=None);
func GetProjectSink(ctx *Context, name string, id IDInput, state *ProjectSinkState, opts ...ResourceOption) (*ProjectSink, error)
public static ProjectSink Get(string name, Input<string> id, ProjectSinkState? state, CustomResourceOptions? opts = null)
name
The unique name of the resulting resource.
id
The unique provider ID of the resource to lookup.
state
Any extra arguments used during the lookup.
opts
A bag of options that control this resource's behavior.
resource_name
The unique name of the resulting resource.
id
The unique provider ID of the resource to lookup.
name
The unique name of the resulting resource.
id
The unique provider ID of the resource to lookup.
state
Any extra arguments used during the lookup.
opts
A bag of options that control this resource's behavior.
name
The unique name of the resulting resource.
id
The unique provider ID of the resource to lookup.
state
Any extra arguments used during the lookup.
opts
A bag of options that control this resource's behavior.
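
For example, a TypeScript lookup might read as follows; the ID shown is a placeholder assumption (project sinks are typically addressed as projects/{project}/sinks/{sink-name}).

import * as gcp from "@pulumi/gcp";

// Look up an existing sink by its provider ID (placeholder shown).
const existing = gcp.logging.ProjectSink.get(
    "existing-sink",
    "projects/my-project/sinks/my-sink",
);

// The looked-up resource exposes the same outputs, e.g. its writer identity.
export const writer = existing.writerIdentity;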

The following state arguments are supported:

BigqueryOptions ProjectSinkBigqueryOptionsArgs

Options that affect sinks exporting data to BigQuery. Structure documented below.

Destination string

The destination of the sink (or, in other words, where logs are written to). Can be a Cloud Storage bucket, a PubSub topic, or a BigQuery dataset. Examples:

storage.googleapis.com/[GCS_BUCKET]
bigquery.googleapis.com/projects/[PROJECT_ID]/datasets/[DATASET]
pubsub.googleapis.com/projects/[PROJECT_ID]/topics/[TOPIC_ID]

The writer associated with the sink must have access to write to the above resource.

Filter string

The filter to apply when exporting logs. Only log entries that match the filter are exported. See Advanced Log Filters for information on how to write a filter.

Name string

The name of the logging sink.

Project string

The ID of the project to create the sink in. If omitted, the project associated with the provider is used.

UniqueWriterIdentity bool

Whether or not to create a unique identity associated with this sink. If false (the default), then the writer_identity used is serviceAccount:cloud-logs@system.gserviceaccount.com. If true, then a unique service account is created and used for this sink. If you wish to publish logs across projects, you must set unique_writer_identity to true.

WriterIdentity string

The identity associated with this sink. This identity must be granted write access to the configured destination.

BigqueryOptions ProjectSinkBigqueryOptions

Options that affect sinks exporting data to BigQuery. Structure documented below.

Destination string

The destination of the sink (or, in other words, where logs are written to). Can be a Cloud Storage bucket, a PubSub topic, or a BigQuery dataset. Examples:

storage.googleapis.com/[GCS_BUCKET]
bigquery.googleapis.com/projects/[PROJECT_ID]/datasets/[DATASET]
pubsub.googleapis.com/projects/[PROJECT_ID]/topics/[TOPIC_ID]

The writer associated with the sink must have access to write to the above resource.

Filter string

The filter to apply when exporting logs. Only log entries that match the filter are exported. See Advanced Log Filters for information on how to write a filter.

Name string

The name of the logging sink.

Project string

The ID of the project to create the sink in. If omitted, the project associated with the provider is used.

UniqueWriterIdentity bool

Whether or not to create a unique identity associated with this sink. If false (the default), then the writer_identity used is serviceAccount:cloud-logs@system.gserviceaccount.com. If true, then a unique service account is created and used for this sink. If you wish to publish logs across projects, you must set unique_writer_identity to true.

WriterIdentity string

The identity associated with this sink. This identity must be granted write access to the configured destination.

bigqueryOptions ProjectSinkBigqueryOptions

Options that affect sinks exporting data to BigQuery. Structure documented below.

destination string

The destination of the sink (or, in other words, where logs are written to). Can be a Cloud Storage bucket, a PubSub topic, or a BigQuery dataset. Examples:

storage.googleapis.com/[GCS_BUCKET]
bigquery.googleapis.com/projects/[PROJECT_ID]/datasets/[DATASET]
pubsub.googleapis.com/projects/[PROJECT_ID]/topics/[TOPIC_ID]

The writer associated with the sink must have access to write to the above resource.

filter string

The filter to apply when exporting logs. Only log entries that match the filter are exported. See Advanced Log Filters for information on how to write a filter.

name string

The name of the logging sink.

project string

The ID of the project to create the sink in. If omitted, the project associated with the provider is used.

uniqueWriterIdentity boolean

Whether or not to create a unique identity associated with this sink. If false (the default), then the writer_identity used is serviceAccount:cloud-logs@system.gserviceaccount.com. If true, then a unique service account is created and used for this sink. If you wish to publish logs across projects, you must set unique_writer_identity to true.

writerIdentity string

The identity associated with this sink. This identity must be granted write access to the configured destination.

bigquery_options Dict[ProjectSinkBigqueryOptions]

Options that affect sinks exporting data to BigQuery. Structure documented below.

destination str

The destination of the sink (or, in other words, where logs are written to). Can be a Cloud Storage bucket, a PubSub topic, or a BigQuery dataset. Examples:

storage.googleapis.com/[GCS_BUCKET]
bigquery.googleapis.com/projects/[PROJECT_ID]/datasets/[DATASET]
pubsub.googleapis.com/projects/[PROJECT_ID]/topics/[TOPIC_ID]

The writer associated with the sink must have access to write to the above resource.

filter str

The filter to apply when exporting logs. Only log entries that match the filter are exported. See Advanced Log Filters for information on how to write a filter.

name str

The name of the logging sink.

project str

The ID of the project to create the sink in. If omitted, the project associated with the provider is used.

unique_writer_identity bool

Whether or not to create a unique identity associated with this sink. If false (the default), then the writer_identity used is serviceAccount:cloud-logs@system.gserviceaccount.com. If true, then a unique service account is created and used for this sink. If you wish to publish logs across projects, you must set unique_writer_identity to true.

writer_identity str

The identity associated with this sink. This identity must be granted write access to the configured destination.

Supporting Types

ProjectSinkBigqueryOptions

See the input and output API doc for this type.

UsePartitionedTables bool

Whether to use BigQuery’s partition tables. By default, Logging creates dated tables based on the log entries’ timestamps, e.g. syslog_20170523. With partitioned tables the date suffix is no longer present and special query syntax has to be used instead. In both cases, tables are sharded based on UTC timezone.

UsePartitionedTables bool

Whether to use BigQuery’s partition tables. By default, Logging creates dated tables based on the log entries’ timestamps, e.g. syslog_20170523. With partitioned tables the date suffix is no longer present and special query syntax has to be used instead. In both cases, tables are sharded based on UTC timezone.

usePartitionedTables boolean

Whether to use BigQuery’s partition tables. By default, Logging creates dated tables based on the log entries’ timestamps, e.g. syslog_20170523. With partitioned tables the date suffix is no longer present and special query syntax has to be used instead. In both cases, tables are sharded based on UTC timezone.

usePartitionedTables bool

Whether to use BigQuery’s partition tables. By default, Logging creates dated tables based on the log entries’ timestamps, e.g. syslog_20170523. With partitioned tables the date suffix is no longer present and special query syntax has to be used instead. In both cases, tables are sharded based on UTC timezone.
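
To tie this together, a hedged TypeScript sketch of a sink that exports to a BigQuery dataset with partitioned tables; the dataset ID and resource names are placeholder assumptions.

import * as pulumi from "@pulumi/pulumi";
import * as gcp from "@pulumi/gcp";

// A BigQuery dataset to receive the exported logs (placeholder dataset ID).
const dataset = new gcp.bigquery.Dataset("logs", {
    datasetId: "exported_logs",
});

// A sink that writes to the dataset and asks Logging to use partitioned tables
// instead of dated tables such as syslog_20170523.
const bqSink = new gcp.logging.ProjectSink("bigquery-sink", {
    destination: pulumi.interpolate`bigquery.googleapis.com/projects/${dataset.project}/datasets/${dataset.datasetId}`,
    uniqueWriterIdentity: true,
    bigqueryOptions: {
        usePartitionedTables: true,
    },
});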

Package Details

Repository
https://github.com/pulumi/pulumi-gcp
License
Apache-2.0
Notes
This Pulumi package is based on the google-beta Terraform Provider.