Route logs to supported destinations

This document explains how to create and manage log sinks, which route log entries that originate in a Google Cloud project to supported destinations.

A sink performs a write action and therefore it must be authorized to write to the destination. When the destination is a log bucket in the same project as the sink, the sink is automatically authorized. For all other destinations, the sink must be attached to a service account that has been granted the permissions required to write data to the destination.

When a service account is required, Cloud Logging automatically creates and manages it. However, you might need to modify the permissions granted to the service account. You don't have to use the service account created by Logging. You can create and manage a service account that is used by sinks in multiple projects. For more information, see Configure log sinks with user-managed service accounts.

Overview

This page describes how to create a sink and how to configure the options you might see when using the Google Cloud console or the API.

Sinks belong to a given Google Cloud resource: a Google Cloud project, a billing account, a folder, or an organization. When the resource receives a log entry, every sink in the resource processes the log entry. When a log entry matches the filters of the sink, then the log entry is routed to the sink's destination.

Typically, sinks only route the log entries that originate in a resource. However, for folders and organizations you can create aggregated sinks, which route log entries from the folder or organization, and the resources it contains. This document doesn't discuss aggregated sinks. For more information, see Aggregated sinks overview.

To create and manage sinks, you can use the Google Cloud console, the Cloud Logging API, and the Google Cloud CLI. We recommend that you use the Google Cloud console.

The destination of a sink can be in a different resource than the sink. For example, you can use a log sink to route log entries from one project to a log bucket stored in a different project.

The following destinations are supported:

Google Cloud project

Select this destination when you want the log sinks in the destination project to reroute your log entries, or when you have created an intercepting aggregated sink. The log sinks in the project that is the sink destination can reroute the log entries to any supported destination except a project.

Log bucket

Select this destination when you want to store your log data in resources managed by Cloud Logging. Log data stored in log buckets can be viewed and analyzed using services like the Logs Explorer and Log Analytics.

If you want to join your log data with other business data, then you can store your log data in a log bucket and create a linked BigQuery dataset. A linked dataset is a read-only dataset that can be queried like any other BigQuery dataset.

BigQuery dataset

Select this destination when you want to join your log data with other business data. The dataset you specify must be write-enabled. Don't set the destination of a sink to be a linked BigQuery dataset. Linked datasets are read-only.

Cloud Storage bucket

Select this destination when you want long-term storage of your log data. The Cloud Storage bucket can be in the same project in which log entries originate, or in a different project. Log entries are stored as JSON files.

Pub/Sub topic

Select this destination when you want to export your log data from Google Cloud and then use third-party integrations like Splunk or Datadog. Log entries are formatted into JSON and then routed to a Pub/Sub topic.

Destination limitations

This section describes destination-specific limitations:

Before you begin

The instructions in this document describe creating and managing sinks at the Google Cloud project level. You can use the same procedure to create a sink that routes log entries that originate in an organization, folder, or billing account.

To get started, do the following:

  1. Enable the Cloud Logging API.
    Enable the API
  2. Make sure that your Google Cloud project contains log entries that you can see in the Logs Explorer.
  3. To get the permissions that you need to create, modify, or delete a sink, ask your administrator to grant you the Logs Configuration Writer (roles/logging.configWriter) IAM role on your project. For more information about granting roles, see Manage access to projects, folders, and organizations.
    You might also be able to get the required permissions through custom roles or other predefined roles.
    For information about granting IAM roles, see the Logging Access control guide.
  4. Make sure that you have a resource in a supported destination, or that you have the ability to create one.
    To route log entries to a destination, the destination must exist before you create the sink. You can create the destination in any Google Cloud project in any organization.
  5. Before you create a sink, review the limitations that apply to the sink destination. For more information, see the Destination limitations section in this document.
  6. Select the tab for how you plan to use the samples on this page:

Console

When you use the Google Cloud console to access Google Cloud services and APIs, you don't need to set up authentication.

gcloud

In the Google Cloud console, activate Cloud Shell.
Activate Cloud Shell
At the bottom of the Google Cloud console, a Cloud Shell session starts and displays a command-line prompt. Cloud Shell is a shell environment with the Google Cloud CLI already installed and with values already set for your current project. It can take a few seconds for the session to initialize.

REST

To use the REST API samples on this page in a local development environment, you use the credentials you provide to the gcloud CLI.
After installing the Google Cloud CLI, initialize it by running the following command:
gcloud init
If you're using an external identity provider (IdP), you must first sign in to the gcloud CLI with your federated identity.
For more information, see Authenticate for using REST in the Google Cloud authentication documentation.

Create a sink

This section describes how to create a sink in a Google Cloud project. You can create up to 200 sinks per Google Cloud project. To view the number and volume of log entries that are routed, view the logging.googleapis.com/exports/ metrics.

You use the Logging query language to create a filter expression that matches the log entries you want to include. Don't put sensitive information in sink filters. Sink filters are treated as service data.

When a query contains multiple statements, you can either specify how those statements are joined or rely on Cloud Logging implicitly adding the conjunctive restriction, AND, between the statements. For example, suppose a query or filter dialog contains two statements, resource.type = "gce_instance" and severity >= "ERROR". The actual query is resource.type = "gce_instance" AND severity >= "ERROR". Cloud Logging supports both disjunctive restrictions, OR, and conjunctive restrictions, AND. When you use OR statements, we recommend that you group the clauses with parentheses.
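For example, a filter that groups OR clauses with parentheses might look like the following; the resource types shown are illustrative:

  (resource.type = "gce_instance" OR resource.type = "k8s_container") AND severity >= "ERROR"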

To create a sink, do the following:

Console

  1. In the Google Cloud console, go to the Log Router page:
    Go to Log Router
    If you use the search bar to find this page, then select the result whose subheading is Logging.
  2. Select the Google Cloud project in which the log entries that you want to route originate.
    For example, if you want to route your Data Access log entries from the project named Project-A to a log bucket in the project named Project-B, then select Project-A.
  3. Select Create sink.
  4. In the Sink details panel, enter the following details:
    • Sink name: Provide an identifier for the sink; note that after you create the sink, you can't rename the sink but you can delete it and create a new sink.
    • Sink description (optional): Describe the purpose or use case for the sink.
  5. In the Sink destination panel, select the sink service and destination by using the Select sink service menu. Do one of the following:
    • To route log entries to a service that is in the same Google Cloud project, select one of the following options:
      * Cloud Logging bucket: Select or create a Logging bucket.
      * BigQuery dataset: Select or create the writeable dataset to receive the routed log entries. You also have the option to use partitioned tables.
      * Cloud Storage bucket: Select or create the particular Cloud Storage bucket to receive the routed log entries.
      * Pub/Sub topic: Select or create the particular topic to receive the routed log entries.
      * Splunk: Select the Pub/Sub topic for your Splunk service.
    • To route log entries to a different Google Cloud project, select Google Cloud project, and then enter the fully-qualified name for the destination:
    logging.googleapis.com/projects/DESTINATION_PROJECT_ID  
    • To route log entries to a service that is in a different Google Cloud project, do the following:
      1. Select Other resource.
      2. Enter the fully-qualified name for the destination. For information about the syntax, see Destination path formats.
  6. Specify the log entries to include:
    1. Go to the Choose logs to include in sink panel.
    2. In the Build inclusion filter field, enter a filter expression that matches the log entries you want to include. To learn more about the syntax for writing filters, see Logging query language.
      If you don't set a filter, all log entries from your selected resource are routed to the destination.
      For example, to route all Data Access log entries to a Logging bucket, you can use the following filter:
    log_id("cloudaudit.googleapis.com/data_access") OR log_id("externalaudit.googleapis.com/data_access")  

    The length of a filter can't exceed 20,000 characters.
    3. To verify you entered the correct filter, select Preview logs. The Logs Explorer opens in a new tab with the filter pre-populated.

  7. (Optional) Configure an exclusion filter to eliminate some of the included log entries:
    1. Go to the Choose logs to filter out of sink panel.
    2. In the Exclusion filter name field, enter a name.
    3. In the Build an exclusion filter field, enter a filter expression that matches the log entries you want to exclude. You can also use the sample function to select a portion of the log entries to exclude.
      You can create up to 50 exclusion filters per sink. Note that the length of a filter can't exceed 20,000 characters.
  8. Select Create sink.
  9. Grant the service account for the sink the permission to write log entries to your sink's destination. For more information, see Set destination permissions.

gcloud

To create a sink, do the following:

  1. Run the following gcloud logging sinks create command:
    gcloud logging sinks create SINK_NAME SINK_DESTINATION
    Before running the command, make the following replacements:
    • SINK_NAME: The name of the log sink. You can't change the name of a sink after you create it.
    • SINK_DESTINATION: The service or project to where you want your log entries routed. Set SINK_DESTINATION with the appropriate path, as described in Destination path formats.
      For example, if your sink destination is a Pub/Sub topic, then SINK_DESTINATION looks like the following:
      pubsub.googleapis.com/projects/PROJECT_ID/topics/TOPIC_ID
      You can also provide the following options:
    • --log-filter: Use this option to set a filter that matches the log entries you want to include in your sink. If you don't provide this option, then the sink matches all log entries.
    • --exclusion: Use this option to set an exclusion filter for log entries that you want your sink to exclude from routing. You can also use the sample function to select a portion of the log entries to exclude. This option can be repeated; you can create up to 50 exclusion filters per sink. An example that uses this option appears after these steps.
    • --description: Use this option to describe the purpose or use case for the sink.
      For example, to create a sink to a Logging bucket, your command might look like this:
      gcloud logging sinks create my-sink logging.googleapis.com/projects/myproject123/locations/global/buckets/my-bucket \
      --log-filter='logName="projects/myproject123/logs/matched"' --description="My first sink"
      For more information on creating sinks using the Google Cloud CLI, see the gcloud logging sinks reference.
  2. If the command response contains a JSON key labeled "writerIdentity", then grant the service account of the sink the permission to write to the sink destination. For more information, see Set destination permissions.
    You don't need to set destination permissions when the response doesn't contain a JSON key labeled "writerIdentity".
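For example, the following sketch creates a sink that routes Compute Engine log entries to a Pub/Sub topic and uses the --exclusion option to drop DEBUG entries. It assumes that a topic named my-topic already exists in the project my-project; the sink name and description are also placeholders:

  # Create a sink that routes Compute Engine log entries to an existing Pub/Sub topic.
  gcloud logging sinks create my-pubsub-sink \
    pubsub.googleapis.com/projects/my-project/topics/my-topic \
    --log-filter='resource.type="gce_instance"' \
    --exclusion=name=exclude-debug,filter=severity=DEBUG \
    --description="Route Compute Engine logs to Pub/Sub, excluding DEBUG entries"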

REST

  1. To create a logging sink in your Google Cloud project, use projects.sinks.create in the Logging API. In the LogSink object, provide the appropriate required values in the method request body:
    • name: An identifier for the sink. Note that after you create the sink, you can't rename the sink, but you can delete it and create a new sink.
    • destination: The service and destination to where you want your log entries routed. To route log entries to a different project, or to a destination that is in another project, set the destination field with the appropriate path, as described in Destination path formats.
      For example, if your sink destination is a Pub/Sub topic, then the destination looks like the following:
      pubsub.googleapis.com/projects/PROJECT_ID/topics/TOPIC_ID
  2. In the LogSink object, provide the appropriate optional information:
    • filter: Set the filter field to match the log entries you want to include in your sink. If you don't set a filter, all log entries from your Google Cloud project are routed to the destination. Note that the length of a filter can't exceed 20,000 characters.
    • exclusions: Set this field to match the log entries that you want to exclude from your sink. You can also use the sample function to select a portion of the log entries to exclude. You can create up to 50 exclusion filters per sink.
    • description: Set this field to describe the purpose or use case for the sink.
  3. Call projects.sinks.create to create the sink.
  4. If the API response contains a JSON key labeled "writerIdentity", then grant the service account of the sink the permission to write to the sink destination. For more information, see Set destination permissions.
    You don't need to set destination permissions when the API response doesn't contain a JSON key labeled "writerIdentity".

For more information on creating sinks using the Logging API, see the LogSink reference.
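For illustration, the following sketch calls projects.sinks.create with curl to create a sink that routes log entries to a Pub/Sub topic. PROJECT_ID, TOPIC_ID, and the sink name are placeholders, and authorized gcloud CLI credentials are assumed:

  curl -X POST \
    -H "Authorization: Bearer $(gcloud auth print-access-token)" \
    -H "Content-Type: application/json" \
    -d '{
          "name": "my-pubsub-sink",
          "destination": "pubsub.googleapis.com/projects/PROJECT_ID/topics/TOPIC_ID",
          "filter": "severity>=WARNING",
          "description": "Route warnings to Pub/Sub"
        }' \
    "https://logging.googleapis.com/v2/projects/PROJECT_ID/sinks"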

If you receive error notifications, then see Troubleshoot routing and sinks.

Destination path formats

If you route log entries to a service that is in another project, then you must provide the sink with the fully-qualified name for the service. Similarly, if you route log entries to a different Google Cloud project, then you must provide the sink with the fully-qualified name of the destination project:
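  • Google Cloud project: logging.googleapis.com/projects/DESTINATION_PROJECT_ID
  • Log bucket: logging.googleapis.com/projects/PROJECT_ID/locations/LOCATION/buckets/BUCKET_NAME
  • BigQuery dataset: bigquery.googleapis.com/projects/PROJECT_ID/datasets/DATASET_ID
  • Cloud Storage bucket: storage.googleapis.com/BUCKET_NAME
  • Pub/Sub topic: pubsub.googleapis.com/projects/PROJECT_ID/topics/TOPIC_ID

The project, log bucket, and Pub/Sub formats also appear in the examples earlier in this document; treat the BigQuery and Cloud Storage forms shown here as sketches and verify them against your destination before use.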

Set destination permissions

This section describes how to grant Logging the Identity and Access Management permissions to write log entries to your sink's destination. For the full list of Logging roles and permissions, see Access control.

Cloud Logging creates a shared service account for a resource when a sink is created, unless the required service account already exists. The service account might exist because the same service account is used for all sinks in the underlying resource. Resources can be a Google Cloud project, an organization, a folder, or a billing account.

The writer identity of a sink is the identifier of the service account associated with that sink. All sinks have a writer identity except for sinks that write to a log bucket in the same Google Cloud project in which the log entry originates. For the latter configuration, a service account isn't required and therefore the sink's writer identity field is listed as None in the console. The API and the Google Cloud CLI commands don't report a writer identity.

The following instructions apply to projects, folders, organizations, and billing accounts:

Console

  1. Make sure that you have Owner access on the Google Cloud project that contains the destination. If you don't have Owner access to the destination of the sink, then ask a project owner to add the writer identity as a principal.
  2. To get the sink's writer identity—an email address—from the new sink, do the following:
    1. In the Google Cloud console, go to the Log Router page:
      Go to Log Router
      If you use the search bar to find this page, then select the result whose subheading is Logging.
    2. In the toolbar, select the project that contains the sink.
    3. Select Menu and then select View sink details. The writer identity appears in the Sink details panel.
  3. If the value of the writerIdentity field contains an email address, then proceed to the next step. When the value is None, you don't need to configure destination permissions for the sink.
  4. Copy the sink's writer identity into your clipboard.
    The email address identifies the principal. The prefix, serviceAccount:, specifies the account type.
  5. Grant the principal specified in the sink's writer identity the permission to write log data to the destination:
    1. In the Google Cloud console, go to the IAM page:
      Go to IAM
      If you use the search bar to find this page, then select the result whose subheading is IAM & Admin.
    2. In the toolbar, make sure that the selected project either stores the destination or is itself the sink destination. For example, if the destination is a log bucket, then make sure that the toolbar displays the project that stores the log bucket.
    3. Click Grant access.
    4. Grant the principal specified in the sink's writer identity an IAM role based on the destination of the log sink. The role to grant for each destination type is listed in the gcloud instructions in this section.

gcloud

  1. Make sure that you have Owner access on the Google Cloud project that contains the destination. If you don't have Owner access to the destination of the sink, then ask a project owner to add the writer identity as a principal.
  2. Get the service account from the writerIdentity field in your sink:
    gcloud logging sinks describe SINK_NAME
  3. Locate the sink whose permissions you want to modify, and if the sink details contain a line with writerIdentity, then proceed to the next step. When the details don't include a writerIdentity field, you don't need to configure destination permissions for the sink.
    The writer identity for the service account looks similar to the following:
    serviceAccount:service-123456789012@gcp-sa-logging.iam.gserviceaccount.com
  4. Grant the sink's writer identity the permission to write log data to the destination by calling the gcloud projects add-iam-policy-binding command.
    Before using the following command, make the following replacements:
    • PROJECT_ID: The identifier of the project. Specify the project that stores the destination of the log sink. When the destination is a project, specify that project.
    • PRINCIPAL: An identifier for the principal that you want to grant the role to. Principal identifiers usually have the following form: PRINCIPAL-TYPE:ID. For example, user:my-user@example.com. For a full list of the formats that PRINCIPAL can have, see Principal identifiers.
    • ROLE: An IAM role. Grant the sink's writer identity an IAM role based on the destination of the log sink:
      * Google Cloud project: Grant the Logs Writer role (roles/logging.logWriter). Specifically, a principal needs the logging.logEntries.route permission.
      * Log bucket: Grant the Logs Bucket Writer role (roles/logging.bucketWriter).
      * Cloud Storage bucket: Grant the Storage Object Creator role (roles/storage.objectCreator).
      * BigQuery dataset: Grant the BigQuery Data Editor role (roles/bigquery.dataEditor).
      * Pub/Sub topic, including Splunk: Grant the Pub/Sub Publisher role (roles/pubsub.publisher).

Execute the gcloud projects add-iam-policy-binding command:

gcloud projects add-iam-policy-binding PROJECT_ID --member=PRINCIPAL --role=ROLE  
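For example, to let the writer identity shown earlier write to a log bucket stored in a hypothetical project named my-destination-project, the command might look like the following:

  # Grant the sink's writer identity the Logs Bucket Writer role on the destination project.
  gcloud projects add-iam-policy-binding my-destination-project \
    --member='serviceAccount:service-123456789012@gcp-sa-logging.iam.gserviceaccount.com' \
    --role='roles/logging.bucketWriter'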

REST

We recommend that you use the Google Cloud console or the Google Cloud CLI to grant a role to a service account.

Manage sinks

After your sinks are created, you can view, update, disable, or delete them. Any changes made to a sink might take a few minutes to apply.

Following are the instructions for managing a sink in a Google Cloud project. Instead of a Google Cloud project, you can specify a billing account, folder, or organization:

Console

  1. In the Google Cloud console, go to the Log Router page:
    Go to Log Router
    If you use the search bar to find this page, then select the result whose subheading is Logging.
  2. In the toolbar, select the resource that contains your sink. The resource can be a project, folder, organization, or billing account.

The Log Router page displays the sinks in the selected resource. Each table row contains information about a sink's properties:

For each table row, the More actions menu provides the following options:

To sort the table by a column, select the column name.

gcloud
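As a sketch of common management operations, the following gcloud commands list, inspect, update, and delete a sink; the sink name my-sink and the filter are placeholders:

  # List the sinks in the selected project.
  gcloud logging sinks list
  # View a sink's configuration, including its destination and writer identity.
  gcloud logging sinks describe my-sink
  # Change a sink's inclusion filter.
  gcloud logging sinks update my-sink --log-filter='severity>=ERROR'
  # Delete a sink.
  gcloud logging sinks delete my-sink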

REST
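As a sketch, you can manage sinks with the Logging API methods projects.sinks.list, projects.sinks.get, projects.sinks.update, and projects.sinks.delete. For example, the following request lists the sinks in a project; PROJECT_ID is a placeholder and authorized gcloud CLI credentials are assumed:

  curl -H "Authorization: Bearer $(gcloud auth print-access-token)" \
    "https://logging.googleapis.com/v2/projects/PROJECT_ID/sinks"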

Stop storing log entries in log buckets

You can disable the _Default sink and any user-defined sinks. When you disable a sink, the sink stops routing log entries to its destination. For example, if you disable the _Default sink, then no log entries are routed to the _Default bucket. The _Default bucket becomes empty when all of the previously stored log entries have fulfilled the bucket's retention period.

The following instructions illustrate how to disable your Google Cloud project sinks that route log entries to the _Default log bucket:

Console

  1. In the Google Cloud console, go to the Log Router page:
    Go to Log Router
    If you use the search bar to find this page, then select the result whose subheading is Logging.
  2. To find all the sinks that route log entries to the _Default log bucket, filter the sinks by destination, and then enter _Default.
  3. For each sink, select Menu and then select Disable sink.
    The sinks are now disabled and your Google Cloud project sinks no longer route log entries to the _Default bucket.

To reenable a disabled sink and restart routing log entries to the sink's destination, do the following:

  1. In the Google Cloud console, go to the Log Router page:
    Go to Log Router
    If you use the search bar to find this page, then select the result whose subheading is Logging.
  2. To find all the sinks that route log entries to the _Default log bucket, filter the sinks by destination, and then enter _Default.
  3. For each sink, select Menu and then select Enable sink.

gcloud

  1. To view your list of sinks for your Google Cloud project, use the gcloud logging sinks list command, which corresponds to the Logging API method projects.sinks.list:
    gcloud logging sinks list
  2. Identify any sinks that are routing to the _Default log bucket. To describe a sink, including seeing the destination name, use the gcloud logging sinks describe command, which corresponds to the Logging API method projects.sinks.get:
    gcloud logging sinks describe SINK_NAME
  3. Run the gcloud logging sinks update command and include the --disabled option. For example, to disable the _Default sink, use the following command:
    gcloud logging sinks update _Default --disabled
    The _Default sink is now disabled; it no longer routes log entries to the _Default log bucket.

To disable the other sinks in your Google Cloud project that are routing to the _Default bucket, repeat the previous steps.

To reenable a sink, use the gcloud logging sinks update command, remove the --disabled option, and include the --no-disabled option:

gcloud logging sinks update _Default --no-disabled

REST

  1. To view the sinks for your Google Cloud project, call the Logging API method projects.sinks.list.
    Identify any sinks that are routing to the _Default bucket.
  2. For example, to disable the _Default sink, set the disabled field in the LogSink object to true, and then call projects.sinks.update.
    The _Default sink is now disabled; it no longer routes log entries to the _Default bucket.

To disable the other sinks in your Google Cloud project that are routing to the _Default bucket, repeat the previous steps.

To reenable a sink, set the disabled field in the LogSink object to false, and then call projects.sinks.update.
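For illustration, a request that disables the _Default sink might look like the following sketch; it assumes authorized gcloud CLI credentials and uses an updateMask so that only the disabled field changes:

  curl -X PUT \
    -H "Authorization: Bearer $(gcloud auth print-access-token)" \
    -H "Content-Type: application/json" \
    -d '{"disabled": true}' \
    "https://logging.googleapis.com/v2/projects/PROJECT_ID/sinks/_Default?updateMask=disabled"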

Code samples

To use client library code to configure sinks in your chosen languages, see Logging client libraries: Log sinks.

Filter examples

Following are some filter examples that are particularly useful when creating sinks. For additional examples that might be useful as you build your inclusion filters and exclusion filters, see Sample queries.
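For example, an exclusion filter that uses the sample function to drop roughly 75% of low-severity Compute Engine log entries might look like the following; the resource type and sampling fraction are illustrative:

  resource.type = "gce_instance" AND severity <= "INFO" AND sample(insertId, 0.75)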

Restore the _Default sink filter

If you edited the filter for the _Default sink, then you might want to restore this sink to its original configuration. When created, the _Default sink is configured with the following inclusion filter and an empty exclusion filter:

  NOT log_id("cloudaudit.googleapis.com/activity") AND NOT \
  log_id("externalaudit.googleapis.com/activity") AND NOT \
  log_id("cloudaudit.googleapis.com/system_event") AND NOT \
  log_id("externalaudit.googleapis.com/system_event") AND NOT \
  log_id("cloudaudit.googleapis.com/access_transparency") AND NOT \
  log_id("externalaudit.googleapis.com/access_transparency")

Exclude Google Kubernetes Engine container and pod logs

To exclude Google Kubernetes Engine container and pod log entries for GKE system namespaces, use the following filter:

resource.type = ("k8s_container" OR "k8s_pod")
resource.labels.namespace_name = (
"cnrm-system" OR
"config-management-system" OR
"gatekeeper-system" OR
"gke-connect" OR
"gke-system" OR
"istio-system" OR
"knative-serving" OR
"monitoring-system" OR
"kube-system")

To exclude Google Kubernetes Engine node log entries for GKE system logNames, use the following filter:

resource.type = "k8s_node"
logName:( "logs/container-runtime" OR
"logs/docker" OR
"logs/kube-container-runtime-monitor" OR
"logs/kube-logrotate" OR
"logs/kube-node-configuration" OR
"logs/kube-node-installation" OR
"logs/kubelet" OR
"logs/kubelet-monitor" OR
"logs/node-journal" OR
"logs/node-problem-detector")

To view the volume of Google Kubernetes Engine node, pod, and container log entries stored in log buckets, use Metrics Explorer.

Exclude Dataflow logs not required for supportability

To exclude Dataflow log entries that aren't required for supportability, use the following filter:

resource.type="dataflow_step"
labels."dataflow.googleapis.com/log_type"!="system" AND labels."dataflow.googleapis.com/log_type"!="supportability"

To view the volume of Dataflow logs stored in log buckets, use Metrics Explorer.

Supportability

Although Cloud Logging lets you exclude log entries and prevent them from being stored in a log bucket, you might want to consider keeping log entries that help with supportability. Using these log entries can help you troubleshoot and identify issues with your applications.

For example, GKE system log entries are useful to troubleshoot your GKE applications and clusters because they are generated for events that happen in your cluster. These log entries can help you determine if your application code or the underlying GKE cluster is causing your application error. GKE system logs also include Kubernetes Audit Logging generated by the Kubernetes API Server component, which includes changes made using the kubectl command and Kubernetes events.

For Dataflow, we recommend that you, at a minimum, write your system logs (labels."dataflow.googleapis.com/log_type"="system") and supportability logs (labels."dataflow.googleapis.com/log_type"="supportability") to log buckets. These logs are essential for developers to observe and troubleshoot their Dataflow pipelines; without them, users might not be able to use the Dataflow Job details page to view job logs.

What's next