Query and analyze logs with Log Analytics
This document describes how to query and analyze the log data stored in log buckets that have been upgraded to use Log Analytics. You can query logs in these buckets by using SQL, which lets you filter and aggregate your logs. To view your query results, you can use the tabular form, or you can visualize the data with charts. These tables and charts can be saved to your custom dashboards.
You can query either a log view on a log bucket or an analytics view. When you query a log view, the schema corresponds to that of the LogEntry data structure. Because the creator of an analytics view determines the schema, one use case for analytics views is to transform log data from the LogEntry format into a format that is more suitable for you.
You can use the Logs Explorer to view log entries stored in log buckets in your project, whether or not the log bucket has been upgraded to use Log Analytics.
Log Analytics doesn't deduplicate log entries, which might affect how you write your queries. Also, there are some restrictions when using Log Analytics. For more information about these topics, see the following documents:
- Troubleshoot: There are duplicate log entries in my Log Analytics results.
- Log Analytics: Restrictions.
About linked datasets
Log Analytics supports the creation of linked BigQuery datasets, which let BigQuery have read access to the underlying data. If you choose to create a linked dataset, then you can do the following:
- Join log entry data with other BigQuery datasets.
- Query log data from another service like the BigQuery Studio page or Looker Studio.
- Improve the performance of the queries that you run from the Log Analytics page by running them on your BigQuery reserved slots.
- Create an alerting policy that monitors the result of a SQL query. For more information, see Monitor your SQL query results with an alerting policy.
This document doesn't describe how to create a linked dataset or how to configure Log Analytics to run queries on reserved slots. If you are interested in those topics, then see Query a linked dataset in BigQuery.
Before you begin
This section describes steps that you must complete before you can use Log Analytics.
Configure log buckets
Ensure that your log buckets have been upgraded to use Log Analytics:
- In the Google Cloud console, go to the Logs Storage page:
Go to Logs Storage
If you use the search bar to find this page, then select the result whose subheading is Logging.
- For each log bucket that has a log view that you want to query, ensure that the Log Analytics available column displays Open. If Upgrade is shown, then click Upgrade and complete the dialog.
Configure IAM roles and permissions
This section describes the IAM roles or permissions that are required to use Log Analytics:
- To get the permissions that you need to use Log Analytics and query log views, ask your administrator to grant you the following IAM roles on your project:
  - To query the `_Required` and `_Default` log buckets: Logs Viewer (`roles/logging.viewer`)
  - To query all log views in a project: Logs View Accessor (`roles/logging.viewAccessor`)

  You can restrict a principal to a specific log view either by adding an IAM condition to the Logs View Accessor role grant made at the project level, or by adding an IAM binding to the policy file of the log view. For more information, see Control access to a log view.

  These are the same permissions that you need to view log entries on the Logs Explorer page. For information about additional roles that you need to query views on user-defined buckets or to query the `_AllLogs` view of the `_Default` log bucket, see Cloud Logging roles.
- To get the permissions that you need to query analytics views, ask your administrator to grant you the Observability Analytics User (`roles/observability.analyticsUser`) IAM role on your project.
Query a log view or an analytics view
When you are troubleshooting a problem, you might want to count the log entries with a field that matches a pattern, or compute the average latency of HTTP requests. You can perform these actions by running a SQL query on a log view.
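For example, a count of matching log entries can be expressed with a query like the following sketch. The project, location, bucket, and view IDs in the FROM clause are placeholders, and the `text_payload` column name assumes that the entries you are matching carry a text payload:

```sql
-- Count log entries whose text payload matches a pattern,
-- grouped by severity. The FROM path is a placeholder;
-- replace it with your own project, location, bucket, and view IDs.
SELECT
  severity,
  COUNT(*) AS entry_count
FROM
  `my-project.global._Default._AllLogs`
WHERE
  text_payload LIKE '%connection refused%'
GROUP BY severity
ORDER BY entry_count DESC;
```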
To issue a SQL query to a log view, do the following:
- In the Google Cloud console, go to the Log Analytics page:
Go to Log Analytics
If you use the search bar to find this page, then select the result whose subheading is Logging.
- If you want to load the default query, then do the following:
  - In the Views menu, go to the Logs or Analytics Views section, and select the view that you want to query.
    To find a view, you can use the Filter bar or you can scroll through the list:
    - Log views are listed by BUCKET_ID.LOG_VIEW_ID, where these fields refer to the IDs of the log bucket and log view.
    - Analytics views are listed by LOCATION.ANALYTICS_VIEW_ID, where these fields refer to the location and ID of an analytics view. Analytics views are in Public Preview.
  - In the Schema toolbar, click Query.
    The Query pane is updated with a SQL query that queries the view that you selected.
- If you want to enter a query, then do the following:
  - To specify a time range, we recommend that you use the time-range selector. If you add a WHERE clause that specifies the timestamp field, then that value overrides the setting in the time-range selector and that selector is disabled.
  - For examples, see Sample queries.
  - To query a log view, the FROM clause for your query must have the following format:

    FROM `PROJECT_ID.LOCATION.BUCKET_ID.LOG_VIEW_ID`
  - To query an analytics view, the FROM clause for your query must have the following format:

    FROM `analytics_view.PROJECT_ID.LOCATION.ANALYTICS_VIEW_ID`

  The following describes the meaning of the fields in the previous expressions:
  - PROJECT_ID: The identifier of the project.
  - LOCATION: The location of the log view or the analytics view.
  - BUCKET_ID: The name or ID of the log bucket.
  - LOG_VIEW_ID: The identifier of the log view, which is limited to 100 characters and can include only letters, digits, underscores, and hyphens.
  - ANALYTICS_VIEW_ID: The ID of the analytics view, which is limited to 100 characters and can include only letters, digits, underscores, and hyphens.
- In the toolbar, ensure that a button labeled Run query is displayed.
  If the toolbar displays Run in BigQuery, then click Settings and select Log Analytics (default).
- Run your query.
  The query is executed and the result of the query is shown in the Results tab.
  You can use the toolbar options to format your query, clear the query, and open the BigQuery SQL reference documentation.
- Explore the query results. You can view the results as a table or as a chart. Charts can be saved to a custom dashboard. For more information, see Chart SQL query results.
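Putting the FROM format and the time-range guidance together, a query against a log view might look like the following sketch. All IDs in the FROM clause are placeholder values, and the WHERE clause on the timestamp field takes the place of the time-range selector:

```sql
-- List the most recent entries from the past hour in a log view.
-- The FROM path is a placeholder; replace the project, location,
-- bucket, and view IDs with your own. Filtering on the timestamp
-- field overrides, and disables, the time-range selector.
SELECT
  timestamp,
  severity,
  log_name
FROM
  `my-project.us-central1.my-bucket.my-view`
WHERE
  timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 1 HOUR)
ORDER BY timestamp DESC
LIMIT 100;
```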
Display the schema
The schema of a view defines its structure and the data type of each field. This information is important because it determines how you construct your queries. For example, suppose you want to compute the average latency of HTTP requests. You need to know how to access the latency field and whether it is stored as an integer like `100` or as a string like `"100"`. When the latency data is stored as a string, the query must cast the value to a numeric value before computing an average.
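As an illustration of that casting step, the following sketch averages a latency value that is stored as a string inside the JSON payload. The view path and the `latencyMs` field name are hypothetical placeholders:

```sql
-- Average a latency value stored as a string in the JSON payload.
-- The FROM path and the "latencyMs" field name are hypothetical;
-- JSON_VALUE extracts the scalar as a string, and SAFE_CAST
-- converts it to a number (returning NULL if the cast fails).
SELECT
  AVG(SAFE_CAST(JSON_VALUE(json_payload.latencyMs) AS FLOAT64)) AS avg_latency_ms
FROM
  `my-project.global.my-bucket.my-view`
WHERE
  json_payload.latencyMs IS NOT NULL;
```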
Log Analytics automatically infers the fields of a column when the data type is JSON. To view how often these inferred fields appear in your data, click Options and select View info and description.
To identify the schema, do the following:
- In the Google Cloud console, go to the Log Analytics page:
Go to Log Analytics
If you use the search bar to find this page, then select the result whose subheading is Logging.
- In the Views pane, find the log view or analytics view, and then select the view.
The schema is displayed. For log views, the schema is fixed and corresponds to the LogEntry data structure. For analytics views, you can modify the SQL query to change the schema.