The Cloud Logging API lets you programmatically accomplish logging-related tasks, including reading and writing log entries, creating log-based metrics, and managing sinks to route logs.
See the following reference documentation for the Logging API:
For details on the limits that apply to your usage of the Logging API, see Logging API quotas and limits.
Enable the Logging API
The Logging API must be enabled before you can use it. For instructions, see Enable the Logging API.
Access the Logging API
You can invoke the Logging API indirectly by using a command-line interface, such as the gcloud logging command, or by using a client library written for a high-level programming language. For more information, see the following reference documentation.

Following are some tips for using the Logging API effectively.
Read and list logs efficiently
To efficiently use your entries.list quota, try the following:
Set a large pageSize: In the request body, you can set the pageSize parameter up to and including the maximum value of an int32 (2,147,483,647). Setting the pageSize parameter to a higher value lets Logging return more entries per query, reducing the number of queries needed to retrieve the full set of entries that you're targeting.
Set a large deadline: When a query nears its deadline, Logging prematurely terminates and returns the log entries scanned thus far. If you set a large deadline, then Logging can retrieve more entries per query.
Retry quota errors with exponential backoff: If your use case isn't time-sensitive, then you can wait for the quota to replenish before retrying your query. The pageToken parameter is still valid after a delay.
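The last two tips can be combined, as in the following sketch, which pages through entries.list-style responses with a large pageSize and retries quota errors with exponential backoff, reusing the same pageToken after each delay. The fetch callable, the QuotaExceededError class, and all names here are illustrative placeholders, not part of the Logging API or its client libraries.

```python
import random
import time

class QuotaExceededError(Exception):
    """Raised by the (hypothetical) fetch callable when read quota is exhausted."""

def list_all_entries(fetch, resource_name, filter_, page_size=1000, max_retries=5):
    """Page through entries.list-style responses via `fetch`, retrying quota
    errors with exponential backoff.

    `fetch` takes a request-body dict and returns a response dict; in practice
    it would be an authenticated call to the Logging API's entries.list method.
    """
    body = {
        "resourceNames": [resource_name],
        "filter": filter_,
        "pageSize": page_size,  # larger pages mean fewer queries against quota
    }
    entries = []
    while True:
        for attempt in range(max_retries):
            try:
                response = fetch(dict(body))
                break
            except QuotaExceededError:
                # The pageToken remains valid after a delay, so it's safe to
                # wait for quota to replenish and retry the same query.
                time.sleep(min(2 ** attempt + random.random(), 32))
        else:
            raise RuntimeError("read quota did not replenish after retries")
        entries.extend(response.get("entries", []))
        token = response.get("nextPageToken")
        if not token:
            return entries
        body["pageToken"] = token  # resume exactly where the last page ended
```

In a real client you would replace `fetch` with an authenticated request, but keeping it as a parameter makes the paging and retry logic easy to test in isolation.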
To efficiently use your entries.write quota, increase your batch size so that each request carries a larger number of log entries; this reduces the number of write requests you make. Logging supports requests with up to 10 MB of data.
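As a sketch of this batching approach, the helper below packs log entries into entries.write-style request bodies, starting a new batch before the serialized request would exceed the size limit. The function name and the byte-size accounting are illustrative assumptions, not part of any client library; the 10 MB ceiling comes from the request-size limit noted above.

```python
import json

MAX_REQUEST_BYTES = 10 * 1024 * 1024  # Logging accepts requests up to 10 MB

def batch_write_bodies(log_name, resource, entries, max_bytes=MAX_REQUEST_BYTES):
    """Yield entries.write-style request bodies, packing entries until adding
    the next one would push the serialized body past max_bytes."""
    base = {"logName": log_name, "resource": resource, "entries": []}
    overhead = len(json.dumps(base).encode("utf-8"))
    batch, size = [], overhead
    for entry in entries:
        entry_bytes = len(json.dumps(entry).encode("utf-8")) + 2  # ", " separator
        if batch and size + entry_bytes > max_bytes:
            yield {**base, "entries": batch}
            batch, size = [], overhead
        # Note: a single entry larger than max_bytes is still emitted alone.
        batch.append(entry)
        size += entry_bytes
    if batch:
        yield {**base, "entries": batch}
```

Fewer, larger requests consume the entries.write quota more slowly than many small ones, which is why packing batches close to the size limit pays off.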
The method you use to retrieve log entries is entries.list, but this method isn't intended for high-volume retrieval of log entries. Using it that way might quickly exhaust your quota for read requests.
If you need contemporary or continuous querying, or bulk retrieval of log entries, then configure sinks to send your log entries to Pub/Sub. When you create a Pub/Sub sink, you send the log entries that you want to process to a Pub/Sub topic, and then consume the log entries from there.
This approach has the following advantages:
You can create Pub/Sub sinks to route log entries to a variety of analytics platforms. For an example, see Scenarios for routing Cloud Logging data: Splunk.
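As a minimal sketch of the sink configuration described above, the helper below builds the body for creating a sink that routes matching log entries to a Pub/Sub topic. The function and the sample names are hypothetical; the `pubsub.googleapis.com/projects/.../topics/...` destination string follows the sink resource's destination format for Pub/Sub.

```python
def pubsub_sink_body(sink_name, project, topic, filter_=""):
    """Return a sink resource dict that routes matching log entries to Pub/Sub.

    In practice this body would be sent to the Logging API's sinks.create
    method (or expressed as flags to `gcloud logging sinks create`).
    """
    body = {
        "name": sink_name,
        # Destination format for a Pub/Sub topic.
        "destination": f"pubsub.googleapis.com/projects/{project}/topics/{topic}",
    }
    if filter_:
        body["filter"] = filter_  # only route entries matching this filter
    return body
```

Once the sink exists, subscribers on the topic consume the routed entries continuously, instead of polling entries.list.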
Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the Apache 2.0 License. For details, see the Google Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates.
Last updated 2025-08-11 UTC.