Export logs to Google Cloud Storage
This task is designed to send logs to a Google Cloud Storage bucket.
```yaml
type: "io.kestra.plugin.ee.gcp.gcs.LogExporter"
```
Ship logs to GCP
```yaml
id: log_shipper
namespace: company.team

triggers:
  - id: daily
    type: io.kestra.plugin.core.trigger.Schedule
    cron: "@daily"

tasks:
  - id: log_export
    type: io.kestra.plugin.ee.core.log.LogShipper
    logLevelFilter: INFO
    lookbackPeriod: P1D
    logExporters:
      - id: GCPLogExporter
        type: io.kestra.plugin.ee.gcp.gcs.LogExporter
        projectId: myProjectId
        format: JSON
        maxLinesPerFile: 10000
        bucket: my-bucket
        logFilePrefix: kestra-log-file
        chunk: 1000
```
Properties

bucket
Dynamic: YES
GCS bucket to upload log files to.
The bucket where the log files are going to be exported.
id
Dynamic: NO
Validation RegExp: ^[a-zA-Z0-9][a-zA-Z0-9_-]*
Min length: 1
The identifier of the log exporter.
chunk
Dynamic: YES
Default: 1000
The chunk size for every bulk request.
format
Dynamic: YES
Default: JSON
Possible values: ION, JSON
The format of the exported files.
logFilePrefix
Dynamic: YES
Default: kestra-log-file
The prefix of the log file names. The full file name will be logFilePrefix-localDateTime.json (or .ion, depending on the format).
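For example, with the default prefix and the JSON format, an exported file name would look roughly like the following (the exact timestamp formatting is an assumption here, shown only for illustration):

```
kestra-log-file-2024-05-14T02:00:00.json
```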
maxLinesPerFile
Dynamic: YES
Default: 100000
The maximum number of lines per file.
projectId
Dynamic: YES
The GCP project ID.
scopes
SubType: string
Dynamic: YES
Default: ["https://www.googleapis.com/auth/cloud-platform"]
The GCP scopes to be used.
serviceAccount
Dynamic: YES
The GCP service account key.
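The example above relies on the default authentication and scopes, so `serviceAccount` and `scopes` are not shown there. Below is a minimal sketch of how they could be set explicitly, assuming the service account key JSON is stored in a Kestra secret named GCP_SERVICE_ACCOUNT (the secret name and the use of the secret() function are illustrative assumptions, not requirements of the plugin):

```yaml
logExporters:
  - id: GCPLogExporter
    type: io.kestra.plugin.ee.gcp.gcs.LogExporter
    projectId: myProjectId
    bucket: my-bucket
    # Service account key JSON, read here from an assumed secret
    serviceAccount: "{{ secret('GCP_SERVICE_ACCOUNT') }}"
    # Optional: defaults to ["https://www.googleapis.com/auth/cloud-platform"]
    scopes:
      - "https://www.googleapis.com/auth/cloud-platform"
```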