Use Case Summary:
In this use case, we forward log messages from a VM to Google Cloud Platform (GCP). This is useful when:
- The local system is unable to store the log messages
- Compliance or security requirements mandate consolidating all logs on a single system and forwarding them to GCP
The scope of this use case is to:
- Handle RFC 3164 messages from remote rsyslog clients
- Parse the message field as JSON when JSON is detected; otherwise send the message straight to Google Cloud Logging
- Rename the location field to city
- Send the parsed logs to Google Cloud Logging
Pre-requisites:
- Configure your rsyslog clients to forward messages to the Stanza VM's IP address on port 5140/udp (see the client-side sketch after this list)
- Restart rsyslog and generate logs using the logger command
- Alternatively, you can send an example message using netcat; be sure to update the IP address and timestamp
- A Google Cloud service account with the Logs Writer role (roles/logging.logWriter) assigned and a JSON key file (see the gcloud sketch after this list)
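The following is a minimal sketch of the client-side setup. The drop-in file name 99-stanza-forward.conf and the address 10.0.0.5 are placeholders; substitute your own file name and the Stanza VM's IP address.

```bash
# On each rsyslog client: forward all facilities/severities to the Stanza VM
# over UDP (a single "@" means UDP in rsyslog; "@@" would mean TCP).
# The file name and 10.0.0.5 are example values; adjust for your environment.
echo '*.* @10.0.0.5:5140' | sudo tee /etc/rsyslog.d/99-stanza-forward.conf

# Apply the change and generate a test message
sudo systemctl restart rsyslog
logger "test message from $(hostname)"

# Or bypass rsyslog entirely and send a raw RFC 3164 message with netcat;
# update the IP address and timestamp before sending.
echo '<13>Oct 11 22:14:15 myhost myapp: hello from netcat' | nc -u -w1 10.0.0.5 5140
```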
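If you do not already have a suitable service account, the commands below sketch one way to create it with gcloud. The account name stanza-logging, the project my-project, and the file key.json are placeholders, not values from this use case.

```bash
# Create a service account (names and project are example values)
gcloud iam service-accounts create stanza-logging --project=my-project

# Grant it the Logs Writer role
gcloud projects add-iam-policy-binding my-project \
  --member="serviceAccount:stanza-logging@my-project.iam.gserviceaccount.com" \
  --role="roles/logging.logWriter"

# Download a JSON key file for the agent to use
gcloud iam service-accounts keys create key.json \
  --iam-account=stanza-logging@my-project.iam.gserviceaccount.com
```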
Architecture:
- One or more Linux systems with rsyslog forwarding configured:
  - Forward to the Stanza VM's IP address on port 5140/udp
- One Stanza VM with the config in this directory:
  - Handles custom parsing by leveraging the JSON parser and move operators
  - Outputs to Google Cloud Logging
Configuration:
```yaml
pipeline:
  # Use the Rsyslog plugin to listen on port 5140/udp
  # for RFC 3164 syslog messages
  - type: rsyslog
    listen_port: 5140
    connection_type: udp
    protocol: rfc3164 (BSD)
    location: UTC

  # If the syslog message field is JSON, route to the JSON
  # parser, otherwise go straight to Google
  - type: router
    default: google_cloud_output
    routes:
      - output: json_parser
        expr: '$record.message matches "^{.*}\\s*$"'

  - type: json_parser
    parse_from: $record.message

  # Rename the location field to city, if it exists
  - type: move
    if: '$record.location != nil'
    from: location
    to: city

  # Google output does not require a project or credential
  # file when running with the cloud logging scope
  - type: google_cloud_output
```
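To exercise the JSON route end to end, you can send a JSON-bodied syslog message from any client. This is a sketch, and 10.0.0.5 is a placeholder for the Stanza VM's IP address.

```bash
# Send an RFC 3164 message whose body is JSON; the pipeline should route it
# through json_parser and rename "location" to "city" before shipping to GCP.
logger --rfc3164 -d -n 10.0.0.5 -P 5140 '{"location": "Detroit", "event": "user login"}'
```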