
syslog-ng Premium Edition 7.0.34 - Administration Guide


Error messages you may encounter while using the google_bigquery() destination

The error messages are sent back in a JSON response with the following format:

            {
                "kind": string,
                "insertErrors": [
                  {
                    "index": integer,
                    "errors": [
                      {
                        object (ErrorProto)
                      }
                    ]
                  }
                ]
            }
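
For example, a response reporting the "no such field" error described below could look like the following. The values are illustrative only: the field name user_name is a hypothetical example, and the kind string follows the format of Google's tabledata.insertAll response.

            {
                "kind": "bigquery#tableDataInsertAllResponse",
                "insertErrors": [
                  {
                    "index": 0,
                    "errors": [
                      {
                        "reason": "invalid",
                        "location": "non_existent_field",
                        "message": "no such field: user_name."
                      }
                    ]
                  }
                ]
            }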
        

The following list describes the possible error messages that you may encounter while using the google_bigquery() destination, each followed by its description. A configuration sketch showing where the referenced parameters appear follows the list.
                        {"reason":"invalid","location":"non_existent_field","message":"no such field: <field-name>."}
                    

This error indicates that data was sent to a field that does not exist. The name of the field in question appears in place of <field-name> in the error. This error can only occur when ignore_unknown_values(False) is set.

                        {"message":"Not found: Project <project-name>","domain":"global","reason":"notFound"}
                    

This error indicates that the project <project-name> specified in the project() parameter does not exist on Google Cloud.

                        {"reason":"invalid","message":"Missing required field: Msg_0_CLOUD_QUERY_TABLE.<field-name>."}
                    

This error indicates that either no value was specified for <field-name> in the fields() parameter, or that syslog-ng PE has no value for it. This only happens for fields whose mode is 'REQUIRED' in the schema.

                        {"message":"Not found: Dataset <project-name>:<dataset-name>","domain":"global","reason":"notFound"}
                    

This error indicates that the <project-name> project exists, but the <dataset-name> dataset specified in the dataset() parameter does not exist in BigQuery.

                        {"message":"Not found: Table <project-name>:<dataset-name>.<table-name>","domain": "global","reason": "notFound"}
                    

This error indicates that the <project-name> project and the <dataset-name> dataset exist, but the <table-name> table specified in the table() parameter does not exist in the <project-name>:<dataset-name> dataset in BigQuery.

                        {"reason":"invalid","location":"<field-name>","message":"Invalid <type> string <given-value>"}
                    

This error indicates that the value of <field-name> in the fields() parameter is not valid for BigQuery's <type> field type. The rejected value appears in place of <given-value> in the error.
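
The following minimal sketch shows where the parameters referenced by these errors appear in a google_bigquery() destination. The project, dataset, table, and field names are placeholders, and the contents of fields() assume the value-pairs syntax referenced in Specifying data types in value-pairs; see google_bigquery(): Sending logs to a Google BigQuery table for the authoritative list of options.

    destination d_bigquery {
        google_bigquery(
            project("my-gcp-project")      # must exist in Google Cloud ("Not found: Project")
            dataset("my_dataset")          # must exist under the project ("Not found: Dataset")
            table("my_table")              # must exist in the dataset ("Not found: Table")
            fields(
                # Placeholder mapping: the field names must match the table's schema.
                pair("message", "${MESSAGE}")
                pair("host", "${HOST}")
            )
            ignore_unknown_values(False)   # spelled as in the description above; with this setting,
                                           # unknown fields surface the "no such field" error
        );
    };

You can then reference the destination from a log path, for example log { source(s_local); destination(d_bigquery); };, where s_local is a placeholder for one of your configured sources.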

google_bigquery_managedaccount(): Sending logs to a Google BigQuery table authenticated by Google Cloud managed service account

From syslog-ng Premium Edition (syslog-ng PE) version 7.0.34, you can use the google_bigquery_managedaccount() destination to send your own data to a Google BigQuery table with syslog-ng PE by utilizing the HTTP REST interface of the service. This authentication method requires syslog-ng PE to run inside Google Cloud (for example, on a VM instance running in Google Cloud Compute Engine).

The google_bigquery_managedaccount() destination is equivalent to syslog-ng PE's google_bigquery(): Sending logs to a Google BigQuery table destination; only the authentication method is different.

For more information about Google BigQuery, see What Is BigQuery?. The rest of this section and its subsections assume that you are familiar with the Google BigQuery service, and its concepts and terminology.

Limitations

The current implementation of the google_bigquery() and google_bigquery_managedaccount() destinations has the following limitations:

The table where the messages are sent must be created manually, including its owner dataset. In addition, the messages must be mapped to the target BigQuery table's schema using the fields() parameter.

NOTE: To convert the key-value pairs to match the schema of the Table, see Specifying data types in value-pairs.

NOTE: The google_bigquery() and google_bigquery_managedaccount() destinations cannot fetch logs.
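
As an illustration only, a google_bigquery_managedaccount() destination could look like the sketch below. It assumes the destination takes the same project(), dataset(), table(), and fields() options as google_bigquery(), and that no credential option is needed because authentication is provided by the managed service account of the Google Cloud VM running syslog-ng PE.

    destination d_bigquery_managed {
        google_bigquery_managedaccount(
            # Same placeholder options as the google_bigquery() sketch above;
            # authentication comes from the VM's managed service account.
            project("my-gcp-project")
            dataset("my_dataset")
            table("my_table")
            fields(
                pair("message", "${MESSAGE}")
                pair("host", "${HOST}")
            )
        );
    };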

google_pubsub(): Sending logs to the Google Cloud Pub/Sub messaging service

From syslog-ng Premium Edition (syslog-ng PE) version 7.0.21, you can use the google_pubsub() destination to publish messages to your own Google Pub/Sub messaging infrastructure, with syslog-ng PE acting as a "Publisher" entity and utilizing the HTTP REST interface of the service.

Similarly to syslog-ng PE's stackdriver() destination, the google_pubsub() destination sends messages to an asynchronous messaging service within Google's infrastructure.

For more information about Google Pub/Sub's messaging service, see What Is Pub/Sub?. The rest of this section and its subsections assume that you are familiar with the Google Pub/Sub messaging service, and its concepts and terminology.
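
As a hedged illustration, a minimal google_pubsub() destination could look like the sketch below. The option names project() and topic() are assumptions based on Pub/Sub terminology and are not confirmed by this excerpt; see google_pubsub(): Sending logs to the Google Cloud Pub/Sub messaging service for the actual options and the required authentication settings.

    destination d_pubsub {
        google_pubsub(
            # Assumed option names, shown for illustration only.
            project("my-gcp-project")     # the Google Cloud project that owns the topic
            topic("my-log-topic")         # the Pub/Sub topic that syslog-ng PE publishes to
        );
    };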
