Logstash is an open-source data collection and processing tool. It’s used to collect, parse, enrich, and transform log data from various sources, including log files, event streams, and databases.

Install Logstash

Download Logstash from the official Elastic website and install it per the instructions.

Configure Logstash

A standard Logstash configuration consists of three sections: input, filter, and output. The input and filter sections depend on the sources of logs.

To send logs to Gcore Managed Logging, configure Logstash with Kafka output and enable the Kafka Integration Plugin in your Logstash installation.
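Recent Logstash distributions bundle the Kafka Integration Plugin by default, but you can verify it is present and install it if needed with the logstash-plugin tool (the commands below assume you run them from the Logstash installation directory):

```shell
# Check whether the Kafka integration plugin is installed
bin/logstash-plugin list logstash-integration-kafka

# Install it if it is missing (requires internet access)
bin/logstash-plugin install logstash-integration-kafka
```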

1. Configure Logstash with Kafka output by adding the following data to the logstash.conf file:

    output {
      kafka {
        codec => json
        topic_id => "yourlogin.yourtopic"
        bootstrap_servers => "endpoint"
        sasl_mechanism => "SCRAM-SHA-512"
        security_protocol => "SASL_SSL"
        sasl_jaas_config => "org.apache.kafka.common.security.scram.ScramLoginModule required username='yourlogin' password='yourpassword';"
        key_serializer => "org.apache.kafka.common.serialization.StringSerializer"
        value_serializer => "org.apache.kafka.common.serialization.StringSerializer"
      }
    }

2. Replace the placeholder values:

  • yourlogin.yourtopic: Your username and topic name, separated with a dot (.)
  • endpoint: Kafka endpoint
  • yourlogin: Your login
  • yourpassword: Your password

Tip

You can find your username, password, and topic name in the Gcore Customer Portal on the Logging page. Learn more about logging configuration in our dedicated guide.

For more settings, check out the Kafka output plugin documentation.

3. Save the changes in the Logstash configuration file.

4. Restart Logstash. It will start sending logs to Gcore Managed Logging.
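As a sketch, a complete minimal logstash.conf might combine a file input with the Kafka output described above. The /var/log/app.log path is a hypothetical example; the credentials and endpoint are placeholders to be replaced with your own values:

```
input {
  file {
    path => "/var/log/app.log"
    start_position => "beginning"
  }
}

output {
  kafka {
    codec => json
    topic_id => "yourlogin.yourtopic"
    bootstrap_servers => "endpoint"
    sasl_mechanism => "SCRAM-SHA-512"
    security_protocol => "SASL_SSL"
    sasl_jaas_config => "org.apache.kafka.common.security.scram.ScramLoginModule required username='yourlogin' password='yourpassword';"
    key_serializer => "org.apache.kafka.common.serialization.StringSerializer"
    value_serializer => "org.apache.kafka.common.serialization.StringSerializer"
  }
}
```

Before restarting, you can check the configuration for syntax errors with bin/logstash -f logstash.conf --config.test_and_exit.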