How to Set Up Basic Authentication for Logstash and Elasticsearch

Securing your data pipeline is not optional; it's a necessity. When using the Elastic Stack, one of the most critical security steps is ensuring that every component communicates securely. A common vulnerability is an unprotected connection between Logstash and Elasticsearch, which can allow unauthorized data to be indexed, potentially corrupting your data or exposing sensitive information.

This guide will walk you through the essential process of configuring basic authentication, using a username and password, to secure the data flow from Logstash to your Elasticsearch cluster.

Why Authentication is a Critical First Step

By default, a new Logstash instance might be able to send data to an Elasticsearch cluster without any credentials. This creates a significant security risk. Implementing basic authentication ensures that only authorized Logstash instances can write data to Elasticsearch. This fundamental practice helps prevent:

  • Unauthorized Data Injection: Malicious actors can’t push junk or harmful data into your indices.
  • Accidental Misconfiguration: Prevents other, non-production Logstash instances from accidentally pointing to and writing over your production data.
  • Compliance Violations: Most security and data privacy regulations require access control and authentication for all database interactions.

Prerequisites

Before you begin, ensure you have the following in place:

  • A running and accessible Elasticsearch cluster.
  • A functioning Logstash instance.
  • Administrative access to your Logstash configuration files, typically logstash.conf.
  • The ability to create users and roles in Elasticsearch (usually requires administrator-level access).

Step-by-Step Guide to Securing Your Connection

Follow these steps to establish a secure, authenticated connection between Logstash and Elasticsearch.

Step 1: Create a Dedicated Elasticsearch User

First, it is a best practice to avoid using the elastic superuser for a data pipeline. Instead, you should create a dedicated user with the minimum permissions required for Logstash to function. This adheres to the principle of least privilege.

  1. Create a Role: In Kibana, navigate to Stack Management > Security > Roles. Create a new role, for example, logstash_writer.
  2. Assign Privileges: This role needs permissions to write data to the indices that Logstash will manage. A common configuration is to grant the write, create_doc, and create_index privileges for the relevant index patterns (e.g., logstash-* or my-app-*).
  3. Create a User: Navigate to Stack Management > Security > Users and create a new user, such as logstash_internal. Assign a strong, unique password and assign the logstash_writer role you just created.

You now have a dedicated user (logstash_internal) with a secure password and just enough permissions to do its job.
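
If you prefer the command line or do not have Kibana available, the same role and user can be created through Elasticsearch's security API. Below is a minimal sketch using curl; the host, index pattern, and password are placeholders to adjust, and you may need additional TLS flags (such as --cacert) depending on your setup.

# Create the role (run as a user allowed to manage security, e.g. elastic)
curl -u elastic -X POST "https://your-elasticsearch-host:9200/_security/role/logstash_writer" \
  -H 'Content-Type: application/json' -d'
{
  "indices": [
    {
      "names": ["logstash-*"],
      "privileges": ["write", "create_doc", "create_index"]
    }
  ]
}'

# Create the user and assign it the role
curl -u elastic -X POST "https://your-elasticsearch-host:9200/_security/user/logstash_internal" \
  -H 'Content-Type: application/json' -d'
{
  "password": "your_strong_password_here",
  "roles": ["logstash_writer"]
}'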

Step 2: Configure the Logstash Output Plugin

Next, you need to update your Logstash pipeline configuration to use these new credentials. This is done in the output section of your logstash.conf file.

Your existing, unsecured configuration might look like this:

output {
  elasticsearch {
    hosts => ["https://your-elasticsearch-host:9200"]
    index => "your-index-name-%{+YYYY.MM.dd}"
  }
}

To add basic authentication, you simply need to add the user and password parameters.

output {
  elasticsearch {
    hosts => ["https://your-elasticsearch-host:9200"]
    index => "your-index-name-%{+YYYY.MM.dd}"
    user => "logstash_internal"
    password => "your_strong_password_here"
  }
}

While this works, hardcoding passwords directly in configuration files is a poor security practice. A much better approach is to use the Logstash Keystore.

Step 3 (Recommended): Secure Credentials with the Logstash Keystore

The Logstash Keystore allows you to securely store sensitive values like passwords without exposing them in plain text files.

  1. Add Your Credentials to the Keystore:
    From your Logstash home directory, run the following commands. You will be prompted to enter the values securely.

    # Create the keystore first if one does not already exist
    bin/logstash-keystore create

    # Add the username
    bin/logstash-keystore add ES_USER

    # Add the password
    bin/logstash-keystore add ES_PASS
    
  2. Update Your Configuration to Use Keystore Variables:
    Now, modify your logstash.conf file to reference these values using variable syntax ${...}.

    output {
      elasticsearch {
        hosts => ["https://your-elasticsearch-host:9200"]
        index => "your-index-name-%{+YYYY.MM.dd}"
        user => "${ES_USER}"
        password => "${ES_PASS}"
      }
    }
    

This configuration is far more secure, as your credentials are no longer visible in the pipeline file.
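
To confirm that both keys were stored, you can list the keystore's contents; this prints only the key names, never the values:

bin/logstash-keystore list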

Step 4: Restart Logstash and Verify the Connection

For the changes to take effect, you must restart your Logstash service.
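
Assuming a systemd-managed package install (adjust to however you run Logstash):

sudo systemctl restart logstash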

After restarting, verify that the connection is working correctly:

  1. Check the Logstash Logs: Look for log entries indicating a successful connection to Elasticsearch. If there are authentication errors, they will appear here.
  2. Check for New Data: Send some test logs through your pipeline and confirm they appear in the target Elasticsearch index using Kibana’s Discover tool. If data is flowing, your authenticated connection is working.
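
A quick way to do both from the shell, assuming a package install with the default log location and the index name used earlier (adjust both to your environment):

# Watch the Logstash logs for authentication or connection errors
sudo tail -f /var/log/logstash/logstash-plain.log

# Confirm documents are arriving, using the dedicated user's credentials
curl -u logstash_internal "https://your-elasticsearch-host:9200/your-index-name-*/_count?pretty"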

Common Troubleshooting Tips

  • 401 Unauthorized Error: This almost always means the username or password is incorrect. Double-check the credentials in your Logstash Keystore. A 403 Forbidden error, by contrast, means the credentials are valid but the user lacks the necessary privileges; review the permissions on your logstash_writer role in Elasticsearch.
  • Connection Refused: This is likely a network issue, not an authentication problem. Ensure the Elasticsearch hosts URL is correct and that there are no firewalls blocking the connection on port 9200.
  • SSL/TLS Certificate Errors: If your Elasticsearch cluster uses HTTPS (which it should), you may need to configure SSL settings in the Logstash output plugin, including providing a path to your CA certificate using the cacert option, as sketched below.
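
As a sketch, here is the output block from Step 3 with a CA certificate configured; the certificate path is an assumption and should point at the CA that signed your Elasticsearch node certificates:

output {
  elasticsearch {
    hosts => ["https://your-elasticsearch-host:9200"]
    index => "your-index-name-%{+YYYY.MM.dd}"
    user => "${ES_USER}"
    password => "${ES_PASS}"
    # Assumed path; point this at your own CA certificate
    cacert => "/etc/logstash/certs/ca.crt"
  }
}

Note that newer versions of the Elasticsearch output plugin deprecate cacert in favor of ssl_certificate_authorities, which accepts a list of certificate paths.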

By implementing basic authentication, you’ve taken a simple yet powerful step toward hardening your data infrastructure and protecting your valuable information.

Source: https://kifarunix.com/configure-logstash-elasticsearch-basic-authentication/
