Setting Up ELK for Use with Data Connector

To set up ELK for use with Data Connector, do the following:

  1. Depending on your setup, create a new input configuration file or update an existing file. Logstash supports different plugins for accepting messages from Data Connector. Infoblox recommends using TCP input with CEF codec. The TCP port should match the TCP port configured on the Syslog destination.

    Sample input configuration:

    input {
      tcp {
            port => 5534
            codec => cef{ }
            type => syslog
            tags => ["cdc"]
      }
    }
    
    filter {
      if "cdc" in [tags] {
      }
    }
    
    
    
    output {
      elasticsearch {
             hosts => ["127.0.0.1:9200"]
             index => "cdc-syslog"
      }
    }
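Once the input above is in place, you can check that the TCP input accepts a CEF-formatted event before wiring up the Cloud Services Portal. The sketch below is illustrative: only the port (5534) comes from the sample configuration; the host, vendor, product, and extension fields are placeholder values.

```python
import socket

# Hypothetical values -- only the port (5534) comes from the sample input
# configuration above; everything else is illustrative.
LOGSTASH_HOST = "127.0.0.1"
LOGSTASH_PORT = 5534

def build_cef_event(vendor, product, version, signature, name, severity, extension):
    """Build a CEF:0 message line in the shape the cef codec expects."""
    return "CEF:0|{}|{}|{}|{}|{}|{}|{}".format(
        vendor, product, version, signature, name, severity, extension)

event = build_cef_event("Infoblox", "Data Connector", "1.0",
                        "100", "DNS Query", "3",
                        "src=10.0.0.1 dst=10.0.0.2 msg=example.com")
print(event)

# Uncomment to actually send the event to the Logstash TCP input:
# with socket.create_connection((LOGSTASH_HOST, LOGSTASH_PORT)) as s:
#     s.sendall((event + "\n").encode("utf-8"))
```

If the event arrives, it should appear in the cdc-syslog index once the output stage is working.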


  2. Restart Logstash.
  3. Configure the Syslog destination in the Cloud Services Portal:
    • FQDN/IP: the IP address or hostname of Logstash
    • Port: the TCP port specified in the input configuration
    • Format: the CEF output format
    • Insecure Mode: selected

     
Image: The Syslog destination configuration screen in the Cloud Services Portal.

For more information, see Configuring Destinations.

Configuring a Traffic Flow

To push traffic to ELK, create a traffic flow and select the created destination as a destination for the traffic flow. For more information, see Configuring Traffic Flows.

Checking Events in Kibana

To check events in Kibana, do the following:

  1. In the side menu under Kibana, go to Management > Index Patterns.
  2. In the Index pattern field, type in “cdc-syslog”.
  3. Click Next step.
  4. In Time Filter field name, select @timestamp.
  5. Click Create index pattern.

    The screenshot shows Kibana, where the Management tab is open and shows the Create index pattern pane, with @timestamp selected in the Time Filter field name. The pane also contains the Create index pattern button.
    Image: The "Create index pattern" screen in Kibana.


  6. In the side menu under Kibana, go to Discover and select cdc-syslog index.

    The screenshot shows Kibana, where the Discover tab is open and cdc-syslog index is selected.
    Image: The Kibana Discover view displaying the cdc-syslog events.


  7. Continue sending data to Logstash and verify that new events appear in Discover:

    The screenshot shows Kibana, where the Discover tab is open and shows incoming data.
    Image: The Kibana Discover view displaying the data being sent to Logstash.
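The Discover view issues search requests against the index behind the scenes. As a sanity check outside Kibana, you can query Elasticsearch directly; the sketch below only builds the request body (the host and index name come from the sample output configuration above, and the 15-minute window is an arbitrary example).

```python
import json

# Host and index name from the sample Logstash output configuration above.
ES_URL = "http://127.0.0.1:9200"
INDEX = "cdc-syslog"

# Fetch the most recent events, newest first, similar to what the
# Discover view shows; the 15-minute range is an illustrative choice.
query = {
    "size": 10,
    "sort": [{"@timestamp": {"order": "desc"}}],
    "query": {"range": {"@timestamp": {"gte": "now-15m"}}},
}
body = json.dumps(query)
print(body)

# With a running cluster, send it with:
# import urllib.request
# req = urllib.request.Request(
#     ES_URL + "/" + INDEX + "/_search", data=body.encode(),
#     headers={"Content-Type": "application/json"})
# print(urllib.request.urlopen(req).read().decode())
```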

Estimating Performance

Event processing throughput depends on the Logstash and ELK configuration, the Data Connector VM parameters, and the Data Connector configuration and load. The maximum Data Connector throughput to Logstash via Syslog TCP is 18,000 events per second (EPS). This maximum was measured with the CEF codec disabled and the Logstash output directed to /dev/null.

To estimate EPS, sum all event types received by Data Connector. For example, if on NIOS the DNS Query, DNS Response, and RPZ logs are enabled, EPS is calculated by doubling the average DNS QPS (each query also produces a response event) and adding the average number of RPZ hits per second.
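The calculation above can be sketched as follows; the traffic averages are illustrative placeholders to be replaced with your own measured values.

```python
# Hypothetical averages -- substitute your own measured values.
avg_dns_qps = 5000.0      # average DNS queries per second on NIOS
avg_rpz_hits_ps = 200.0   # average RPZ hits per second

# With DNS Query, DNS Response, and RPZ logging enabled, each query also
# produces a response event, so DNS traffic counts twice.
estimated_eps = 2 * avg_dns_qps + avg_rpz_hits_ps
print(estimated_eps)  # 10200.0

# Compare against the measured Syslog TCP maximum from the text above.
MAX_LOGSTASH_EPS = 18000
assert estimated_eps <= MAX_LOGSTASH_EPS
```

If the estimate exceeds the 18,000 EPS maximum, the Logstash pipeline will not keep up with the Data Connector load.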
