
The Infoblox Data Connector requires a BloxOne host to which it is tethered. Before following the steps in this guide, deploy a host that meets the minimum requirements, including a 750 GB disk. For details on requirements and deployment options, see Minimum System Requirements for Hosts. Data Connector works with various components in a hybrid cloud environment to deliver source data to configured destinations. To deploy the end-to-end Data Connector solution, set up your environment, sources, data types, destinations, data filters, and traffic flows according to the requirements and instructions provided by Infoblox.

Note

This release of Data Connector supports only IPv4 addresses.

To deploy the end-to-end Data Connector solution, do the following:

Deploying a Data Connector VM

  1. Review the prerequisites and requirements, then set up your environment and components. For details, see BloxOne Connectivity and Service Requirements.
  2. If you do not already have a join token, create one. For details, see Creating Join Tokens.
  3. Set up a Data Connector VM (virtual machine) by using either the Docker package or the OVA package (for container and VM deployments, respectively) that Infoblox provides, and connect the virtual machine to the Cloud Services Portal using the join token. For information on all installer packages, including the Docker and OVA installers, see Downloading BloxOne Apps. All installer packages are available under Administration > Downloads in the Cloud Services Portal, listed in the drop-down menu of the Hosts section of the page.
    Note that you deploy a Data Connector VM as a host running the Data Connector service in either BloxOne DDI or BloxOne Threat Defense. For details, see Deploying Hosts.
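
When deploying the host, you can quickly confirm that its disk meets the 750 GB minimum mentioned above. The following is a minimal sketch, assuming you have shell access to the underlying Linux VM; the device name /dev/sda is an example and may differ in your environment:

     lsblk -b -d -o NAME,SIZE /dev/sda
     df -h /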

Generate and Install a Self-signed Certificate

A self-signed certificate is not the only option available, but it is useful for getting started quickly. The self-signed certificate is used later in NIOS Grid Manager and when configuring the source in Data Connector in the Cloud Services Portal. The .pem file is used in the Data Connector source configuration for RPZ logs. To generate and install a self-signed certificate, do the following:

1. Create CA certificates by performing the following command:

     openssl req -x509 -sha256 -days 365 -nodes -newkey rsa:2048 \
       -subj "/C=IN/ST=KA/L=Bglr/O=Infoblox/OU=Cloud/CN=*" \
       -keyout rootCA.key -out rootCA.crt
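
To confirm that the CA certificate was generated as expected (an optional check, not part of this guide's required steps), you can inspect its subject and validity dates:

     openssl x509 -in rootCA.crt -noout -subject -dates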

2. Create a key and a certificate signing request:

     openssl req -batch -new -newkey rsa:2048 -nodes -keyout server.key \
       -out rpz.csr -subj "/C=IN/ST=KA/L=Bglr/O=Infoblox/OU=Cloud/CN=*"
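
Optionally, you can verify the signature on the request and confirm its subject before signing it:

     openssl req -in rpz.csr -noout -verify -subject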

3. Create a server certificate by signing it with the CA:

     openssl x509 -req -in rpz.csr -CA rootCA.crt -CAkey rootCA.key \
       -CAcreateserial -out server.crt
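
To check that the signed certificate chains back to the CA (optional), you can run:

     openssl verify -CAfile rootCA.crt server.crt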

4. Create a new .pem file by concatenating the server.crt and server.key file contents. Use this new .pem file as the certificate for RPZ logs in the GUI.

     cat server.crt server.key > rpz.pem
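
As an optional sanity check, you can confirm that rpz.pem contains both the certificate and the private key:

     openssl x509 -in rpz.pem -noout -subject
     openssl rsa -in rpz.pem -noout -check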

5. Use the rootCA.crt file in NIOS to configure Secure TCP, and use rpz.pem in the source for the Traffic Flow configuration under Data Connector.

Creating a Data Connector Service

  1. Log in to the Cloud Services Portal.
  2. Create a Data Connector service instance and associate it with a configured host. For details, see Creating Services.

Configuring Traffic Flows

  1. Before you configure traffic flows for Data Connector, you must first set up the sources from which you want Data Connector to collect data. Note that BloxOne Threat Defense Cloud is preconfigured as both a source and a destination, and BloxOne DDI is preconfigured as a source; no configuration is required for these. For details, see Configuring Sources.
  2. You must also configure the destinations to which you want Data Connector to send source data. For details, see Configuring Destinations.
  3. Optionally, you can add ETL (Extract, Transform, Load) filters to extract specific source data before Data Connector sends the data to the configured destinations.
    For details, see Configuring ETL Filters.
  4. Once you have configured sources, destinations, and ETL filters, configure the traffic flows that define which types of source data are collected from the sources and the destinations to which the data is sent. For details, see Configuring Traffic Flows.

Known Limitations
● You can assign only one destination to each traffic flow that you create (see the sections "Adding Traffic Flows" and "Adding Destinations" for more information on traffic flows and destinations).
