Configuring the Export via Portal
13 Nov 2023

Access the Data Export configuration page in the HUMAN Portal via the following link.

Existing Integrations

A welcome screen shows all existing integrations configured for the selected customer account.

For each integration, the following is indicated:

  • Integration name - a descriptive, custom name given to the integration
  • Integration type - the end provider this integration is configured for (Datadog, Splunk, HTTP, Syslog, S3, SumoLogic)
  • Data type - either Logs or Metrics
  • A toggle button to turn the integration on or off

Configuring an Integration

Adding a new integration (via the button in the upper right corner) or selecting an existing integration routes you to the integration details page, where you can configure the integration as follows:

Integration Details

In the Integration name field, give your integration a descriptive custom name. In the Applications field, select one or more applications from your account to associate with this integration.

Data Type

Select the required data stream to configure. More information on the data exported for each data stream is available in the Data Schema (Metrics) & Data Schema (Logs) sections.

Integrations

Choose the integration type (Datadog, Splunk, HTTP, Syslog, S3, SumoLogic). Once all connection details (see the example below) are filled in, you can use the "Test Connection" button to verify that HUMAN is able to send messages to your endpoint.

Authentication method can be either:

  • Public - No authentication (no further actions required)
  • WAF - Web application firewall, meaning the required endpoint is behind a firewall. To enable access for HUMAN, please add the listed IPs to the firewall's whitelist

Available Integrations

Datadog

We support sending both Metrics and Logs data schemes to Datadog. For additional information on this integration go to Datadog Integration.

Please Note

For the Metrics data stream type, only the Public (no authentication) authentication method is supported.

  • API Key - Your Datadog-provided API key (an optional validation sketch follows this list)
  • Region - US or EU
  • Endpoint URL - A default URL is set according to the chosen region; it can later be changed to a custom URL
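
If you want to sanity-check your Datadog API key outside the portal's "Test Connection" button, a minimal sketch like the one below can be used. It assumes Python with the requests package and Datadog's public key-validation endpoint; the key value is a placeholder, and the portal itself manages the actual export endpoint, so this is purely an optional, illustrative check.

    import requests  # third-party HTTP client: pip install requests

    # Datadog API hosts per region; in the portal the Endpoint URL is pre-filled
    # from the chosen region, so this map exists only for this stand-alone check.
    API_HOSTS = {"US": "https://api.datadoghq.com", "EU": "https://api.datadoghq.eu"}

    def validate_datadog_key(api_key: str, region: str = "US") -> bool:
        # Datadog's key-validation endpoint returns HTTP 200 when the key is accepted.
        resp = requests.get(
            f"{API_HOSTS[region]}/api/v1/validate",
            headers={"DD-API-KEY": api_key},
            timeout=10,
        )
        return resp.status_code == 200

    print(validate_datadog_key("<your-datadog-api-key>", region="EU"))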

Syslog

We support sending only the Logs data scheme.

  • Host - Endpoint host
  • Port - Endpoint port
  • Protocol - TLS or TCP
  • SSL certificate - If the protocol is TLS, please upload the certificate to allow a secure connection
  • Expiry date - If the certificate is valid, the date is automatically extracted
  • Severity - The syslog severity level
  • Facility - The syslog facility code (severity and facility are combined into the message PRI, as shown in the sketch below)
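
For reference, the illustrative Python sketch below shows how a syslog receiver typically expects messages to be framed: the PRI value combines facility and severity (facility * 8 + severity), and a TLS endpoint is reached over a certificate-verified connection. The host name, port, certificate file, and message content are hypothetical placeholders, not values produced by the export itself.

    import socket
    import ssl

    # PRI combines facility and severity: PRI = facility * 8 + severity.
    facility, severity = 16, 6  # example values: local0, informational
    pri = facility * 8 + severity

    # Minimal RFC 5424-style message; the content here is purely illustrative.
    message = f"<{pri}>1 2022-02-16T12:22:00Z human-export - - - - sample log line\n"

    # TLS connection verified against the uploaded CA certificate (hypothetical file name).
    context = ssl.create_default_context(cafile="syslog-ca.pem")
    with socket.create_connection(("syslog.example.com", 6514)) as raw_sock:
        with context.wrap_socket(raw_sock, server_hostname="syslog.example.com") as tls_sock:
            tls_sock.sendall(message.encode("utf-8"))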

Splunk Cloud (HEC)

We support sending both Metrics and Logs data schemes to Splunk Cloud (HEC). For additional information on this integration go to Splunk Integration.

  • URL - The URL used to connect to Splunk (the HEC endpoint)
  • Headers - Usually the "Authorization" header (see the sketch below)
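
As an optional, illustrative check of the URL and Authorization header, an event can be posted to HEC directly. The sketch assumes Python with the requests package; the endpoint, token, and payload below are hypothetical placeholders taken from your own Splunk Cloud setup.

    import requests  # third-party HTTP client: pip install requests

    # Hypothetical HEC endpoint and token; use the values from your Splunk Cloud setup.
    hec_url = "https://your-stack.splunkcloud.com:8088/services/collector/event"
    headers = {"Authorization": "Splunk <your-hec-token>"}

    resp = requests.post(
        hec_url,
        json={"event": {"message": "sample HUMAN export record"}},  # illustrative payload
        headers=headers,
        timeout=10,
    )
    print(resp.status_code, resp.text)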

SumoLogic

We support sending only the Logs data scheme, delivered via POST requests.

Please provide a URL that contains the API endpoint and your unique collector code. You can get this URL from your SumoLogic dashboard.
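
For illustration, a hosted collector URL can be exercised directly with a simple POST before configuring it in the portal. The sketch assumes Python with the requests package; the collector URL and payload are hypothetical placeholders, and the exact URL format comes from your SumoLogic dashboard.

    import requests  # third-party HTTP client: pip install requests

    # Hypothetical HTTP Source URL copied from the SumoLogic dashboard.
    collector_url = "https://endpoint.collection.sumologic.com/receiver/v1/http/<unique-collector-code>"

    resp = requests.post(
        collector_url,
        data='{"message": "sample HUMAN export record"}',  # illustrative payload
        headers={"Content-Type": "application/json"},
        timeout=10,
    )
    print(resp.status_code)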

AWS S3

We support sending only the Logs data scheme.

  • Bucket - The name of the receiving S3 bucket
  • Region - The AWS region the bucket resides in
  • SSE - The bucket's server-side encryption method
  • Access key id - The bucket's access key ID
  • Secret access key - The bucket's secret access key
  • Path pattern - The file's path (does not affect the file name; see the sketch after the examples below). Supported patterns:

Pattern    Path Value
{yyyy}     4-digit year (now), e.g. 2022
{mm}       2-digit month (now), e.g. 02
{dd}       2-digit day (now), e.g. 16
{appid}    The activity's appId

Path pattern example (for the date 16.02.2022): {yyyy}/{mm}/{dd}/production/{appid} -> 2022/02/16/production/HUMAN1q2w3e4r

Allowed characters: a-z, A-Z, 0-9, -/!_.*'()

File name example: 7bd6079a-b5e4-4788-93c5-b939549ce5be_HUMAN3tHq532g_1645014120000000000_1645014180000000000

  1. Guid - 7bd6079a-b5e4-4788-93c5-b939549ce5be
  2. AppId - HUMAN3tHq532g
  3. From timestamp (UNIX epoch, nanoseconds) - 1645014120000000000
  4. To timestamp (UNIX epoch, nanoseconds) - 1645014180000000000
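
For illustration only, the Python sketch below shows how the documented path-pattern placeholders could be resolved into a path, and how an exported file name breaks down into its documented parts. The helper functions are hypothetical and not part of the product.

    from datetime import datetime, timezone

    # Hypothetical helper: resolve the documented placeholders for "now" and an app ID.
    def resolve_path_pattern(pattern: str, app_id: str, now: datetime) -> str:
        return (pattern
                .replace("{yyyy}", f"{now.year:04d}")
                .replace("{mm}", f"{now.month:02d}")
                .replace("{dd}", f"{now.day:02d}")
                .replace("{appid}", app_id))

    # Hypothetical helper: split an exported file name into its documented components.
    def parse_file_name(name: str) -> dict:
        guid, app_id, from_ts, to_ts = name.rsplit("_", 3)
        return {
            "guid": guid,
            "app_id": app_id,
            "from": datetime.fromtimestamp(int(from_ts) / 1e9, tz=timezone.utc),
            "to": datetime.fromtimestamp(int(to_ts) / 1e9, tz=timezone.utc),
        }

    print(resolve_path_pattern("{yyyy}/{mm}/{dd}/production/{appid}",
                               "HUMAN1q2w3e4r",
                               datetime(2022, 2, 16, tzinfo=timezone.utc)))
    # -> 2022/02/16/production/HUMAN1q2w3e4r

    print(parse_file_name(
        "7bd6079a-b5e4-4788-93c5-b939549ce5be_HUMAN3tHq532g_1645014120000000000_1645014180000000000"))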

HTTP Web Hook

We support sending only the Logs data scheme.

  • URL - The HTTP endpoint URL
  • Method - The HTTP method used to send the data (usually POST)
  • Body pattern - e.g. "{ "data": %s }". Use this if the data needs to be sent in a specific format; %s acts as the placeholder for the exported data (see the sketch below)
  • Headers - API keys, authentication headers, and custom headers can all be set here
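
For illustration, the Python sketch below (using the requests package) shows how a body pattern such as "{ "data": %s }" could be filled with an exported record and delivered to your endpoint. The URL, custom header, and record fields are hypothetical placeholders rather than the actual export payload.

    import json
    import requests  # third-party HTTP client: pip install requests

    # Illustrative only: fill a body pattern with an exported record before POSTing it.
    body_pattern = '{ "data": %s }'
    sample_record = {"appId": "HUMAN1q2w3e4r", "action": "block"}  # hypothetical fields

    body = body_pattern % json.dumps(sample_record)

    resp = requests.post(
        "https://example.com/human-export",      # hypothetical webhook URL
        data=body,
        headers={"Content-Type": "application/json",
                 "X-Api-Key": "<your-api-key>"},  # hypothetical custom header
        timeout=10,
    )
    print(resp.status_code)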

Data Streams

Metrics

For the Metrics data type, you can toggle export of the various metrics listed in the Data Schema (Metrics) section.

Logs

For the Logs data type, you can toggle each activity type as well as the specific fields exported for it.

Once an activity type is expanded, you can choose the specific fields, bulk select/deselect all fields (as shown for the 'Captcha' activity below), or use the default exported fields (as shown for the 'Block' activity below).
