Configuring Log Streaming

Introduction

SGNL can stream events to leading SIEM and storage providers while still making logs available within the SGNL Console and APIs. SGNL logs are formatted as individual JSON entries with a well-defined schema. An example access decision log entry takes the form of:

{
   "accessDecision": "Allow",
   "action": "access",
   "assetId": "aws::arn:1111",
   "clientId": "a5c5f108-1111-4b9a-2222-ed9787e3ce6b",
   "eventType": "sgnl.accessSvc.decision",
   "integrationDisplayName": "AWS",
   "integrationId": "a5c5f108-3333-4b9a-4444-ed9787e3ce6b",
   "level": "info",
   "msg": "Access search service decision",
   "principalId": "[email protected]",
   "requestId": "a5c5f108-5555-4b9a-6666-ed9787e3ce6b",
   "tenantId": "a5c5f108-7777-4b9a-8888-ed9787e3ce6b",
   "timeAtEvaluation": "2024-06-28T20:05:03Z",
   "time_now": "2024-06-28T20:05:03.289737017Z",
   "ts": "2024-06-28T20:05:03Z"
}
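Because each entry is self-contained JSON, downstream tooling can parse entries one at a time with any JSON library. A minimal Python sketch using (a subset of) the example entry above:

```python
import json

# One SGNL log entry, as streamed (field names from the example above)
entry = json.loads("""
{
   "accessDecision": "Allow",
   "eventType": "sgnl.accessSvc.decision",
   "principalId": "[email protected]",
   "assetId": "aws::arn:1111",
   "ts": "2024-06-28T20:05:03Z"
}
""")

# Route on the event type, then inspect the decision details
if entry["eventType"] == "sgnl.accessSvc.decision":
    print(f'{entry["principalId"]} -> {entry["assetId"]}: {entry["accessDecision"]}')
```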

To get started with Log Streaming, open the Admin section of the SGNL Console and add an integration.

Available Log Streaming Integrations

Splunk

SGNL uses the Splunk HTTP Event Collector (HEC) to stream logs. To get started, log in to both the SGNL Console and your Splunk console.

In Splunk:

  1. Choose to Add Data from the Splunk Launcher
  2. Choose Monitor to add log data from an HTTP endpoint
  3. Choose the HTTP Event Collector method for receiving data, and give the collector a descriptive name, such as SGNL followed by your client name
  4. On the next page, choose the Automatic source type and select which indices you’d like SGNL log data to flow into
  5. On the final page, review your settings and ensure you copy your token – you’ll need this to configure SGNL in a moment

In SGNL:

  1. Log in to the Console and choose Admin -> Add Log Stream -> Choose Splunk
  2. Give the Log Stream a name and optionally a description
  3. Enter your HEC Collector Address and Port, e.g. https://sgnl-log-stream.splunkcloud.com:8088
  • Note: SGNL will auto-append the relevant path information to this URL
  4. Paste the token that you copied from Splunk exactly as you copied it, then save the configuration

SGNL - Splunk Log Stream Configuration

The next events that are generated will start streaming to Splunk – you should see them appear in Splunk Search. You can trigger logs by making access evaluation requests, configuring and synchronizing a System of Record, or creating triggers, rules, and actions inside the CAEP Hub.
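If you want to sanity-check your HEC token outside of SGNL, you can hand-build the same kind of request SGNL sends. A sketch of the request shape – the collector address and token are placeholders, and /services/collector/event is Splunk's standard HEC endpoint path (the part SGNL auto-appends for you):

```python
import json

# Placeholder values – substitute your own collector address and HEC token
hec_base = "https://sgnl-log-stream.splunkcloud.com:8088"
hec_token = "YOUR-HEC-TOKEN"

# Splunk's standard HEC event endpoint, appended to the base address
url = f"{hec_base}/services/collector/event"
headers = {"Authorization": f"Splunk {hec_token}"}
payload = json.dumps({"event": {"eventType": "sgnl.accessSvc.decision",
                                "accessDecision": "Allow"}})

# POST url/headers/payload with any HTTP client to verify the token works
print(url)  # https://sgnl-log-stream.splunkcloud.com:8088/services/collector/event
```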

AWS S3

SGNL can also stream logs to AWS S3 Buckets. Setup is straightforward, but may depend on your chosen authentication method:

  1. Log in to the Console and choose Admin -> Add Log Stream -> Choose AWS S3

  2. Give the Log Stream a name and optionally a description

  3. Enter your Bucket Name, e.g. ‘sgnl-logs’

  4. Enter the Region where your S3 Bucket is instantiated

  5. Choose the Auth Method, either Access Key or Assume Role:

    If using the ‘Access Key’ method:

    • Enter your AWS Access Key ID (e.g. AKIA311111111111111DYX)
    • Enter your AWS Secret Key
    • Click Save

    If using the ‘Assume Role’ method:

    With the ‘Assume Role’ method, you must also configure a Trust Policy in AWS that allows SGNL to assume the role in your AWS account. A Trust Policy may look something like the following:

       {
          "Version": "2012-10-17",
          "Statement": [
             {
                   "Effect": "Allow",
                   "Principal": {
                      "AWS": [
                         "arn:aws:iam::059615723535:role/nqr-1-log-forwarder-role",
                         "arn:aws:iam::059615723535:role/nqr-2-log-forwarder-role"
                      ]
                   },
                   "Action": [
                      "sts:AssumeRole",
                      "sts:TagSession"
                   ]
             }
          ]
       }
    

    In particular, note the AWS Principals detailed here. These consist of SGNL’s Account ID and the identifier for the shard you are using – in the example above, nqr.

    If you don’t know your shard ID, you can ask SGNL Support, or perform a DNS lookup on your client name:

    % dig myclient.sgnl.cloud
    
    ; <<>> DiG 9.10.6 <<>> myclient.sgnl.cloud
    ;; global options: +cmd
    ;; Got answer:
    ;; ->>HEADER<<- opcode: QUERY, status: NOERROR, id: 845
    ;; flags: qr rd ra; QUERY: 1, ANSWER: 5, AUTHORITY: 0, ADDITIONAL: 1
    
    ;; OPT PSEUDOSECTION:
    ; EDNS: version: 0, flags:; udp: 1232
    ;; QUESTION SECTION:
    ;myclient.sgnl.cloud.	IN	A
    
    ;; ANSWER SECTION:
    myclient.sgnl.cloud. 300	IN	CNAME	nqr.sgnl.cloud.
    nqr.sgnl.cloud.		60	IN	CNAME	nqr-1.sgnl.cloud.
    nqr-1.sgnl.cloud.	300	IN	CNAME	k8s-istioing-istioigw-f95a003938-f4568858c6540cfb.elb.us-east-1.amazonaws.com.
    

    In the answer section, the first result gives you the 3-letter code for your shard – in this case nqr.sgnl.cloud. results in SGNL AWS Roles of:

    • “arn:aws:iam::059615723535:role/nqr-1-log-forwarder-role”
    • “arn:aws:iam::059615723535:role/nqr-2-log-forwarder-role”

    Depending on your deployment, the AWS Account ID may differ for your SGNL Client. Please contact support if you need further information.
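The shard-to-role mapping can also be derived programmatically. A sketch that extracts the shard code from the first CNAME answer and builds the expected role ARNs – the account ID is the one from the example above and may differ for your deployment:

```python
# First CNAME answer from the dig output above
cname = "nqr.sgnl.cloud."

# The shard code is the leftmost label of the CNAME target
shard = cname.split(".")[0]

# Example SGNL account ID from above; yours may differ – check with support
account_id = "059615723535"
roles = [f"arn:aws:iam::{account_id}:role/{shard}-{n}-log-forwarder-role"
         for n in (1, 2)]
print(roles)
```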

Datadog

Prerequisites

Before configuring Datadog log streaming in SGNL, ensure you have:

  • A Datadog account with appropriate permissions
  • A Datadog API key for log ingestion
  • Your Datadog site identifier (e.g., datadoghq.com, datadoghq.eu, us3.datadoghq.com)

Configuring SGNL

  1. Log in to the Console and choose Admin -> Add Log Stream -> Choose Datadog
  2. Give the Log Stream a name and optionally a description
  3. Enter your Datadog Site (e.g., datadoghq.com, datadoghq.eu, us3.datadoghq.com, us5.datadoghq.com)
  4. Enter your Datadog API Key for log ingestion
  5. (Optional) If you are using an on-premises or custom Datadog deployment, enter the custom endpoint URL. This field should typically be left blank for standard Datadog cloud deployments
  6. Click Save

The Datadog log stream will begin forwarding SGNL events to your Datadog instance. You can view and query these logs in the Datadog Logs Explorer.
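If you want to confirm your site and API key independently of SGNL, Datadog's public logs intake endpoint accepts a JSON array of log events. A sketch of that request shape – the site and key are placeholders:

```python
import json

# Placeholder site and API key – substitute your own
site = "datadoghq.com"
api_key = "YOUR-DATADOG-API-KEY"

# Datadog's standard logs intake endpoint for a given site
url = f"https://http-intake.logs.{site}/api/v2/logs"
headers = {"DD-API-KEY": api_key, "Content-Type": "application/json"}

# The body is a JSON array of log events; the message carries the raw entry
body = json.dumps([{"ddsource": "sgnl",
                    "message": json.dumps({"eventType": "sgnl.accessSvc.decision"})}])

# POST url/headers/body with any HTTP client to verify the key works
print(url)
```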

Loki

Prerequisites

Before configuring Loki log streaming in SGNL, ensure you have:

  • A Loki instance deployed and accessible
  • The Loki endpoint URL (e.g., http://loki.example.com:3100)
  • Authentication credentials:
    • For Bearer token authentication: A valid bearer token
    • For Basic authentication: A username and password
  • (Optional) If using multi-tenancy: Your Loki tenant ID

Understanding Loki Multi-Tenancy

Loki supports multi-tenancy through the use of tenant IDs. For more information about Loki’s multi-tenancy features, see the Loki Multi-Tenancy Documentation.

Configuring SGNL

  1. Log in to the Console and choose Admin -> Add Log Stream -> Choose Loki
  2. Give the Log Stream a name and optionally a description
  3. Enter your Loki Endpoint URL (e.g., http://loki.example.com:3100)
  4. Select the Authentication Strategy:
    • Bearer: Uses a bearer token for authentication
    • Basic: Uses username and password for authentication
  5. Enter the appropriate authentication credentials:
    • For Bearer authentication: Enter your bearer token in the Auth Token field
    • For Basic authentication: Enter your username and password
  6. (Optional) Customize the API Path if your Loki instance uses a non-standard path. The default is /loki/api/v1/push
  7. (Optional) If your Loki instance has multi-tenancy enabled, enter your Tenant ID
  8. Click Save
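The pieces configured above map directly onto Loki's push API: the endpoint URL plus the default API path, the tenant ID sent as the X-Scope-OrgID header, and log lines paired with nanosecond timestamps. A sketch of the request shape, using placeholder endpoint and tenant values:

```python
import json
import time

# Placeholder endpoint and tenant – substitute your own
endpoint = "http://loki.example.com:3100"
tenant_id = "sgnl-tenant"

url = endpoint + "/loki/api/v1/push"          # the default API path from step 6
headers = {"Content-Type": "application/json",
           "X-Scope-OrgID": tenant_id}        # only needed with multi-tenancy

# Loki expects nanosecond timestamps as strings, paired with log lines
ts_ns = str(time.time_ns())
payload = json.dumps({"streams": [{
    "stream": {"source": "sgnl"},             # labels attached to the stream
    "values": [[ts_ns, '{"eventType": "sgnl.accessSvc.decision"}']],
}]})
```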

Azure Blob Storage

Prerequisites

Before configuring Azure Blob Storage log streaming in SGNL, ensure you have:

  • An Azure Storage account
  • A container created within your storage account for storing logs
  • An Azure Storage connection string with appropriate permissions

Obtaining Your Azure Storage Connection String

Your connection string can be found in the Azure Portal:

  1. Navigate to your Storage Account
  2. Select “Access keys” from the left navigation menu
  3. Copy the connection string from either key1 or key2

The connection string will look similar to:

DefaultEndpointsProtocol=https;AccountName=mylogstorage;AccountKey=storageaccountkeybase64encoded;EndpointSuffix=core.windows.net
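The connection string is a semicolon-separated list of key=value pairs, from which the storage account, key, and endpoint suffix can be read. A sketch of how the example string above decomposes (the account name and key are the placeholders from the example):

```python
conn = ("DefaultEndpointsProtocol=https;AccountName=mylogstorage;"
        "AccountKey=storageaccountkeybase64encoded;EndpointSuffix=core.windows.net")

# Split on ';', then on the first '=' of each pair
# (maxsplit=1 keeps any '=' padding inside the base64 account key intact)
parts = dict(p.split("=", 1) for p in conn.split(";"))

# The blob service endpoint derives from the account name and suffix
blob_endpoint = (f'{parts["DefaultEndpointsProtocol"]}://'
                 f'{parts["AccountName"]}.blob.{parts["EndpointSuffix"]}')
print(blob_endpoint)  # https://mylogstorage.blob.core.windows.net
```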

Configuring SGNL

  1. Log in to the Console and choose Admin -> Add Log Stream -> Choose Azure Blob Storage
  2. Give the Log Stream a name and optionally a description
  3. Enter your Azure Storage Connection String
    • Example: DefaultEndpointsProtocol=https;AccountName=mylogstorage;AccountKey=storageaccountkeybase64encoded;EndpointSuffix=core.windows.net
  4. Enter your Container Name (e.g., sgnl-logs)
  5. Click Save

Logs will be stored as objects within the specified container, organized by date and time to facilitate easy retrieval and analysis.