Azure Sentinel Logs

The Observo AI Azure Sentinel Logs destination forwards enriched security and observability data to Microsoft Sentinel via the Logs Ingestion API. Data is sent as JSON over authenticated connections, enabling real-time threat detection, investigation, and response within an Azure Log Analytics workspace.

Purpose

The Observo AI Azure Sentinel Logs destination is designed to forward enriched and normalized security and observability data from Observo AI into Microsoft Sentinel. This integration enables real-time threat detection, investigation, and response within the Azure security ecosystem. By leveraging Microsoft Sentinel’s analytics and Observo AI’s data processing, organizations gain enhanced visibility and actionable insights across their cloud and on-prem environments.

Prerequisites

Before configuring the Azure Sentinel Logs destination in Observo AI, ensure the following requirements are met to facilitate seamless data ingestion into Microsoft Sentinel:

  1. Azure Log Analytics Workspace:

    • Create a Log Analytics workspace in the Azure portal if one does not already exist. This workspace serves as the storage and analysis hub for your data in Microsoft Sentinel (Log Analytics Workspace).

    • Enable Microsoft Sentinel in the workspace to activate its SIEM capabilities (Enable Microsoft Sentinel).

  2. Data Collection Endpoint (DCE):

    • Create a Data Collection Endpoint to serve as the ingestion point for your data. Its region must match the region of the Log Analytics workspace.

  3. Data Collection Rule (DCR):

    • Create a Data Collection Rule to define how data is collected and routed to specific tables in Log Analytics such as CommonSecurityLog, SecurityEvents, Syslog, WindowsEvents.

    • Ensure the DCR schema aligns with your data’s field names to prevent dropped events (Data Collection Rules).

    • Record the immutableID of the DCR from the Azure portal’s JSON view for configuration purposes.

  4. Azure AD Application Registration:

    • Register an application in Azure Active Directory (Azure AD) to handle authentication for data ingestion.

      • Navigate to “App registrations” in the Azure portal, create a new registration, and note the Application (client) ID and Directory (tenant) ID (Register an Application).

      • Create a client secret under “Certificates & secrets” and securely store its value.

  5. Role Assignment:

    • Grant the registered Azure AD application the “Monitoring Metrics Publisher” role on the Data Collection Rule so it is authorized to send data.

  6. Schema Alignment:

    • Ensure that the field names in your data match the schema defined in the DCR to avoid ingestion issues (Supported Tables).

| Prerequisite | Description | Notes |
| --- | --- | --- |
| Log Analytics Workspace | Storage and analysis hub for Sentinel data | Must have Microsoft Sentinel enabled |
| Data Collection Endpoint | Ingestion point for data | Region must match workspace |
| Data Collection Rule | Defines data routing and schema | Record immutableID |
| Azure AD Application | Handles authentication | Store Client ID, Tenant ID, Client Secret |
| Role Assignment | Grants permissions to application | Assign “Monitoring Metrics Publisher” role |
| Schema Alignment | Ensures data compatibility | Match field names to DCR schema |
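
The schema-alignment prerequisite above can be sanity-checked before sending data. The sketch below is illustrative only (the column names and event fields are hypothetical, not taken from any specific DCR):

```python
def find_schema_mismatches(event: dict, dcr_columns: set) -> set:
    """Return event fields that are absent from the DCR column list.

    Events containing fields outside the DCR schema may be dropped or
    have those fields silently discarded at ingestion time.
    """
    return set(event) - dcr_columns

# Hypothetical DCR columns for a Log Analytics table.
dcr_columns = {"TimeGenerated", "Computer", "Activity", "SourceIp"}
event = {
    "TimeGenerated": "2025-01-01T00:00:00Z",
    "Computer": "web-01",
    "Activity": "login",
    "src_ip": "10.0.0.5",  # does not match the DCR column "SourceIp"
}

print(find_schema_mismatches(event, dcr_columns))  # {'src_ip'}
```

Running a check like this in a pre-processing step surfaces field-name drift before it causes dropped events.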

Integration

To configure the Azure Sentinel Logs destination in Observo AI, follow these steps to set up and test the data flow to Microsoft Sentinel:

  1. Log in to Observo AI:

    • Navigate to the Destinations tab.

    • Click the “Add Destinations” button and select “Create New”.

    • Choose “Azure Sentinel Logs” from the list of available destinations to begin configuration.

  2. General Settings:

    • Name: Enter a unique identifier, such as sentinel-dest-1.

    • Description (Optional): Add a description of the destination.

    • Tenant ID: Enter the Directory (tenant) ID of the Microsoft Entra (Azure AD) tenant that hosts your application.

      Example

      5ce893d9-2c32-4b6c-91a9-b0887c2de2d6

    • Client ID: Enter the Application (client) ID from the app registration you created in the Azure portal.

      Example

      5ce893d9-2c32-4b6c-91a9-b0887c2de2d6

    • Client Secret: Enter the client secret value that you created for the app registration under “Certificates & secrets”.

    • Data collection endpoint (DCE): Enter the Data Collection Endpoint URL that the application sends data to.

      Example

      https://xxxxxx.eastus-1.ingest.monitor.azure.com/

    • Data Collection Rule (DCR) Immutable ID: Enter the immutable ID of the Data Collection Rule, found in the DCR’s JSON view in the Azure portal.

      Example

      5ce893d9-2c32-4b6c-91a9-b0887c2de2d6

    • Log Type: The record type of the data being submitted. May contain only letters, numbers, and underscores (_), and must not exceed 100 characters.

      Examples

      MyTableName

      MyRecordType

  3. Encoding (Optional):

    • Add Fields to exclude from serialization (Add): List of fields that are removed from the event before it is encoded.

      Example

      message.payload

    • Fields to include in serialization (Add): List of fields that are included in the encoded event; all other fields are ignored. This setting and “Fields to exclude from serialization” are mutually exclusive: only one of them may contain values.

      Example

      message.payload

    • Encoding Timestamp Format: Format used for timestamp fields. Default: RFC 3339 timestamp

      | Option | Description |
      | --- | --- |
      | RFC 3339 timestamp | Human-readable date format with timezone support |
      | Unix timestamp (Float) | Seconds since epoch, includes fractional seconds |
      | Unix timestamp (Milliseconds) | Milliseconds since epoch, integer precision time |
      | Unix timestamp (Nanoseconds) | Nanoseconds since epoch, high-resolution timestamp |
      | Unix timestamp (Microseconds) | Microseconds since epoch, fine-grained time format |
      | Unix timestamp | Seconds since epoch, standard POSIX time format |

  4. TLS Configuration (Optional):

    • TLS CA: Provide the CA certificate in PEM format.

    • TLS CRT: Provide the client certificate in PEM format.

    • TLS Key: Provide the private key in PEM format.

    • Verify Certificate (False): Enables certificate verification.

    • Certificates must be valid, meaning not expired and issued by a trusted issuer. Verification operates hierarchically: the certificate is checked, then the issuer of that certificate, and so on until a root certificate is reached. This is relevant for both incoming and outgoing connections. Do NOT set this to false unless you understand the risks of not verifying the validity of certificates.

    • Verify Hostname: Enables hostname verification. If enabled, the hostname used to connect to the remote host must be present in the TLS certificate presented by the remote host, either as the Common Name or as an entry in the Subject Alternative Name extension. Only relevant for outgoing connections. Do NOT set this to false unless you understand the risks of not verifying the remote hostname.

  5. Batching Requirements (Default):

    • Batch Max Bytes: The maximum size of a batch that will be processed by a sink. This is based on the uncompressed size of the batched events, before they are serialized / compressed. Default: 500000

    • Batch Max Events: The maximum size of a batch before it is flushed. Default: 1000

    • Batch Timeout Secs: The maximum age of a batch before it is flushed. Default: 300

  6. Advanced Settings (Optional):

    • Azure Authority Host: The Azure authority host used for identity authentication. Default: Empty

      Examples

      https://login.microsoftonline.com

      https://login.microsoftonline.de

    • Compression: Compression algorithm to use for the request body. Default: Gzip compression

      | Option | Description |
      | --- | --- |
      | Gzip compression | DEFLATE compression with headers for file storage |
      | None | Data stored and transmitted in original form |

    • Healthcheck (False): Whether or not to check the health of the sink when Observo Agent starts up.

    • Time Generated Key (Empty): Customizes the log field used as TimeGenerated in Azure. By default, the log_schema.timestamp_key setting (usually timestamp) is used. Set this only in the rare case where TimeGenerated should point to a specific log field; for example, set it to source_timestamp if that field holds the value that should be used as TimeGenerated on the Azure side.

      Example

      time_generated

  7. Save and Test Configuration:

    • Save the configuration settings in Observo AI.

    • Send sample data and verify that it reaches the specified Log Analytics table in Microsoft Sentinel.
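
Under the hood, the settings above map onto Azure’s Logs Ingestion API. The sketch below shows, under stated assumptions, how the DCE, DCR immutable ID, and log type combine into a request (the endpoint shape follows Azure’s publicly documented api-version 2023-01-01; the DCE URL, DCR ID, stream name, and token are placeholders, and no network call is made here):

```python
import gzip
import json

def build_ingestion_request(dce: str, dcr_immutable_id: str, stream: str,
                            events: list, token: str):
    """Assemble URL, headers, and gzip-compressed JSON body for the
    Azure Monitor Logs Ingestion API."""
    url = (f"{dce.rstrip('/')}/dataCollectionRules/{dcr_immutable_id}"
           f"/streams/{stream}?api-version=2023-01-01")
    body = gzip.compress(json.dumps(events).encode("utf-8"))
    headers = {
        "Authorization": f"Bearer {token}",       # Azure AD access token
        "Content-Type": "application/json",
        "Content-Encoding": "gzip",               # matches the Gzip compression setting
    }
    return url, headers, body

url, headers, body = build_ingestion_request(
    "https://xxxxxx.eastus-1.ingest.monitor.azure.com/",
    "dcr-00000000000000000000000000000000",
    "Custom-MyTableName",
    [{"TimeGenerated": "2025-01-01T00:00:00Z", "Message": "test"}],
    "<access-token>",
)
```

In production, the bearer token would be obtained from Azure AD via the client-credentials flow using the Tenant ID, Client ID, and Client Secret configured above.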

Example Scenarios

RetailRiser, a fictitious global retail enterprise, operates a vast network of stores and an e-commerce platform, generating extensive security and observability data: transaction logs, customer interaction events, and system alerts in JSON format. To enhance its threat detection and compliance capabilities, RetailRiser forwards this data to Microsoft Sentinel via the Observo AI platform, using a Log Analytics workspace in the Azure project retailriser-security-2025. A Data Collection Endpoint (DCE) and Data Collection Rule (DCR) are configured to route data to the CommonSecurityLog table, with an Azure AD application providing secure authentication. The configuration below sets up the Azure Sentinel Logs destination in Observo AI, following the required fields from the Integration section above, and enables real-time threat detection and analytics for RetailRiser.

Standard Azure Sentinel Logs Destination Setup

Here is a standard Azure Sentinel Logs Destination configuration example. Only the required sections and their associated field updates are displayed in the table below:

General Settings

| Field | Value | Description |
| --- | --- | --- |
| Name | retailriser-sentinel-logs | Unique identifier for the Sentinel destination |
| Description | Forward transaction and security logs to Azure Sentinel for RetailRiser | Optional description of the destination |
| Tenant ID | 5ce893d9-2c32-4b6c-91a9-b0887c2de2d6 | Azure AD Directory (tenant) ID |
| Client ID | 7d9f4a2b-3d4e-5c7f-8e3a-2b9c8d1e0f4b | Azure AD Application (client) ID |
| Client Secret | gJalrXUtnRETAILRISERKEY2025 | Client secret for Azure AD application |
| Data Collection Endpoint (DCE) | https://retailriser-dce.eastus-1.ingest.monitor.azure.com/ | Ingestion endpoint for data |
| Data Collection Rule (DCR) Immutable ID | 9e8d7c6b-4f5e-6a8f-9b2c-3d1e0f4a2b9c | Immutable ID of the DCR |
| Log Type | CommonSecurityLog | Record type for data submitted to Sentinel |

Encoding

| Field | Value | Description |
| --- | --- | --- |
| Add Fields to Exclude from Serialization | message.payload | Excludes specified field from encoded event |
| Fields to Include in Serialization | None | Not used as exclude fields are specified |
| Encoding Timestamp Format | RFC 3339 timestamp | Formats timestamps in human-readable RFC3339 format |
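
To make the timestamp-format options concrete, the sketch below maps each option to its encoding (the short format names are informal labels chosen here for illustration, not Observo AI identifiers):

```python
from datetime import datetime, timezone

def encode_timestamp(dt: datetime, fmt: str):
    """Render a datetime in the formats offered by Encoding Timestamp Format."""
    epoch = dt.timestamp()
    if fmt == "rfc3339":       # RFC 3339 timestamp (the default)
        return dt.isoformat().replace("+00:00", "Z")
    if fmt == "unix":          # Unix timestamp: whole seconds since epoch
        return int(epoch)
    if fmt == "unix_float":    # Unix timestamp (Float): fractional seconds
        return epoch
    if fmt == "unix_ms":       # Unix timestamp (Milliseconds)
        return int(epoch * 1_000)
    if fmt == "unix_us":       # Unix timestamp (Microseconds)
        return int(epoch * 1_000_000)
    if fmt == "unix_ns":       # Unix timestamp (Nanoseconds)
        return int(epoch * 1_000_000_000)
    raise ValueError(f"unknown format: {fmt}")

dt = datetime(2025, 1, 1, tzinfo=timezone.utc)
print(encode_timestamp(dt, "rfc3339"))  # 2025-01-01T00:00:00Z
print(encode_timestamp(dt, "unix_ms"))  # 1735689600000
```

RFC 3339 is the sensible default for human-readable logs; the Unix variants trade readability for compactness and precision.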

TLS Configuration

| Field | Value | Description |
| --- | --- | --- |
| TLS CA | /opt/observo/certs/ca.crt | Path to CA certificate for server verification |
| TLS CRT | /opt/observo/certs/retailriser.crt | Path to client certificate for authentication |
| TLS Key | /opt/observo/certs/retailriser.key | Path to private key for authentication |
| Verify Certificate | True | Enables certificate verification |
| Verify Hostname | True | Verifies hostname in the TLS certificate |

Batching Configuration

| Field | Value | Description |
| --- | --- | --- |
| Batch Max Bytes | 500000 | Maximum batch size (500KB) before flushing |
| Batch Max Events | 1000 | Maximum number of events in a batch |
| Batch Timeout Secs | 300 | Maximum age of a batch before flushing |
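
The three batching limits interact on a first-hit-wins basis: a batch flushes as soon as any one of them is reached. The sketch below illustrates that interaction; it is an assumption for explanatory purposes, not Observo AI's actual implementation, and it approximates "uncompressed size" via JSON length:

```python
import json
import time

class Batcher:
    """Illustrative batcher: flushes when any configured limit is hit."""

    def __init__(self, max_bytes=500_000, max_events=1_000, timeout_secs=300):
        self.max_bytes = max_bytes
        self.max_events = max_events
        self.timeout_secs = timeout_secs
        self.events, self.size, self.started = [], 0, time.monotonic()

    def add(self, event: dict) -> bool:
        """Add an event; return True if the batch should flush now."""
        self.events.append(event)
        self.size += len(json.dumps(event).encode("utf-8"))
        age = time.monotonic() - self.started
        return (self.size >= self.max_bytes
                or len(self.events) >= self.max_events
                or age >= self.timeout_secs)

b = Batcher(max_events=2)
assert b.add({"msg": "first"}) is False   # under every limit
assert b.add({"msg": "second"}) is True   # max_events reached, flush
```

With the RetailRiser defaults (500 KB / 1000 events / 300 s), low-volume pipelines will typically flush on the timeout, while high-volume pipelines flush on bytes or event count.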

Advanced Settings

| Field | Value | Description |
| --- | --- | --- |
| Azure Authority Host | https://login.microsoftonline.com | Azure authority host for identity |
| Compression | Gzip compression | Applies Gzip compression to request body |
| Healthcheck | True | Checks sink health on Observo Agent startup |
| Time Generated Key | timestamp | Uses default timestamp field for TimeGenerated |

Additional Configuration

  • Save and Test: Save the configuration and send sample transaction log data to the CommonSecurityLog table in Microsoft Sentinel.

  • Verify data ingestion in the Observo AI Analytics tab and query the Log Analytics workspace in Azure to confirm successful data flow.

Outcome

With this configuration, RetailRiser successfully forwards transaction logs and security events to Microsoft Sentinel via Observo AI, enabling real-time threat detection, incident investigation, and compliance monitoring, thereby enhancing the security and operational efficiency of its retail operations.

Troubleshooting

If issues arise with the Azure Sentinel Logs destination, use the following steps to diagnose and resolve them:

  • Verify Configuration Settings:

    • Ensure all fields, such as the Data Collection Endpoint, Client ID, Tenant ID, Client Secret, and Log Type, are entered correctly and match your Azure configuration.

    • Confirm that the target table is supported (such as CommonSecurityLog or SecurityEvents) and matches the DCR schema.

  • Check Authentication:

    • Verify that the client secret is valid and has not expired.

    • Confirm that the Azure AD application has the “Monitoring Metrics Publisher” role assigned for the DCR.

  • Monitor Logs:

    • Check Observo AI’s logs for errors or warnings related to data transmission.

    • In the Azure portal, navigate to the Log Analytics workspace and query the target table to confirm data arrival.

  • Validate Data Format and Schema:

    • Ensure the data’s field names align with the DCR schema to prevent dropped events.

    • If using custom tables, verify that the DCR is configured to accept custom data (Data Transformation).

  • Network and Connectivity:

    • Check for firewall rules or network policies that may block communication between Observo AI and Azure.

    • Verify that the DCE and Log Analytics workspace are accessible.

  • Test Data Flow:

    • Send sample data and monitor its arrival in Microsoft Sentinel.

    • Use the Analytics tab in the targeted Observo AI pipeline to monitor data volume and ensure expected throughput.

    • In Azure, use the Resource Graph Explorer to query endpoints or manually verify the ingestion URL (Resource Graph Explorer).

| Issue | Possible Cause | Resolution |
| --- | --- | --- |
| Data not appearing in Sentinel | Incorrect Ingestion URL or table name | Verify URL and table name in configuration |
| Authentication errors | Expired or incorrect client secret | Regenerate secret and update configuration |
| Dropped events | Schema mismatch in DCR | Align data fields with DCR schema |
| Connection failures | Network or firewall issues | Check network policies and connectivity |
| Slow data transfer | Backpressure or rate limiting | Adjust backpressure settings or retry policies |
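
Authentication failures usually surface as AADSTS error codes in the Azure AD token response. A small sketch for triaging them (the code-to-fix mapping is an informal aid based on common AADSTS codes, not an exhaustive or official list):

```python
import json

# Informal mapping of common AADSTS error codes to likely fixes.
KNOWN_ERRORS = {
    "AADSTS7000215": "Invalid client secret: check for copy/paste errors.",
    "AADSTS7000222": "Client secret expired: regenerate it and update the destination.",
    "AADSTS700016": "Application not found: verify the Client ID and Tenant ID.",
}

def diagnose_auth_error(response_body: str) -> str:
    """Map an Azure AD token-endpoint error payload to a suggested fix."""
    payload = json.loads(response_body)
    description = payload.get("error_description", "")
    for code, hint in KNOWN_ERRORS.items():
        if code in description:
            return hint
    return "Unrecognized error: inspect error_description manually."

sample = json.dumps({
    "error": "invalid_client",
    "error_description": "AADSTS7000222: The provided client secret keys are expired.",
})
print(diagnose_auth_error(sample))
```

Matching the AADSTS code, rather than the full message text, keeps the triage robust as Microsoft rewords error descriptions.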

Resources

For additional guidance and detailed information, refer to the following resources:
