Azure Monitor Logs
The Observo AI Azure Monitor Logs destination sends security and telemetry data to Azure Log Analytics workspaces via the Data Collector API. It supports JSON-formatted data and enables centralized log analytics, advanced querying, visualization, and threat detection.
Purpose
The Observo AI Azure Monitor Logs destination enables sending security and telemetry data to Azure Monitor Logs for centralized log analytics and monitoring. It integrates with Azure Log Analytics workspaces to facilitate advanced querying, visualization, and threat detection. This destination helps organizations consolidate and analyze data for actionable insights and operational intelligence.
Prerequisites
Before configuring the Azure Monitor Logs destination in Observo AI, ensure the following requirements are met to facilitate seamless data ingestion:
Observo AI Platform Setup:
The Observo AI Site must be installed and available.
Verify that the platform can send data in formats compatible with Azure Log Analytics, such as JSON.
Azure Log Analytics Workspace:
Create a Log Analytics workspace in the Azure portal if one does not already exist. This workspace serves as the storage and analysis hub for your data (Log Analytics Workspace).
Data Collection Endpoint (DCE):
Create a Data Collection Endpoint in Azure Monitor to serve as the ingestion point for data. Ensure the region matches your Log Analytics workspace (Data Collection Endpoints).
Note the Logs Ingestion URL provided by the DCE for configuration.
Data Collection Rule (DCR):
Create a Data Collection Rule to define how data is collected and routed to specific tables in Log Analytics (e.g., Custom Logs, CommonSecurityLog).
Ensure the DCR schema aligns with your data’s field names to prevent dropped events (Data Collection Rules).
Record the immutableId of the DCR from the Azure portal’s JSON view for configuration.
Azure AD Application Registration:
Register an application in Azure Active Directory (Azure AD) to handle authentication for data ingestion.
Navigate to “App registrations” in the Azure portal, create a new registration, and note the Application (client) ID and Directory (tenant) ID (Register an Application).
Create a client secret under “Certificates & secrets” and securely store its value.
Role Assignment:
Assign the “Monitoring Metrics Publisher” role to the Azure AD application for the DCR to grant necessary permissions (Role Assignments).
Verify the role assignment in the DCR’s “Access control (IAM)” section.
Network and Connectivity:
Ensure Observo AI can communicate with Azure Monitor endpoints over HTTPS (port 443).
If using private endpoints or firewall rules, configure them to allow access (Azure Private Link).
| Requirement | Description | Key Actions |
| --- | --- | --- |
| Observo AI Platform | Must support Azure Monitor Logs | Verify data format compatibility |
| Log Analytics Workspace | Storage and analysis hub | Create via Azure portal |
| Data Collection Endpoint | Ingestion point for data | Region must match workspace |
| Data Collection Rule | Defines data routing and schema | Record immutableId |
| Azure AD Application | Handles authentication | Store Client ID, Tenant ID, Client Secret |
| Role Assignment | Grants permissions to application | Assign “Monitoring Metrics Publisher” role |
| Network | HTTPS connectivity | Allow port 443, configure private endpoints if needed |
Integration
To configure Azure Monitor Logs as a destination in Observo AI, follow these steps to set up and test the data flow:
Log in to Observo AI:
Navigate to the Destinations tab.
Click the “Add Destinations” button and select “Create New”.
Choose “Azure Monitor Logs” from the list of available destinations to begin configuration.
General Settings:
Name: Add a unique identifier, such as monitor-logs-dest-1.
Description (Optional): Add a description.
Workspace ID: Add the unique identifier for the Log Analytics workspace.
Example: 5ce893d9-2c32-4b6c-91a9-b0887c2de2d6
Workspace Key: Add the primary or secondary key for the Log Analytics workspace.
Examples:
SERsIYhgMVlJB6uPsq49gCxNiruf6v0vhMYE+lfzbSGcXjdViZdV/e5pEMTYtw9f8SkVLf4LFlLCc2KxtRZfCA==
${AZURE_MONITOR_SHARED_KEY_ENV_VAR}
Log Type: Add the record type of the data being submitted. It can only contain letters, numbers, and underscores (_), and may not exceed 100 characters.
Examples:
MyTableName
MyRecordType
Host (Optional): Add an alternative host for dedicated Azure regions.
Examples:
ods.opinsights.azure.us
ods.opinsights.azure.com
ods.opinsights.azure.cn
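For context, the Workspace Key is used to compute an HMAC-SHA256 SharedKey signature when events are posted to the Data Collector API, which is why the key must be valid base64 and why the Host field selects the regional ingestion endpoint. A minimal Python sketch of that signature, using placeholder workspace values (the destination computes this internally; this is illustrative only):

```python
import base64
import hashlib
import hmac

def build_signature(workspace_id, workspace_key, content_length, date_rfc1123):
    """Build the SharedKey Authorization header for the Data Collector API.

    workspace_key is the base64-encoded primary or secondary key;
    date_rfc1123 is the same value sent in the x-ms-date header.
    """
    string_to_sign = (
        f"POST\n{content_length}\napplication/json\n"
        f"x-ms-date:{date_rfc1123}\n/api/logs"
    )
    digest = hmac.new(
        base64.b64decode(workspace_key),      # key must decode from base64
        string_to_sign.encode("utf-8"),
        hashlib.sha256,
    ).digest()
    return f"SharedKey {workspace_id}:{base64.b64encode(digest).decode()}"

# Placeholder values for illustration only.
workspace_id = "5ce893d9-2c32-4b6c-91a9-b0887c2de2d6"
workspace_key = base64.b64encode(b"not-a-real-key").decode()
auth = build_signature(workspace_id, workspace_key, 128,
                       "Mon, 01 Jan 2024 00:00:00 GMT")
# The header is sent together with the Log-Type header to
# https://<workspace_id>.<host>/api/logs?api-version=2016-04-01
```

If the configured key is not valid base64, signature computation fails before any data is sent, which typically surfaces as an authorization error.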
Encoding (Optional):
Fields to exclude from serialization (Add): A list of fields that are excluded from the encoded event, applied as a transformation to prepare the event for serialization.
Example: message.payload
Encoding Timestamp Format: Format used for timestamp fields. Default: RFC 3339 timestamp.
Options:
RFC 3339 timestamp
Unix timestamp (Float)
Unix timestamp (Milliseconds)
Unix timestamp (Nanoseconds)
Unix timestamp (Microseconds)
Unix timestamp
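To make the difference between these options concrete, here is the same instant rendered in each of the listed encodings (a quick Python sketch; the chosen instant is arbitrary):

```python
from datetime import datetime, timezone

t = datetime(2024, 1, 1, 0, 0, 0, tzinfo=timezone.utc)

rfc3339 = t.isoformat().replace("+00:00", "Z")  # RFC 3339 timestamp
unix_float = t.timestamp()                      # Unix timestamp (Float)
unix_s = int(t.timestamp())                     # Unix timestamp (seconds)
unix_ms = int(t.timestamp() * 1_000)            # Unix timestamp (Milliseconds)
unix_us = int(t.timestamp() * 1_000_000)        # Unix timestamp (Microseconds)
unix_ns = int(t.timestamp() * 1_000_000_000)    # Unix timestamp (Nanoseconds)

print(rfc3339)  # 2024-01-01T00:00:00Z
print(unix_s)   # 1704067200
```

Pick the format that matches what downstream queries in Log Analytics expect; RFC 3339 is the default and the most portable.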
TLS Configuration (Optional):
TLS CA (Empty): The CA certificate provided as an inline string in PEM format.
TLS CRT (Empty): The certificate as a string in PEM format.
TLS Key (Empty): The key provided as a string in PEM format.
TLS Key Pass (Empty): Passphrase used to unlock the encrypted key file. This has no effect unless TLS Key is set.
Examples:
${KEY_PASS_ENV_VAR}
PassWord1
TLS Verify Certificate (False): Enables certificate verification. Certificates must be valid in terms of not being expired, and being issued by a trusted issuer. This verification operates in a hierarchical manner, checking validity of the certificate, the issuer of that certificate and so on until reaching a root certificate. Relevant for both incoming and outgoing connections. Do NOT set this to false unless you understand the risks of not verifying the validity of certificates.
TLS Verify Hostname (False): Enables hostname verification. Hostname used to connect to the remote host must be present in the TLS certificate presented by the remote host, either as the Common Name or as an entry in the Subject Alternative Name extension. Only relevant for outgoing connections. NOT recommended to set this to false unless you understand the risks.
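These two flags correspond to standard TLS client behavior. A short Python sketch of what enabling and disabling them maps to (illustrative only; the destination manages its own TLS stack):

```python
import ssl

# A default client context verifies both the certificate chain and the
# hostname -- the behavior when TLS Verify Certificate and
# TLS Verify Hostname are enabled.
secure = ssl.create_default_context()

# The risky configuration the warnings above describe:
insecure = ssl.create_default_context()
insecure.check_hostname = False        # TLS Verify Hostname: false
insecure.verify_mode = ssl.CERT_NONE   # TLS Verify Certificate: false
```

Note that hostname checking is meaningless without certificate verification, which is why disabling verification requires turning off the hostname check first.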
Batching Configurations (Default):
Batch Max Bytes: The maximum size of a batch that will be processed by a sink. This is based on the uncompressed size of the batched events, before they are serialized / compressed. Default: 10000000
Batch Max Events: The maximum size of a batch before it is flushed. Default: 1000
Batch Timeout Secs: The maximum age of a batch before it is flushed. Default: 300
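A batch is flushed as soon as the first of these three thresholds is reached. A small Python sketch of that flush decision, using the default values above (the class is illustrative, not the actual implementation):

```python
class BatchPolicy:
    """Flush decision for a sink batch: size, count, or age, whichever first."""

    def __init__(self, max_bytes=10_000_000, max_events=1_000, timeout_secs=300):
        self.max_bytes = max_bytes        # Batch Max Bytes (uncompressed)
        self.max_events = max_events      # Batch Max Events
        self.timeout_secs = timeout_secs  # Batch Timeout Secs

    def should_flush(self, batch_bytes, batch_events, batch_age_secs):
        return (batch_bytes >= self.max_bytes
                or batch_events >= self.max_events
                or batch_age_secs >= self.timeout_secs)

policy = BatchPolicy()
policy.should_flush(512, 10, 5.0)     # False: no threshold reached yet
policy.should_flush(512, 1_000, 5.0)  # True: event count reached
```

Lowering Batch Timeout Secs reduces end-to-end latency for low-volume streams at the cost of more, smaller requests to Azure.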
Advanced Settings (Optional):
Azure Resource ID (Empty): The Resource ID of the Azure resource the data should be associated with.
Example: /subscriptions/11111111-1111-1111-1111-111111111111/resourceGroups/otherResourceGroup/providers/Microsoft.Storage/storageAccounts/examplestorage
Healthcheck (False): Whether to check the health of the sink when the Observo Agent starts up.
Time Generated Key (Empty): Use this option to customize the log field used as TimeGenerated in Azure. By default, the setting of log_schema.timestamp_key (usually timestamp) is used. Set this field only in the rare cases where TimeGenerated should point to a specific log field; for example, set it to source_timestamp if that field holds the value that should be used as TimeGenerated on the Azure side.
Example: time_generated
Save and Test Configuration:
Save the configuration settings.
Send sample data and verify that it reaches the specified Log Analytics table.
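Before saving, the Log Type value can be sanity-checked against the constraint noted in General Settings (letters, numbers, and underscores only, at most 100 characters). A quick Python check:

```python
import re

# Mirrors the documented Log Type constraint: [A-Za-z0-9_], max 100 chars.
LOG_TYPE_RE = re.compile(r"[A-Za-z0-9_]{1,100}")

def is_valid_log_type(name: str) -> bool:
    """True if the name satisfies the Log Type constraints."""
    return LOG_TYPE_RE.fullmatch(name) is not None

is_valid_log_type("MyTableName")  # True
is_valid_log_type("my-table")     # False: hyphen not allowed
is_valid_log_type("x" * 101)      # False: exceeds 100 characters
```

An invalid Log Type is rejected at ingestion time, so catching it before saving the destination avoids a silent failure later.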
Example Scenarios
SecureCorp, a fictitious enterprise focused on cybersecurity, aims to integrate Observo with Azure Monitor Logs to centralize security telemetry data for advanced analytics and threat detection. They have set up a Log Analytics workspace in the westus2 region, created a Data Collection Endpoint (DCE), and configured a Data Collection Rule (DCR) for a custom log table named SecureCorpCustomLog. An Azure AD application has been registered to handle authentication, and the necessary permissions have been assigned.
Standard Azure Monitor Logs Destination Setup
Here is a standard Azure Monitor Logs Destination configuration example. Only the required sections and their associated field updates are displayed in the table below:
| Field | Value | Description |
| --- | --- | --- |
| Name | monitor-logs-securecorp-1 | Unique identifier for the destination. |
| Description | Centralizes SecureCorp's security telemetry data in Azure Monitor Logs for analytics and threat detection. | Provides context for the destination's purpose. |
| Workspace ID | 6d8f2a1c-4e5b-4a7c-9d2e-7b1a3f9c8e0d | Unique identifier for the Log Analytics workspace in Azure. |
| Workspace Key | XzY9qWvRtKlMnOpQrStUvWxYz1234567890AbCdEfGhIjKlMnOpQrStUvWxYz== | Primary or secondary key for the Log Analytics workspace, used for authentication. |
| Log Type | SecureCorpCustomLog | Record type of the data being submitted, matching the DCR-configured custom log table. |
| Host | ods.opinsights.azure.com | Specifies the Azure Monitor Logs ingestion endpoint. |
| Encoding Timestamp Format | RFC 3339 | Specifies the timestamp format for data serialization (the default). |
| Test Configuration | Save settings, send sample data, and verify ingestion in the SecureCorpCustomLog table via Azure Log Analytics. | Saves the configuration, tests data flow to the specified table, and confirms data is queryable. |
Notes:
Ensure the Azure AD application has the “Monitoring Metrics Publisher” role assigned to the DCR, with correct Client ID, Tenant ID, and Client Secret for authentication.
Verify the DCR is configured to accept data for the SecureCorpCustomLog table, with a schema matching the sent data to prevent dropped events.
Confirm HTTPS connectivity (port 443) to Azure Monitor endpoints (ods.opinsights.azure.com).
Monitor Observo’s Notifications tab and query the Log Analytics workspace in the Azure portal to verify data ingestion and troubleshoot errors.
This configuration enables SecureCorp to transmit security telemetry data from Observo to Azure Monitor Logs for centralized log analytics and threat detection.
Troubleshooting
If issues arise with the Azure Monitor Logs destination in Observo AI, use the following steps to diagnose and resolve them:
Verify Configuration Settings:
Ensure all fields, such as the Ingestion URL, DCR Immutable ID, Table Name, and authentication settings, are correctly entered and match the Azure setup.
Confirm that the table name is supported and matches the DCR structures (Supported Tables).
Check Authentication:
Verify that the Client ID, Tenant ID, Client Secret, and Scope are correct.
Ensure the Azure AD application has the “Monitoring Metrics Publisher” role assigned for the DCR (Role Assignments).
Monitor Logs:
Check Observo AI’s Notifications tab for errors or warnings related to data transmission.
In the Azure portal, query the Log Analytics workspace to confirm data arrival (Query Logs).
Validate Data Format and Schema:
Ensure the data’s field names align with the DCR schema to prevent dropped events.
If using custom tables, verify that the DCR is configured to accept custom data (Data Transformation).
Network and Connectivity:
Ensure Observo AI can reach Azure Monitor endpoints over HTTPS (port 443).
If using private endpoints, verify their configuration (Azure Private Link).
Common Error Messages:
“Authorization failure”: Indicates invalid credentials or missing permissions. Verify the Client Secret and role assignments.
“Table not found”: Check the Table Name and ensure it exists in the DCR and workspace.
“No data ingested”: Confirm data is being sent and matches the DCR schema. Check the Ingestion URL and DCR Immutable ID.
Test Data Flow:
Send sample data and verify ingestion.
Use the Analytics tab in the targeted Observo AI pipeline to monitor data volume and ensure expected throughput.
| Issue | Likely Cause | Resolution |
| --- | --- | --- |
| Data not ingested | Incorrect Ingestion URL or Table Name | Verify configuration settings |
| Authorization errors | Invalid Client Secret or permissions | Check OAuth credentials and roles |
| “Table not found” | Incorrect or unsupported table | Confirm table exists in DCR |
| Connectivity issues | Firewall or private endpoint issues | Allow HTTPS on port 443, verify endpoints |
| Dropped events | Schema mismatch in DCR | Align data fields with DCR schema |
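Schema mismatches are the most common cause of silently dropped events. A small Python helper of the kind you might use to pre-check an event against the DCR column names before sending (the column set below is hypothetical):

```python
def fields_not_in_schema(event, dcr_columns):
    """Return event fields that the DCR schema would not accept."""
    return sorted(set(event) - set(dcr_columns))

# Hypothetical DCR columns for a custom table.
dcr_columns = {"TimeGenerated", "Computer", "Severity", "Message"}
event = {
    "TimeGenerated": "2024-01-01T00:00:00Z",
    "Computer": "web-01",
    "severity": "high",        # case mismatch with "Severity": flagged
    "Message": "login failure",
}

fields_not_in_schema(event, dcr_columns)  # ['severity']
```

Note that column names are compared exactly here, including case; if a field is flagged, rename it in the pipeline or add the column to the DCR schema.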