Azure Blob Storage via Event Hub
This source reads logs from Azure Blob Storage containers, triggered by Azure Event Hub notifications when new blobs are created. This integration enables real-time ingestion of log data as soon as files are uploaded to Azure Blob Storage.
The system subscribes to Blob Created events published by Azure Blob Storage and delivered through Azure Event Hub. When it receives a notification, it reads the logs from the corresponding blob and processes them for ingestion into Observo AI.
Purpose
The Observo AI Azure Blob via Event Hub source enables real-time ingestion of data from Azure Blob Storage containers into the Observo AI platform for processing and analysis. This event-driven approach ensures that data is processed immediately upon arrival, supporting various file formats such as JSON, CSV, and compressed logs. The integration facilitates efficient, scalable data retrieval from Azure's storage service for monitoring, observability, and analytics.
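For instance, a gzip-compressed, newline-delimited log blob might be handled along these lines. This is an illustrative sketch only; the helper name and extension-based format detection are assumptions, not the platform's actual implementation:

```python
import gzip
import json

# Hypothetical helper: decompress gzip blobs by extension, then split the
# payload into individual log lines. Extension-based detection is an
# assumption for illustration.
def read_log_lines(blob_name: str, data: bytes) -> list[str]:
    if blob_name.endswith(".gz"):
        data = gzip.decompress(data)
    text = data.decode("utf-8")
    return [line for line in text.splitlines() if line.strip()]

# Example: a gzip-compressed NDJSON blob
payload = gzip.compress(b'{"level":"info"}\n{"level":"error"}\n')
lines = read_log_lines("events.json.gz", payload)
print(len(lines))                       # 2
print(json.loads(lines[1])["level"])    # error
```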
Prerequisites
Before configuring the Azure Blob via Event Hub source in Observo AI, ensure the following requirements are met to facilitate seamless data ingestion:
Observo AI Platform Setup:
The Observo AI Site must be installed and operational.
Verify that the platform can process expected file formats, such as JSON, CSV, or compressed logs (gzip).
Azure Storage Account:
An active Azure subscription with a storage account and at least one container created (Create a Storage Account).
The storage account must be accessible to Observo AI. Ensure public network access is enabled, or if access is restricted to specific VNETs/IPs, whitelist the Observo Dataplane IP address.
Required permissions: The account or application used by Observo AI must have read access to the container, typically via a Connection String or Storage Account Key (Azure Blob Storage Access Control).
Azure Event Hub:
An Event Hub namespace and Event Hub instance must be created (Create an Event Hub).
A shared access policy with Listen permissions must be configured to obtain the Event Hub connection string.
Azure Event Grid:
Event Grid must be configured to route Blob Created events from the storage account to the Event Hub (Learn more about Event Grid).
An event subscription must be created that filters for Blob Created events and targets the Event Hub endpoint.
Network and Connectivity:
Ensure Observo AI can communicate with Azure Blob Storage and Event Hub endpoints, typically over HTTPS (port 443).
If using private endpoints or firewall rules, configure them to allow access from Observo AI (Azure Private Link for Blob Storage).
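As a quick reference, the public-cloud endpoint hostnames that must be reachable over port 443 can be derived from the storage account and Event Hub namespace names. The names below are examples, and sovereign clouds (e.g., Azure Government) use different suffixes:

```python
# Default Azure public-cloud hostname patterns for the two services
# Observo AI must reach over HTTPS (port 443).
def blob_endpoint(storage_account: str) -> str:
    return f"{storage_account}.blob.core.windows.net"

def eventhub_endpoint(namespace: str) -> str:
    return f"{namespace}.servicebus.windows.net"

print(blob_endpoint("observoarchive"))
print(eventhub_endpoint("observo-eventhub-notifications"))
```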
Prerequisites at a glance:
Observo AI Platform: must support Azure Blob via Event Hub; verify data format compatibility.
Azure Storage Account: active account with a container; create via the Azure portal.
Azure Event Hub: Event Hub namespace and instance; configure Listen permissions.
Azure Event Grid: routes Blob Created events to the Event Hub; an event subscription is required.
Network: HTTPS connectivity; allow port 443 and configure private endpoints if needed.
Phased Approach
The guide covers the end-to-end setup of Azure Blob Storage and Event Hubs to enable Observo to ingest log events. The architecture consists of a Storage Account (where logs land) and an Event Hub (which notifies Observo of new files).

The guide has a phased approach:
Phase 1: Set Up Azure Resources
Phase 2: Integration (Observo Source Configuration)
Phase 3: Observo Parser Settings
Complete the Verification Checklist to finalize the process.
Phase 1: Set Up Azure Resources
This section provides step-by-step guidance for configuring Azure Blob Storage, Event Hub, and Event Grid to enable Observo AI to ingest log events in real-time.
Prepare the Storage Account:
Most customers will already have a Storage Account. Choose the scenario below that applies to you:
Scenario A: Using an Existing Storage Account:
Identify the Container: Note the exact name of the container where logs are stored (e.g., observo-events).

Check Networking: Go to Networking in the Azure Portal and ensure "Public network access" is enabled. Note: If you restrict access to specific VNETs/IPs, you must whitelist the Observo Dataplane IP.
Get the Key: Go to Access keys (left menu) and copy the Connection string. Keep this for later: this is your Storage Key, which is needed for the Pull action from the blob.

Scenario B: Creating New Storage (if needed):
Create a standard Azure Storage Account (e.g., observoarchive).
Inside the account, create a Container (e.g., observo-events).
Ensure the container is accessible by the Observo Site (public access or whitelist).
Go to Access keys and copy the Connection string.
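Before pasting the Connection string into Observo, you can sanity-check that it contains the expected fields. A minimal standard-library sketch (the credential values below are fake placeholders):

```python
# Split an Azure Storage connection string into its key/value parts so you
# can confirm AccountName and AccountKey are present before using it.
def parse_connection_string(conn: str) -> dict:
    parts = (p for p in conn.split(";") if p)
    # split on the first "=" only: base64 AccountKey values may contain "="
    return dict(p.split("=", 1) for p in parts)

conn = ("DefaultEndpointsProtocol=https;"
        "AccountName=observoarchive;"
        "AccountKey=FAKEKEY==;"          # placeholder, not a real key
        "EndpointSuffix=core.windows.net")
fields = parse_connection_string(conn)
print(sorted(fields))   # ['AccountKey', 'AccountName', 'DefaultEndpointsProtocol', 'EndpointSuffix']
```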
Set up the Event Hub: This process involves creating a Namespace first, then the Event Hub inside it.
Step A: Create the Namespace:
Search for Event Hubs in Azure and click Create.
Name the Namespace (e.g., observo-eventhub-notifications).
Select your Subscription/Resource Group and click Review + create.
NOTE: Ensure your Event Hub and your Storage Account/Container are in the same region (e.g., East US).
Step B: Create the Event Hub Instance:
Open the Namespace you just created.
Click Event Hub (top menu).
Name the Event Hub Instance (e.g., observo-triggers).
Click Review + create.

Step C: Get the Connection String:
Open the new Event Hub Instance (observo-triggers).
Click Shared access policies (left menu).
Click Add, name it observo-reader, and check the Listen box.
Click on the new policy and copy the Primary connection string.
Keep this for later: This is your "Event Hub Key". You will need it for the "Notify" action, so Observo knows when new events are available for collection.
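A quick sanity check on the copied string: a policy created at the Namespace level lacks ;EntityPath=, while the instance-level string Observo needs ends with it. The values below are fake placeholders:

```python
# Return True if the Event Hub connection string names a specific hub
# instance via EntityPath (required by the source configuration later on).
def has_entity_path(eh_conn: str) -> bool:
    return any(p.startswith("EntityPath=") for p in eh_conn.split(";"))

good = ("Endpoint=sb://observo-eventhub-notifications.servicebus.windows.net/;"
        "SharedAccessKeyName=observo-reader;SharedAccessKey=FAKE=;"
        "EntityPath=observo-triggers")
bad = good.rsplit(";", 1)[0]    # same string without the EntityPath part

print(has_entity_path(good))    # True
print(has_entity_path(bad))     # False
```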

"Wire" them together (Event Grid):
Go back to your Storage Account > Events.
Click + Event Subscription.
Event Types: Filter for "Blob Created" only.
Endpoint Type: Select Event Hub
Endpoint: Select the Event Hub Instance you created in Step 2B (observo-triggers).
Result: Every time a file lands in the container, Azure sends a "ping" to the Event Hub.
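That "ping" is an Event Grid notification, roughly in the following shape (abridged; field names follow the Event Grid event schema, values are made up). A reader can derive the container and blob to fetch from the subject:

```python
import json

# Abridged, illustrative Event Grid "Blob Created" notification.
event = json.loads("""
{
  "eventType": "Microsoft.Storage.BlobCreated",
  "subject": "/blobServices/default/containers/observo-events/blobs/app.log",
  "data": {
    "url": "https://observoarchive.blob.core.windows.net/observo-events/app.log",
    "contentLength": 1024
  }
}
""")

assert event["eventType"] == "Microsoft.Storage.BlobCreated"

# subject format: /blobServices/default/containers/<container>/blobs/<blob>
prefix, blob_name = event["subject"].split("/blobs/", 1)
container = prefix.rsplit("/", 1)[-1]
print(container, blob_name)     # observo-events app.log
```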

Phase 2: Integration (Observo Source Configuration)
The Integration section outlines the configuration steps for Azure Blob via Event Hub as a source in Observo AI. Follow these steps to set up and test the data flow:
Log in to Observo AI:
Click the “Add Sources” button and select “Create New”.
Choose “Azure Blob Storage Receiver” from the list of available sources to begin configuration.
General Settings:
Name: A unique identifier for the source, such as Blob to EventHub: Observo Logs.
Description: Optional description for the source.
Mode: Whether to use direct Azure Blob directory traversal or rely on Event Hub for blob notifications. Default: Use Event-Hub for Blob Notifications.
Event Hub Endpoint: This is your “Event Hub Key”.
Authentication Method: Connection String.
Connection String: This is your “Storage Key”.
Advanced > Container Name: Enter the name of your container (e.g., observo-events).

Configuration reference:
Mode: azure-event-hub-novagrid-2
Event Hub Endpoint: Paste the Event Hub Key (from Step 2C). CRITICAL: Ensure the string ends with ;EntityPath=observo-triggers (or your hub name).
Authentication Method: Select Connection String.
Connection String: Paste the Storage Key (from Step 1). It starts with DefaultEndpointsProtocol=https...
Advanced > Logs Container Name: Enter the exact container name (e.g., observo-events).
IMPORTANT WARNING: This source uses the $Default Consumer Group. Azure only allows one active reader for this group.
Do not keep the Azure Portal "Data Explorer" tab open while Observo is running.
Do not run other debugging tools connected to this Hub.
If another tool is "watching" the data, Observo may be blocked and will report 0 events.
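The settings above can be sanity-checked before saving. A minimal pre-flight sketch, not part of Observo itself; the connection strings are fake placeholders:

```python
# Pre-flight validation of the three values pasted into the source config:
# the Event Hub endpoint must carry ;EntityPath=, the storage Connection
# String must start with DefaultEndpointsProtocol, and the container name
# must not be empty.
def preflight(event_hub_endpoint: str, storage_conn: str, container: str) -> list[str]:
    problems = []
    if "EntityPath=" not in event_hub_endpoint:
        problems.append("Event Hub endpoint is missing ;EntityPath=<hub name>")
    if not storage_conn.startswith("DefaultEndpointsProtocol="):
        problems.append("Storage connection string does not look like an Azure Storage key")
    if not container:
        problems.append("Container name is empty")
    return problems

errors = preflight(
    "Endpoint=sb://ns.servicebus.windows.net/;SharedAccessKeyName=observo-reader;"
    "SharedAccessKey=FAKE=;EntityPath=observo-triggers",
    "DefaultEndpointsProtocol=https;AccountName=observoarchive;AccountKey=FAKE==",
    "observo-events",
)
print(errors)   # []
```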
NOTE: Source Data Formatting (Crucial): To ensure logs are ingested as events (and not fragmented lines), the source data in the blob container must be newline delimited:
Format: Use NDJSON (Newline Delimited JSON).
Structure: One complete JSON object per line.
No Pretty-Print: Disable pretty-printing (do not use line breaks within a single JSON object).
Example of Valid Log File:
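The following is an illustrative NDJSON log (field names are made up). The key property is that every line parses as a standalone JSON object:

```python
import json

# A valid NDJSON log file: one complete JSON object per line, no
# pretty-printing across lines. Field names are illustrative.
ndjson = (
    '{"timestamp":"2024-05-01T12:00:00Z","level":"info","msg":"user login"}\n'
    '{"timestamp":"2024-05-01T12:00:01Z","level":"error","msg":"db timeout"}\n'
)

# Each line must parse on its own; a pretty-printed object would fail here.
events = [json.loads(line) for line in ndjson.splitlines() if line.strip()]
print(len(events))          # 2
print(events[1]["level"])   # error
```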
Phase 3: Observo Parser Settings
Go to Parser Config in the source settings.
Set Enable Source Log parser to True.
Select JSON.
As needed: use “Explode Array Events” and “Merge Exploded with original Event”
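Conceptually (this sketch is not Observo's implementation), “Explode Array Events” splits one event carrying an array into one event per element, and “Merge Exploded with original Event” copies the parent's remaining fields onto each child:

```python
# Split a single event containing an array field into one event per element.
# With merge=True, each child event also inherits the parent's other fields.
def explode(event: dict, array_field: str, merge: bool = True) -> list[dict]:
    children = event.get(array_field, [])
    base = {k: v for k, v in event.items() if k != array_field} if merge else {}
    return [{**base, **child} for child in children]

parent = {"source": "app1", "records": [{"msg": "a"}, {"msg": "b"}]}
out = explode(parent, "records")
print(out)  # [{'source': 'app1', 'msg': 'a'}, {'source': 'app1', 'msg': 'b'}]
```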

Before Parser:
After Parser:
Verification Checklist:
Save the Observo Source.
Start the Data Plane Tap: dataplane tap [SOURCE_ID].
Upload a new test file (formatted as NDJSON) to the Azure container.
Success: You will see the clean JSON objects scroll in the tap immediately.
Troubleshooting
Use these steps to resolve common issues during setup and operation:
Verify Configuration Settings:
Confirm Azure Storage Account, Event Hub, and Event Grid are correctly configured.
Verify the Event Hub connection string includes ;EntityPath=.
Check Connection Strings:
Ensure the Event Hub Endpoint and Storage Connection String are correctly pasted in Observo AI.
Verify that credentials have not expired and have appropriate permissions.
Verify Event Subscription:
Go to the Storage Account → Events and verify the event subscription is active.
Upload a test file to the container and check the Event Hub Metrics to confirm events are being routed.
Test Data Flow:
Save the Observo source configuration.
Start the Data Plane Tap: dataplane tap [SOURCE_ID].
Upload a new test file (formatted as NDJSON) to the Azure container.
Success: You should see the parsed JSON objects appear in the tap immediately.
Common issues and resolutions:
Data not ingested: the Event Hub connection string is incorrect or missing EntityPath. Verify the Event Hub Endpoint includes ;EntityPath=.
Authorization errors: invalid or expired credentials. Check the Storage Connection String and Event Hub credentials.
Connectivity issues: firewall or private endpoint restrictions. Allow HTTPS on port 443 and verify network access.
Events not received: the Event Grid subscription is not configured or is inactive. Verify the event subscription targets the correct Event Hub.
Consumer group conflict: another tool is using the $Default consumer group. Close the Azure Portal Data Explorer and other debugging tools.