Log Archival
Introduction
Basic Setup
name: "event-archival"
description: "Event data archival to Azure Blob Storage"
container_name: "eventdata"
storage_account: "myarchivalaccount"

JSON Configuration
name: "json-archival"
description: "JSON event archival with GZip compression"
container_name: "jsoneventdata"
storage_account: "myarchivalaccount"
encoding:
  codec: "JSON Encoding"
  compression: "GZip"
blob_prefix: "year=%Y/month=%m/day=%d"
blob_time_format: "%s"
batch_max_events: 1000
batch_timeout_secs: 300

Parquet Configuration
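The Parquet configuration block itself did not survive extraction here. By analogy with the JSON example above, a plausible sketch is shown below — the codec name, sink name, and container name are assumptions, not confirmed values:

```yaml
name: "parquet-archival"
description: "Parquet event archival to Azure Blob Storage"
container_name: "parqueteventdata"
storage_account: "myarchivalaccount"
encoding:
  codec: "Parquet Encoding"
blob_prefix: "year=%Y/month=%m/day=%d"
blob_time_format: "%s"
batch_max_events: 1000
batch_timeout_secs: 300
```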
Advanced Configuration Tips
Batching Optimization
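The `batch_max_events` and `batch_timeout_secs` settings in the configurations above flush a batch when either limit is reached. A minimal sketch of that flush policy in Python (the `Batcher` class and its method names are illustrative, not the product's API):

```python
import time

class Batcher:
    """Accumulates events and signals a flush when either the size
    limit or the timeout is reached, mirroring batch_max_events and
    batch_timeout_secs in the sink configuration."""

    def __init__(self, max_events=1000, timeout_secs=300):
        self.max_events = max_events
        self.timeout_secs = timeout_secs
        self.events = []
        self.started = time.monotonic()

    def add(self, event):
        self.events.append(event)
        return self.should_flush()

    def should_flush(self):
        full = len(self.events) >= self.max_events
        stale = time.monotonic() - self.started >= self.timeout_secs
        return full or stale

    def drain(self):
        """Hand back the current batch and reset the timer."""
        batch, self.events = self.events, []
        self.started = time.monotonic()
        return batch

b = Batcher(max_events=3, timeout_secs=300)
for e in ("a", "b", "c"):
    flush = b.add(e)
print(flush)           # True: the third event hits max_events
print(len(b.drain()))  # 3
```

Larger batches produce fewer, bigger blobs (cheaper listing and better compression), while a shorter timeout bounds how stale archived data can be.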
Blob Naming Strategy
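The `blob_prefix: "year=%Y/month=%m/day=%d"` setting is a strftime pattern, and `blob_time_format: "%s"` names each file by its Unix epoch seconds. A short sketch of the resulting blob path (the `blob_name` helper and the `.json.gz` extension are illustrative; epoch seconds are computed directly rather than via strftime's platform-dependent `%s`):

```python
from datetime import datetime, timezone

def blob_name(ts, prefix="year=%Y/month=%m/day=%d", ext="json.gz"):
    """Render the configured blob_prefix with strftime, then append
    the epoch-seconds file name implied by blob_time_format: "%s"."""
    return f"{ts.strftime(prefix)}/{int(ts.timestamp())}.{ext}"

ts = datetime(2024, 1, 15, 10, 0, tzinfo=timezone.utc)
print(blob_name(ts))  # year=2024/month=01/day=15/1705312800.json.gz
```

The `year=/month=/day=` layout is Hive-style partitioning, which lets query engines prune whole date ranges without reading the underlying blobs.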
Health Monitoring
Sample Output Files
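With the JSON codec and GZip compression configured above, each output blob would typically hold a newline-delimited JSON batch, gzip-compressed. A sketch of that encoding round trip (the exact on-blob layout is an assumption; the sample events are invented for illustration):

```python
import gzip
import json

events = [
    {"timestamp": "2024-01-15T10:00:00Z", "level": "INFO", "message": "user login"},
    {"timestamp": "2024-01-15T10:00:01Z", "level": "ERROR", "message": "db timeout"},
]

# Encode the batch as newline-delimited JSON, then gzip it,
# as the codec/compression settings would before upload.
ndjson = "\n".join(json.dumps(e) for e in events).encode()
blob = gzip.compress(ndjson)

# Decompress and parse line by line to recover the stored content.
restored = [json.loads(line) for line in gzip.decompress(blob).splitlines()]
print(restored == events)  # True
```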
Parquet Configuration with observo_record Example
Sample Configuration
Example Input Log
Resulting Parquet File Content
Key Benefits of This Approach
Example Queries
Another Example with Application Metrics
Input Log
Resulting Parquet Storage
Best Practices for Parquet Schema Design with observo_record
Best Practices
Conclusion