Lookup Tables
Overview
Observo supports the use of lookup tables to enhance data context within observability pipelines. By allowing integration with external data sources, lookup tables serve as a mechanism to enrich logs and metrics with additional information, such as geographic data or user attributes. This feature enables more nuanced and informed analysis by correlating external data with observability data, thus improving the accuracy and depth of insights generated from the systems being monitored.
Creating a Lookup Table
Within Observo, navigate to Settings > Files > Add File.
Types of Lookup Tables
File Based Lookup Tables

File Based Lookup Tables in Observo allow you to upload a CSV file and specify which columns to use for lookups. This option is particularly useful when you have static data that doesn't change frequently and can be captured in a file. By mapping the columns during the upload process, you can efficiently retrieve data based on predefined keys.
File Schema
A file schema in Observo defines the structure of data files, specifying how data is organized and what types of data are included. For example, a file schema for a CSV file might specify that it contains two columns: "IP" (a string value) and "ThreatLevel" (a numeric value). To define this schema in Observo, you create a schema object referencing each field with the appropriate data type and constraints. This ensures that every file adheres to the expected format, allowing reliable data parsing and processing.
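As an illustration, a CSV file conforming to the example IP/ThreatLevel schema above might look like the following. The file name and values are illustrative only; the `awk` check is a quick local sanity test you can run before uploading, not part of Observo itself.

```shell
#!/bin/bash
# Create a sample CSV that matches the example schema:
# "IP" as a string column, "ThreatLevel" as a numeric column.
cat > threat_lookup.csv <<'EOF'
IP,ThreatLevel
203.0.113.10,8
198.51.100.42,3
192.0.2.77,5
EOF

# Sanity-check: confirm every ThreatLevel value is numeric
# before uploading the file as a lookup table.
awk -F',' 'NR > 1 && $2 !~ /^[0-9]+$/ { bad=1 } END { print (bad ? "invalid" : "valid") }' threat_lookup.csv
```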
Dynamic Lookup Tables
Dynamic Lookup Tables in Observo enable users to define scripts that fetch data from third-party sources, such as external APIs, on a schedule defined by a cron expression. This feature is particularly useful when dealing with tables that are constantly changing. For example, a threat intel feed of IP addresses may be updated every few hours, and it's crucial to have the most up-to-date context when enriching your observability data.
Script
The script defined as part of a Dynamic Lookup Table is executed on the schedule set by the cron expression. It can be any bash script that pulls a CSV file from an external source and prints it to stdout.

#!/bin/bash
# Fetch a threat intel CSV from an external API and print it to stdout.
API_URL="https://api.example.com/threat-feed"

fetch_threat_intel() {
    # Use curl to make the API request:
    # -s for silent mode, -H to set headers
    curl -s -H "Authorization: Bearer $API_KEY" "$API_URL"
}

# Call the function to get the CSV data
csv_data=$(fetch_threat_intel)

# Check whether the API call succeeded
if [ $? -eq 0 ]; then
    echo "$csv_data"
else
    # Write the error to stderr so it is not mistaken for CSV output
    echo "Error: Failed to retrieve threat intelligence data." >&2
    exit 1
fi
Cron Expression
Cron expressions are used to schedule the execution of scripts at specific intervals. In the context of dynamic lookup tables, they determine how frequently data updates occur by setting the timing for when the script runs.
Here is an example of a cron expression:
0 */6 * * *

This expression schedules a script to run every 6 hours. The components of the cron expression are as follows:

0 - Minute field, set to 0.
*/6 - Hour field, runs every 6 hours.
* - Day of the month field, runs every day.
* - Month field, runs every month.
* - Day of the week field, runs every day of the week.

Secret File
In Observo, you can select existing Secret values to be used as environment variables within your dynamic lookup script. Once selected, these values will be configured to load as environment variables, ensuring they are securely accessed in your script execution environment. This integration allows your dynamic lookup logic to leverage sensitive data without hardcoding it into the script, thus maintaining security and flexibility in your data handling processes.
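For example, if a Secret is selected and exposed to the script as an environment variable, the script can reference it without ever hardcoding the value. A minimal sketch follows; the variable name API_KEY and the `require_secret` helper are assumptions for illustration, not Observo APIs.

```shell
#!/bin/bash
# Guard helper: fail fast when a required Secret-backed
# environment variable has not been injected.
require_secret() {
    local name="$1"
    # ${!name} is bash indirect expansion: the value of the
    # variable whose name is stored in $name.
    if [ -z "${!name}" ]; then
        echo "Error: required secret $name is not set." >&2
        return 1
    fi
}

# Simulate a Secret being injected as an environment variable.
export API_KEY="example-token"

require_secret API_KEY && echo "API_KEY is available to the script."
# The secret is then used without appearing in the script body:
# curl -s -H "Authorization: Bearer $API_KEY" "$API_URL"
```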
File Schema
A file schema defines the structure of the data file, as described under File Based Lookup Tables above: each field is referenced with the appropriate data type and constraints (for example, "IP" as a string value and "ThreatLevel" as a numeric value). The CSV output of the script defined as part of the Dynamic Lookup Table must adhere to the schema defined here so that it can be parsed and processed reliably.
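One simple way to catch schema drift in a dynamic lookup script is to verify the CSV header before printing the data. This is an optional defensive pattern, not an Observo requirement; the header "IP,ThreatLevel" and the static sample data are assumptions matching the example schema.

```shell
#!/bin/bash
# Verify the fetched CSV's header matches the schema's columns
# before emitting it to stdout.
expected_header="IP,ThreatLevel"

# In a real lookup script this would come from the fetch function,
# e.g. csv_data=$(fetch_threat_intel); a static sample is used here.
csv_data=$'IP,ThreatLevel\n203.0.113.10,8'

actual_header=$(printf '%s\n' "$csv_data" | head -n 1)
if [ "$actual_header" = "$expected_header" ]; then
    printf '%s\n' "$csv_data"
else
    echo "Error: CSV header '$actual_header' does not match schema." >&2
    exit 1
fi
```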
Using Lookup Tables in Pipelines

