Observo Sample Logs

Produces various kinds of mock log entries, useful for testing.

Purpose

The Observo Sample Logs source lets users ingest a predefined collection of simulated log data into the Observo AI platform for testing, demonstration, and training. It provides sample logs in formats such as JSON, CSV, or plain text, so organizations can experiment with data pipelines, validate configurations, and explore observability and analytics features without pulling live data from external systems. This makes it possible to simulate real-world scenarios and test processing workflows in a controlled environment.

Prerequisites

Before configuring the Observo Sample Logs source in Observo AI, ensure the following requirements are met to facilitate seamless data ingestion:

  • Observo AI Platform Setup:

    • The Observo AI Site must be installed and available.

    • Validate data formats, such as JSON, CSV, or text, for compatibility with sample logs provided by Observo AI.

  • Observo Sample Logs Access:

    • Access to Observo AI’s built-in sample log dataset, a predefined collection of simulated logs for testing and demonstration purposes.

    • Obtain the sample log identifier or path from the Observo AI documentation or admin interface.

  • Network and Connectivity:

    • Ensure Observo AI can internally access the sample log dataset, which is typically hosted within the platform.

| Prerequisite | Description | Notes |
| --- | --- | --- |
| Observo AI Platform | Must support the Observo Sample Logs source | Verify data format compatibility |
| Observo Sample Logs | Access to the built-in sample log dataset | Obtain the identifier or path from the documentation |
| Network | Connectivity to the sample log location | Allow internal access or check the custom endpoint |

Integration

To configure Observo Sample Logs as a source in Observo AI, follow these steps:

  1. Log in to Observo AI:

    • Navigate to Sources Tab

    • Click the "Add Sources" button and select "Create New"

    • Choose "Observo Sample Logs" from the list of available sources to begin configuration.

  2. General Settings:

    • Name: Add a unique identifier such as observo-sample-logs-1

    • Description (Optional): Add description

    • Log Format: Select from the options.

      Options

      Apache Common Logs

      HTTP server logs in JSON format

      Syslog RFC 3164 format

      Syslog RFC 5424 format

    • No of lines to output (Default: infinity): Set a finite value to limit output as needed.

    • Time in seconds to pause between batches (Default: 1): Increase as needed.

  3. Framing (Optional):

    • Framing Delimiter (Empty): The character that delimits byte sequences.

    • Framing Max Length (None): The maximum length of the byte buffer. This length does not include the trailing delimiter. By default, there is no maximum length enforced. If events are malformed, this can lead to additional resource usage as events continue to be buffered in memory, and can potentially lead to memory exhaustion in extreme cases. If there is a risk of processing malformed data, such as logs with user-controlled input, consider setting the maximum length to a reasonably large value as a safety net. This will ensure that processing is not truly unbounded.

    • Framing Method (Empty): The framing method.

      Options

      Byte Frames

      Character Delimited

      Length Delimited

      Newline Delimited (byte frames which are delimited by a newline character)

      Octet Counting

    • Framing Newline Delimited Max Length (None): The maximum length of the byte buffer when using newline-delimited framing, not including the trailing delimiter. The same caveats about unbounded buffering and potential memory exhaustion described under Framing Max Length apply; if malformed or user-controlled data is a risk, set a reasonably large maximum as a safety net.

    • Framing Octet Counting Max Length (None): The maximum length of the byte buffer.

  4. Parser Config:

    • Enable Source Log Parser (False): Toggle the Enable Source Log Parser switch to enable parsing.

    • Select the appropriate parser from the Source Log Parser dropdown.

    • Add additional parsers as needed.

  5. Pattern Extractor:

    • See Pattern Extractor for details.

  6. Archival Destination:

    • Toggle the Enable Archival on Source switch to enable.

    • Under Archival Destination, select from the list of Archival Destinations (Required)

  7. Save and Test Configuration:

    • Save the configuration settings.

    • Verify that data is being ingested from the Observo Sample Logs dataset.
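The Log Format options in step 2 each produce a differently shaped line. As a rough illustration of what those shapes look like, here is a minimal mock-log emitter; all field values are invented placeholders, not Observo AI's actual sample data:

```python
import json
import time

def mock_line(log_format: str) -> str:
    """Return one invented sample line in the chosen format.

    Field values below are illustrative placeholders only.
    """
    ts = time.strftime("%d/%b/%Y:%H:%M:%S +0000", time.gmtime(0))
    if log_format == "apache_common":
        # Apache Common Log Format: host ident authuser [date] "request" status bytes
        return f'192.0.2.1 - frank [{ts}] "GET /index.html HTTP/1.0" 200 2326'
    if log_format == "http_json":
        # HTTP server log as a single JSON object per line
        return json.dumps({"method": "GET", "path": "/index.html",
                           "status": 200, "bytes": 2326})
    if log_format == "syslog_rfc3164":
        # RFC 3164: <PRI>TIMESTAMP HOSTNAME TAG: MSG
        return "<34>Oct 11 22:14:15 mymachine su: 'su root' failed on /dev/pts/8"
    if log_format == "syslog_rfc5424":
        # RFC 5424: <PRI>VERSION TIMESTAMP HOSTNAME APP-NAME PROCID MSGID SD MSG
        return "<34>1 2003-10-11T22:14:15.003Z mymachine su - ID47 - 'su root' failed"
    raise ValueError(f"unknown format: {log_format}")

for fmt in ("apache_common", "http_json", "syslog_rfc3164", "syslog_rfc5424"):
    print(mock_line(fmt))
```

Printing one line per format like this is a handy way to sanity-check downstream parsers against each shape before wiring up the real source.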

Example Scenarios

RetailTrend Innovations is a fictitious mid-sized retail chain specializing in eco-friendly clothing and accessories, operating both physical stores and an e-commerce platform. To enhance their observability and analytics capabilities, RetailTrend Innovations uses the Observo AI platform to monitor and analyze system performance, customer interactions, and transaction logs. For testing and training purposes, the IT team decides to configure the Observo Sample Logs source to simulate HTTP server logs in JSON format, mimicking web traffic and transaction data from their e-commerce platform. This allows them to test data pipelines, validate configurations, and train staff on observability features in a controlled environment without relying on live customer data.

Standard Observo Sample Logs Source Setup

Here is a standard Observo Sample Logs Source configuration example. Only the required sections and their associated field updates are displayed in the table below:

General Settings

| Field | Value | Notes |
| --- | --- | --- |
| Name | observo-sample-logs-retailtrend-1 | Unique identifier for the source, specific to RetailTrend Innovations' configuration. |
| Description | Simulated HTTP server logs for e-commerce platform testing | Optional description to clarify the purpose of the source. |
| Log Format | HTTP server logs in JSON format | Selected to simulate web traffic and transaction logs for the e-commerce platform. |
| No of lines to output | 10000 | Limits the dataset to 10,000 lines for testing, overriding the default of infinity. |
| Time in seconds to pause between batches | 2 | A two-second pause allows controlled data ingestion for testing. |

Framing

| Field | Value | Notes |
| --- | --- | --- |
| Framing Delimiter | , | Comma as the character that delimits byte sequences, suitable for these JSON log entries. |
| Framing Max Length | 1024 | Caps the buffer at 1024 bytes, preventing memory issues with malformed data. |
| Framing Method | Character Delimited | Frames logs using the comma delimiter. |
| Framing Newline Delimited Max Length | 1024 | Matches Framing Max Length for consistent buffer limits. |
| Framing Octet Counting Max Length | 1024 | Set to 1024 bytes for consistency across framing configurations. |
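The framing values above interact: the delimiter decides where one event ends, and the max length bounds how much is buffered for a single event. As a sketch of what character-delimited framing with a max length does to a byte stream (an illustrative model, not Observo AI's internal implementation):

```python
def frame_character_delimited(stream: bytes, delimiter: bytes = b",",
                              max_length: int = 1024) -> list[bytes]:
    """Split a byte stream into frames on `delimiter`, dropping any
    frame that exceeds `max_length` (the cap excludes the trailing
    delimiter, as in the Framing Max Length setting)."""
    frames = []
    for frame in stream.split(delimiter):
        if len(frame) > max_length:
            continue  # discard oversized (possibly malformed) frame
        if frame:
            frames.append(frame)
    return frames

# The 2000-byte run exceeds the cap and is discarded;
# the three small JSON objects survive as separate frames.
chunks = frame_character_delimited(b'{"a":1},{"b":2},' + b"x" * 2000 + b',{"c":3}')
```

Note that a comma delimiter only works while the JSON entries contain no top-level commas, as in these single-field examples; for multi-field JSON logs, newline-delimited framing is generally the safer choice.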

Parser Config

| Description | Details |
| --- | --- |
| Enable the parser | Toggle the "Enable Source Log Parser" switch to true. |
| JSON Parser | Select the JSON Parser from the Source Log Parser dropdown to match the HTTP server logs format. |
| Additional parsers | None needed for this configuration. |
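Selecting the JSON Parser means each ingested line is decoded into structured fields that later pipeline stages can filter and route on. Conceptually (the field names here are invented examples, not a fixed schema):

```python
import json

# One simulated HTTP server log line as emitted by the source
raw_line = '{"method": "POST", "path": "/checkout", "status": 201, "bytes": 512}'

# The JSON parser turns the raw text into structured fields
event = json.loads(raw_line)
print(event["path"], event["status"])  # prints: /checkout 201
```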

Troubleshooting

If issues arise with the Observo Sample Logs source in Observo AI, use the following steps to diagnose and resolve them:

  • Verify Configuration Settings:

    • Ensure Log Dataset and other fields are correctly configured and match the Observo AI sample log setup.

    • Confirm that the dataset identifier or path exists and contains sample data.

    • Ensure the setup allows open access to the sample logs.

  • Monitor Logs:

    • Check Observo AI’s Logs tab for errors or warnings related to data ingestion.

    • Verify sample log activity using Observo AI’s built-in tools or preview features.

  • Validate Connectivity:

    • Ensure Observo AI can access the sample log dataset, hosted internally.

  • Common Error Messages:

    • "Dataset not found": Indicates an invalid or missing dataset identifier. Verify the Log Dataset name or path.

    • "No data ingested": Confirm the sample dataset contains logs and check the No of lines to output setting for limits.

  • Test Data Flow:

    • Verify data ingestion within the targeted Observo AI pipeline.

    • Use the Analytics tab in the targeted Observo AI pipeline to monitor data volume and ensure expected throughput.

| Issue | Possible Cause | Resolution |
| --- | --- | --- |
| Data not ingested | Incorrect dataset identifier | Verify the dataset exists and the path is correct |
| No data in dataset | Empty sample dataset | Verify logs exist in the dataset |

