OCSF Serializer

Overview

The OCSF Serializer is a powerful data transformation tool in Observo AI that formats your security logs into the OCSF (Open Cybersecurity Schema Framework) standard, enabling consistent log analysis across multiple data sources.

Purpose

The OCSF Serializer is a specialized transform within the Observo AI platform that converts security event data from various sources into a standardized OCSF schema format. This transformation happens in real-time as data flows through your Observo AI pipeline, before the logs are ingested into your destination observability platform or SIEM.

Usage

The OCSF Serializer operates as part of Observo AI's intelligent data pipeline:

  1. Data Ingestion: Observo AI receivers collect raw security logs from various sources (Okta, Netskope, Wiz, AWS CloudTrail, Cisco Duo, Proofpoint, etc.)

  2. Transform Layer: As logs flow through the pipeline, the OCSF Serializer transform intercepts and processes each event

  3. Source Detection: Observo AI automatically identifies the log source and applies the appropriate serializer version

  4. Field Mapping: Source-specific Lua scripts map native log fields to standardized OCSF schema fields

  5. Data Normalization:

    • Timestamps are converted to consistent formats

    • Severity levels are standardized

    • Field names and data types are normalized to OCSF specifications

  6. Custom Enrichment: Optional Lua scripts add custom fields, business context, or additional metadata

  7. Validation: Transformed data is validated against OCSF schema requirements

  8. Output Delivery: OCSF-formatted logs are delivered to your destination system (SIEM, data lake, observability platform)
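The field-mapping and normalization steps above can be sketched in Lua, the language the serializer scripts use. This is an illustrative sketch only; the field names (published, displayMessage) and the severity table are assumptions modeled on an Okta-style event, not the actual Observo AI serializer script.

```lua
-- Hypothetical field-mapping sketch: project a raw Okta-style event
-- onto a few OCSF fields and normalize the severity level.
local severity_map = {
  DEBUG = 1, INFO = 2, WARN = 3, ERROR = 4, FATAL = 5,
}

local function to_ocsf(event)
  return {
    time        = event.published,                   -- source timestamp field
    severity_id = severity_map[event.severity] or 0, -- 0 = Unknown in OCSF
    message     = event.displayMessage,
    metadata    = { product = { vendor_name = "Okta" } },
  }
end

local out = to_ocsf({
  published = "2024-01-15T10:00:00Z",
  severity = "WARN",
  displayMessage = "User login",
})
print(out.severity_id)  -- 3
```

The `or 0` fallback matters: events with a missing or unrecognized severity still serialize cleanly instead of failing mid-pipeline.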

Adding Transforms to Your Pipeline

  1. Go to your pipeline and click Edit Pipeline

  2. Click on the + sign to add transforms to your data pipeline

  3. Select and add transforms to optimize and enrich your data

  4. Add the OCSF Serializer to format your logs into the OCSF standard

Transforms allow you to normalize, enrich, and standardize your security event data before ingestion into your observability platform.

OCSF Serializer Configuration

Field Descriptions

The OCSF Serializer transform includes the following configuration options:

Enabled

  • Type: Boolean

  • Description: Enable or disable this transform

  • Default: True

Serializer

  • Type: Select (dropdown)

  • Category: Advanced Settings

  • Description: Determines which serializer is used to serialize events from a specific source

  • Required: Yes

Lua Script

  • Type: Code (Lua language)

  • Description: Custom Lua script to be executed for each incoming event.

  • Customizable: Yes - users can add custom field mapping logic

Parallelism

  • Type: Number (integer)

  • Description: Number of parallel Lua transform runtime instances to run

  • Default: 1

Supported OCSF Serializers

The following table shows the available OCSF serializers, their configurations, and supported data sources:

Observo Source Name           | OCSF Serializer Name       | Severity Attribute
------------------------------|----------------------------|-------------------
Okta Log Collector            | Okta (1.0.0)               | severity
Netskope Alerts               | Netskope (1.5.0)           | None
Wiz Graph API Collector       | Wiz Issue (1.0.0-rc3)      | severity
Proofpoint SIEM API Collector | Proofpoint (1.0.0)         | None
Cisco Duo Logs Collector      | Cisco Duo (1.0.0)          | None
AWS S3                        | AWS CloudTrail (1.0.0-rc3) | None

Time Attributes for Data Ingestion

Each OCSF serializer uses a specific time field from the source data for ingestion:

  • Okta (1.0.0): Uses the published field as the timestamp

  • Netskope (1.5.0): Uses the timestamp field

  • Wiz Issue (1.0.0-rc3): Uses the updatedAt field

  • Proofpoint (1.0.0): Uses messageTime for Message Events and clickTime for Click Events

  • Cisco Duo (1.0.0): Uses the isotimestamp field

  • AWS CloudTrail (1.0.0-rc3): Uses the eventTime field

These time attributes are critical for proper chronological ordering and time-based queries in your observability platform.
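The serializer-to-time-field pairing above can be expressed as a simple lookup. This is a hypothetical helper for illustration, not the platform's internal implementation; the serializer names mirror the list above.

```lua
-- Hypothetical helper: pick the ingestion timestamp field for a given
-- serializer, mirroring the time-attribute list above.
-- Proofpoint (1.0.0) is omitted because its field depends on the event
-- type (messageTime for Message Events, clickTime for Click Events).
local time_field = {
  ["Okta (1.0.0)"]               = "published",
  ["Netskope (1.5.0)"]           = "timestamp",
  ["Wiz Issue (1.0.0-rc3)"]      = "updatedAt",
  ["Cisco Duo (1.0.0)"]          = "isotimestamp",
  ["AWS CloudTrail (1.0.0-rc3)"] = "eventTime",
}

local function ingestion_time(serializer, event)
  local field = time_field[serializer]
  return field and event[field] or nil
end

print(ingestion_time("Okta (1.0.0)", { published = "2024-01-15T10:00:00Z" }))
-- 2024-01-15T10:00:00Z
```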

Lua Script Customization

Overview

Each OCSF serializer has different Lua scripts for field mapping that are specific to the data source format. These scripts handle the transformation from the native log format to OCSF schema.

Key Features

  • Source-Specific Mapping: Each serializer (Netskope, Okta, Wiz, Proofpoint, Cisco Duo, CloudTrail) has its own Lua script tailored to map that source's fields to OCSF

  • Fully Customizable: Users can modify the Lua scripts to add custom field mappings
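A custom mapping added to a serializer's Lua script typically receives the mapped event and extends it. The sketch below is an assumed example: the label values and the `is_internal` flag are illustrative, and the `src_endpoint` structure follows OCSF's network-endpoint object.

```lua
-- Hypothetical custom enrichment: extend the serializer's output with
-- business context. `event` is the OCSF-shaped table produced by the
-- built-in mapping; all added field names are examples.
local function enrich(event)
  event.metadata = event.metadata or {}
  event.metadata.labels = { "observo", "prod" }  -- example tags
  if event.src_endpoint and event.src_endpoint.ip then
    -- flag 10.0.0.0/8 addresses as internal (extend for other RFC 1918 ranges)
    event.src_endpoint.is_internal =
      event.src_endpoint.ip:match("^10%.") ~= nil
  end
  return event
end

local e = enrich({ src_endpoint = { ip = "10.1.2.3" } })
print(e.src_endpoint.is_internal)  -- true
```

Guarding each access (`event.src_endpoint and ...`) keeps the script safe for events that lack the field entirely.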

Testing Transformations

Edit Mode Testing

Transformations can be tested with sample data in edit mode before deployment:

  1. Navigate to the transform configuration in edit mode

  2. Provide sample event data that matches your source format

  3. Execute the transformation to see the OCSF-formatted output

  4. Validate that all required fields are properly mapped

  5. Verify custom field mappings work as expected

  6. Review the output for any errors or warnings

Best Practices for Testing

  • Use real production-like sample data

  • Test edge cases (missing fields, null values, unexpected formats)

  • Validate timestamp conversions

  • Verify severity mappings (if applicable)

  • Check nested object transformations

  • Ensure custom enrichments function correctly
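The edge cases above translate directly into small checks you can run against a mapping function before deploying it. A minimal sketch, assuming a severity-mapping helper like the one a custom script might define:

```lua
-- Hypothetical edge-case checks for a severity mapping: missing fields
-- and unexpected values should degrade gracefully, not raise errors.
local severity_map = { INFO = 2, WARN = 3 }

local function map_severity(event)
  return severity_map[event.severity] or 0  -- 0 = Unknown
end

assert(map_severity({ severity = "WARN" }) == 3)   -- normal case
assert(map_severity({ severity = "BOGUS" }) == 0)  -- unexpected value
assert(map_severity({}) == 0)                      -- missing field
print("edge cases pass")
```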

Implementation Workflow

Step 1: Access the Pipelines Section

  1. Log into your Observo account.

  2. Navigate to the “Pipelines” section from the left-hand panel.

Step 2: Create a New Pipeline

  1. Click on “Create Pipeline”.

  2. From the dropdown, select the source of your logs (e.g., S3 bucket).

  3. In the Destination section, select the AWS Security Lake destination previously configured.

Step 3: Add OCSF Serializer and Other Transforms

  1. Click on the + sign to add transforms.

  2. Select and add transforms to optimize and enrich your data.

  3. Add the OCSF Serializer to format your logs into the OCSF standard.

  4. Select the Appropriate Serializer

    • Choose the serializer that matches your data source

    • Verify the serializer version is compatible with your data source

  5. Configure Time and Severity Mappings

    • Verify the time attribute matches your data source's timestamp field

    • Configure severity mapping if applicable to your use case

  6. Test with Sample Data

    • Use edit mode to test with representative sample data

    • Validate all mappings work correctly

    • Check for any errors or warnings

  7. Set Parallelism

    • Configure the number of parallel Lua runtime instances

Step 4: Deploy the Pipeline

  1. After configuring the pipeline (source, transforms, OCSF serializer, and destination), click “Deploy Pipeline”.

  2. Monitor logs and statuses to ensure data flows without issues.

Security

  • Access to system resources is restricted

  • Scripts cannot access external networks or file systems

  • All custom code should be reviewed for security best practices

Resources

For additional assistance with OCSF Serializer configuration:

  • Refer to the OCSF Schema documentation at ocsf.io
