OCSF Serializer
Overview
The OCSF Serializer is a powerful data transformation tool in Observo AI that formats your security logs into the OCSF standard, enabling consistent log analysis across multiple data sources.
Purpose
The OCSF Serializer is a specialized transform within the Observo AI platform that converts security event data from various sources into a standardized OCSF schema format. This transformation happens in real-time as data flows through your Observo AI pipeline, before the logs are ingested into your destination observability platform or SIEM.
Usage
The OCSF Serializer operates as part of Observo AI's intelligent data pipeline:
Data Ingestion: Observo AI receivers collect raw security logs from various sources (Okta, Netskope, Wiz, AWS CloudTrail, Cisco Duo, Proofpoint, etc.)
Transform Layer: As logs flow through the pipeline, the OCSF Serializer transform intercepts and processes each event
Source Detection: Observo AI automatically identifies the log source and applies the appropriate serializer version
Field Mapping: Source-specific Lua scripts map native log fields to standardized OCSF schema fields
Data Normalization:
Timestamps are converted to consistent formats
Severity levels are standardized
Field names and data types are normalized to OCSF specifications
Custom Enrichment: Optional Lua scripts add custom fields, business context, or additional metadata
Validation: Transformed data is validated against OCSF schema requirements
Output Delivery: OCSF-formatted logs are delivered to your destination system (SIEM, data lake, observability platform)
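The normalization stage above can be illustrated with a minimal Lua sketch. Note that the process() hook name, the event table layout, and the severity values shown are assumptions for illustration only, not Observo AI's actual API.

```lua
-- Illustrative sketch of the normalization step (hypothetical event shape,
-- not the platform's actual API).

-- Map source-specific severity strings to OCSF-style severity_id values.
local severity_map = {
  DEBUG = 1, INFO = 2, LOW = 3, MEDIUM = 4, HIGH = 5, CRITICAL = 6,
}

local function process(event)
  local ocsf = {}
  -- Standardize severity: unknown or missing values fall back to 0 ("Unknown").
  local sev = event.severity and string.upper(event.severity) or ""
  ocsf.severity_id = severity_map[sev] or 0
  -- Carry the original value through for reference.
  ocsf.severity = event.severity
  return ocsf
end

local out = process({ severity = "high" })
print(out.severity_id)  --> 5
```

In a real serializer script, the same pattern extends to timestamps and field renames; the key idea is that each incoming event is transformed field by field into the OCSF shape before leaving the pipeline.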
Adding Transforms to Your Pipeline
Go to your pipeline and click Edit Pipeline
Click on the + sign to add transforms to your data pipeline
Select and add transforms to optimize and enrich your data
Add the OCSF Serializer to format your logs into OCSF standard
Transforms allow you to normalize, enrich, and standardize your security event data before ingestion into your observability platform.
OCSF Serializer Configuration
Field Descriptions
The OCSF Serializer transform includes the following configuration options:
Enabled
Type: Boolean
Description: Enable or disable this transform
Default: True
Serializer
Type: Select (dropdown)
Category: Advanced Settings
Description: Determines which serializer is used to serialize events from a specific source
Required: Yes
Lua Script
Type: Code (Lua language)
Description: Custom Lua script to be executed for each incoming event.
Customizable: Yes - users can add custom field mapping logic
Parallelism
Type: Number (integer)
Description: Number of parallel Lua transform runtime instances to run
Default: 1
Supported OCSF Serializers
The following table shows the available OCSF serializers, their configurations, and supported data sources:
Data Source                      Serializer                    Severity Field
Okta Log Collector               Okta (1.0.0)                  severity
Netskope Alerts                  Netskope (1.5.0)              None
Wiz Graph API Collector          Wiz Issue (1.0.0-rc3)         severity
Proofpoint SIEM API Collector    Proofpoint (1.0.0)            None
Cisco Duo Logs Collector         Cisco Duo (1.0.0)             None
AWS S3                           AWS CloudTrail (1.0.0-rc3)    None
Time Attributes for Data Ingestion
Each OCSF serializer uses a specific time field from the source data for ingestion:
Okta (1.0.0): Uses the published field as the timestamp
Netskope (1.5.0): Uses the timestamp field
Wiz Issue (1.0.0-rc3): Uses the updatedAt field
Proofpoint (1.0.0): Uses messageTime for Message Events and clickTime for Click Events
Cisco Duo (1.0.0): Uses the isotimestamp field
AWS CloudTrail (1.0.0-rc3): Uses the eventTime field
These time attributes are critical for proper chronological ordering and time-based queries in your observability platform.
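The per-serializer time fields listed above can be captured in a small lookup table, sketched here in Lua. The table and the ingestion_time() helper are illustrative only; the field names themselves come from the list above.

```lua
-- Illustrative lookup of the ingestion time field per OCSF serializer
-- (field names from the documentation; the helper itself is hypothetical).
local time_field = {
  ["Okta (1.0.0)"]               = "published",
  ["Netskope (1.5.0)"]           = "timestamp",
  ["Wiz Issue (1.0.0-rc3)"]      = "updatedAt",
  ["Cisco Duo (1.0.0)"]          = "isotimestamp",
  ["AWS CloudTrail (1.0.0-rc3)"] = "eventTime",
}

-- Proofpoint (1.0.0) picks the field by event type.
local function ingestion_time(serializer, event)
  if serializer == "Proofpoint (1.0.0)" then
    return event.clickTime or event.messageTime
  end
  local field = time_field[serializer]
  return field and event[field] or nil
end

print(ingestion_time("Okta (1.0.0)", { published = "2024-05-01T12:00:00Z" }))
```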
Lua Script Customization
Overview
Each OCSF serializer has different Lua scripts for field mapping that are specific to the data source format. These scripts handle the transformation from the native log format to OCSF schema.
Key Features
Source-Specific Mapping: Each serializer (Netskope, Okta, Wiz, Proofpoint, Cisco Duo, CloudTrail) has its own Lua script tailored to map that source's fields to OCSF
Fully Customizable: Users can modify the Lua scripts to add custom field mappings
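As an example of the kind of customization described above, the sketch below adds business context to each event. The enrich() function and the field names (metadata.labels, actor.email) are assumptions for illustration, not fields guaranteed by any particular serializer.

```lua
-- Hypothetical custom enrichment added to a serializer's Lua script.
local function enrich(event)
  -- Attach business context as metadata labels (field names illustrative).
  event.metadata = event.metadata or {}
  event.metadata.labels = { "production", "security-team" }
  -- Derive a custom field from an existing one: extract the email domain.
  if event.actor and event.actor.email then
    event.actor.domain = event.actor.email:match("@(.+)$")
  end
  return event
end

local e = enrich({ actor = { email = "alice@example.com" } })
print(e.actor.domain)  --> example.com
```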
Testing Transformations
Edit Mode Testing
Transformations can be tested with sample data in edit mode before deployment:
Navigate to the transform configuration in edit mode
Provide sample event data that matches your source format
Execute the transformation to see the OCSF-formatted output
Validate that all required fields are properly mapped
Verify custom field mappings work as expected
Review the output for any errors or warnings
Best Practices for Testing
Use real production-like sample data
Test edge cases (missing fields, null values, unexpected formats)
Validate timestamp conversions
Verify severity mappings (if applicable)
Check nested object transformations
Ensure custom enrichments function correctly
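The edge cases above (missing fields, null values) are the most common source of mapping bugs. One defensive pattern in Lua is to guard every nested lookup; the safe_get() helper below is a sketch with hypothetical field names, not part of the product.

```lua
-- Defensive nested field access for edge-case testing (hypothetical helper).
local function safe_get(event, ...)
  local value = event
  for _, key in ipairs({ ... }) do
    if type(value) ~= "table" then return nil end
    value = value[key]
  end
  return value
end

-- Missing fields and nil values resolve to nil instead of raising errors.
print(safe_get({ user = { name = "bob" } }, "user", "name"))  --> bob
print(safe_get({}, "user", "name"))                           --> nil
```

Using a helper like this in test runs makes it easy to confirm that a serializer script degrades gracefully when a source omits a field rather than failing mid-pipeline.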
Implementation Workflow
Step 1: Access the Pipelines Section
Log into your Observo account.
Navigate to the “Pipelines” section from the left-hand panel.
Step 2: Create a New Pipeline
Click on “Create Pipeline”.
From the dropdown, select the source of your logs (e.g., S3 bucket).
In the Destination section, select the AWS Security Lake destination previously configured.
Step 3: Add OCSF Serializer and Other Transforms
Click on the + sign to add transforms.
Select and add transforms to optimize and enrich your data.
Add the OCSF Serializer to format your logs into OCSF standard.
Select the Appropriate Serializer
Choose the serializer that matches your data source
Check the serializer version number for compatibility with your source
Configure Time and Severity Mappings
Verify the time attribute matches your data source's timestamp field
Configure severity mapping if applicable to your use case
Test with Sample Data
Use edit mode to test with representative sample data
Validate all mappings work correctly
Check for any errors or warnings
Set Parallelism
Configure the number of parallel Lua runtime instances
Step 4: Deploy the Pipeline
After configuring the pipeline (source, transforms, OCSF serializer, and destination), click “Deploy Pipeline”.
Monitor logs and statuses to ensure data flows without issues.
Security
Access to system resources is restricted
Scripts cannot access external networks or file systems
All custom code should be reviewed for security best practices
Resources
For additional assistance with OCSF Serializer configuration:
Refer to the OCSF Schema documentation at ocsf.io