Sentiment Analyzer

The Sentiment Analyzer extracts sentiment from incoming log events and attaches it as tags.

Purpose

Use the Sentiment Analyzer to automatically extract and quantify sentiment in security log data, distinguishing normal, secure activity from anomalous or potentially harmful events. This helps security teams prioritize critical alerts, reduce alert fatigue, and gain deeper contextual insight for a more efficient, proactive incident response.

Usage

Select the Sentiment Analyzer transform, then add a Name (required) and a Description (optional).

Config: Enabled: Defaults to disabled, meaning the transform does not evaluate any events. Toggle Enabled on to process events and feed data to the downstream Transforms.

Use logpath already set for log locations: Enabled by default, so the transform uses the default log path along with any additional raw log locations added via the Add button. Disable this setting to rely solely on the raw log locations specified within the Transform.

Note: The default log path is specified in the Log Payload Paths section of the Log Metadata Enricher Configurations within the Source definition that initiates the pipeline.

Repeatedly click the Add button to include additional Locations for raw log fields.
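The toggle above controls how the default log path and the transform's own locations combine. A minimal sketch of that resolution logic follows; the function and parameter names are hypothetical, not part of the product API.

```python
# Sketch of how the "Use logpath already set" toggle might resolve the
# set of raw log locations the analyzer scans. Names are illustrative.
def effective_locations(use_default_logpath, default_logpaths, transform_locations):
    """Return the raw log locations the analyzer would scan."""
    if use_default_logpath:
        # Default log path plus any locations added via the Add button.
        return list(default_logpaths) + list(transform_locations)
    # Rely solely on the locations specified within the Transform.
    return list(transform_locations)
```

With the toggle on, `effective_locations(True, ["message"], ["severity"])` yields both `message` and `severity`; with it off, only `severity` is scanned.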

Advanced:

Field for populating sentiment: The field name to populate with the sentiment value, either positive or negative. Default: sentiment.

Negative Words: Words that signal negative sentiment.

Repeatedly click the Add button to include additional Negative Words.

Default Negative Words

abnormal

abort

broken

...

unstable

Negative Regexes: Regular expressions that signal negative sentiment.

Repeatedly click the Add button to include additional Negative Regexes.

Default Negative Regexes

time[\s_-]*out

timed[\s_-]*out

(?i)bad[\s_-]*gateway

...

(?i)too[\s_-]*many
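The word and regex lists above drive the classification: an event matching any negative word or regex is tagged negative; otherwise it is tagged positive. A minimal sketch of that logic, using a small subset of the defaults shown above (the function name and event shape are assumptions, not the product's internals):

```python
import re

# Hypothetical subset of the default negative words and regexes listed above.
NEGATIVE_WORDS = {"abnormal", "abort", "broken", "unstable"}
NEGATIVE_REGEXES = [
    re.compile(r"time[\s_-]*out"),
    re.compile(r"timed[\s_-]*out"),
    re.compile(r"(?i)bad[\s_-]*gateway"),
    re.compile(r"(?i)too[\s_-]*many"),
]

def tag_sentiment(event, locations=("message",), field="sentiment"):
    """Populate `field` with 'negative' or 'positive' based on raw log content."""
    for location in locations:
        text = str(event.get(location, ""))
        words = set(text.lower().split())
        if words & NEGATIVE_WORDS or any(rx.search(text) for rx in NEGATIVE_REGEXES):
            event[field] = "negative"
            return event
    event[field] = "positive"
    return event
```

For example, a message like "upstream timed out" matches a negative regex and is tagged negative, while "login successful" matches nothing and is tagged positive.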

Examples

Add Additional Locations for Raw Logs

Scenario: Add additional locations for raw logs

The Observo AI Pattern Extractor transforms raw log data into actionable insights, starting from the configuration defined at each Source. Incoming log data is first enriched with contextual metadata (such as timestamps, host details, and event identifiers) based on flexible configurations. This added context sets the stage for precise pattern identification.

This section walks through an example to illustrate how the Pattern Extractor, leveraging memory-efficient algorithms, processes enriched logs in real time to identify recurring patterns, group similar events, and apply sentiment analysis.

Log Metadata Enricher Configurations > Log Payload Paths: message

Pattern Extractor Enricher > Pattern Extractor Configs: sentiment, appname

Sentiment Analyzer > Config

Enabled: Toggled on

Use logpath already set for log locations: Toggled on

Locations for raw log: message


In this example, Log Payload Paths in the Log Metadata Enricher Configurations is set to the parsed message field, while Pattern Extractor Configs in the Pattern Extractor Enricher is configured with sentiment and appname. The Sentiment Analyzer is enabled, with the "Use logpath already set for log locations" option turned on, and the Location for raw logs is set to message.

Within the Sentiment Analyzer transform, the severity log field is added to the Locations for raw log entry.

Sentiment Analyzer > Config

Enabled: Toggled on

Use logpath already set for log locations: Toggled on

Locations for raw log: severity (Use Add button)

Results: The Pattern Extractor adds sentiment tagging based on content in the message or severity fields.
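Conceptually, adding severity as a second location means an event is tagged negative if either field matches. The sketch below illustrates this with a hypothetical negative-signal pattern; keywords like "critical" and "error" stand in for severity values and are not part of the documented default word list.

```python
import re

# Hypothetical negative-signal pattern for illustration only; these keywords
# are stand-ins and do not reflect the product's default negative word list.
NEGATIVE = re.compile(r"(?i)\b(abnormal|abort|broken|unstable|critical|error)\b")

def analyze(event, locations=("message", "severity")):
    """Tag the event negative if any configured location matches."""
    hit = any(NEGATIVE.search(str(event.get(loc, ""))) for loc in locations)
    event["sentiment"] = "negative" if hit else "positive"
    return event
```

With both locations configured, even a benign message is tagged negative when the severity field contains a matching keyword such as "critical".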

These results are reflected in the Data Insights Dashboard. See the Data Insights Dashboard subsection within Analytics for more details about each panel.

The resulting output is reflected in the accompanying panels:

  • Log Data Summary By Key: Organizes log data based on specific keys for structured analysis.

  • Tags Trends for Patterns: Tracks trends in tagged log data to identify recurring behaviors.

  • Patterns Trend: Analyzes recurring log patterns over time to detect operational shifts.

Best Practices

Based on Observo AI’s approach, best practices for sentiment analysis on log data focus on ensuring the data itself is high-quality and richly annotated:

  • High-Fidelity Data Collection: Retain the full detail of log entries (such as timestamps, IP addresses, event types) to preserve context and support both historical analysis and real-time alerting.

  • Structured and Consistent Data Formatting: Standardize log formats and enforce consistent schema across all sources so that each entry can be reliably tokenized and analyzed.

  • Domain-Specific Vocabulary and Lexicons: Curate and regularly update sentiment lexicons tailored to security logs that distinguish routine operations (like “login successful”) from anomalous events (such as “failed login attempts”).

  • Contextual Metadata Enrichment: Enhance raw log data with additional metadata (such as error codes, user IDs, and severity levels) to provide deeper insight into the sentiment behind each event.

  • Regular Data Quality Checks: Implement processes to detect and clean noisy or incomplete log entries, ensuring that sentiment analysis is performed on accurate, relevant data.
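The data-quality practice above can be enforced with a simple pre-filter that drops incomplete entries before sentiment analysis runs. A minimal sketch, assuming a hypothetical required-field schema (the field names are illustrative only):

```python
# Hypothetical minimal schema; field names are illustrative only.
REQUIRED_FIELDS = ("timestamp", "host", "message")

def is_well_formed(event):
    """Keep only entries with every required field present and non-empty."""
    return all(event.get(field) not in (None, "") for field in REQUIRED_FIELDS)

events = [
    {"timestamp": "2024-01-01T00:00:00Z", "host": "web-1", "message": "login successful"},
    {"host": "web-2", "message": "timed out"},  # missing timestamp: filtered out
]
clean_events = [e for e in events if is_well_formed(e)]
```

Filtering this way ensures sentiment tags are only attached to entries that carry enough context to be meaningful.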

By focusing on these data-centric practices, organizations can improve the accuracy and reliability of sentiment analysis, enabling security and DevOps teams to quickly identify and act on critical events.
