Notifier

This API Reference is based on DeGirum Tools version 0.16.6.

Notification Analyzer Module Overview

This module provides tools for generating and delivering notifications based on AI inference events. It implements the EventNotifier analyzer, which triggers notifications and optionally saves video clips when events occur.

Key Features

  • Event-Based Triggers: Generates notifications when user-defined event conditions are met

  • Message Formatting: Supports Python format strings for dynamic notification content

  • Holdoff Control: Configurable time/frame windows to suppress repeat notifications

  • Video Clip Saving: Optional video clip saving with local or cloud storage

  • Visual Overlay: Annotates active notification status on images

  • File Management: Handles temporary file cleanup and storage integration

Typical Usage

  1. Configure the notification service: for external services, pass an Apprise URL or config file; for console output, pass "json://console" as notification_config

  2. Define event conditions and create EventNotifier instances

  3. Process inference results through the notifier chain

  4. Notifications are sent when conditions are met
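The steps above might look like the following sketch. It assumes degirum_tools is installed and that `model` is a loaded PySDK model with an EventDetector already attached (its configuration is omitted here); `model`, `frames`, and the event name "person_detected" are illustrative assumptions, not part of this reference.

```python
from degirum_tools import EventNotifier

# Steps 1-2: console notifications, triggered by an event named
# "person_detected" that an upstream EventDetector is assumed to raise.
notifier = EventNotifier(
    name="person_alert",
    condition="person_detected",           # event name from the EventDetector
    message="Person seen: {result}",       # Python format string
    holdoff=(5.0, "seconds"),              # suppress repeats for 5 seconds
    notification_config="json://console",  # print notifications to stdout
)

# Steps 3-4: run inference results through the analyzer chain.
for result in model.predict_batch(frames):  # model and frames assumed to exist
    notifier.analyze(result)                # fills result.notifications when triggered
notifier.finalize()                         # flush clips, stop notification service
```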

Integration Notes

  • Requires EventDetector analyzer in the chain to provide event detection

  • Optional dependencies (e.g., apprise) must be installed for external notification services

  • Storage configuration required for clip saving (supports both local and cloud storage)

  • Supports both frame-based and time-based notification holdoff periods
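The time-based holdoff behavior can be sketched as a small gate; this is an illustration of the concept, not the library's implementation.

```python
class HoldoffGate:
    """Illustrative stand-in for the notifier's time-based holdoff (not library code)."""

    def __init__(self, holdoff_seconds):
        self.holdoff = holdoff_seconds
        self._last_fired = None  # timestamp of the last notification sent

    def should_notify(self, now):
        # Suppress the notification if one already fired within the holdoff window.
        if self._last_fired is not None and now - self._last_fired < self.holdoff:
            return False
        self._last_fired = now
        return True


gate = HoldoffGate(holdoff_seconds=5.0)
print(gate.should_notify(0.0))  # first event always notifies -> True
print(gate.should_notify(3.0))  # within the 5 s window -> False
print(gate.should_notify(6.0))  # window elapsed -> True
```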

Key Classes

  • EventNotifier: Analyzer for triggering notifications based on event conditions

Configuration Options

  • notification_config: Apprise URL or config file for notification service, or "json://console" for stdout output

  • notification_title: Default title for notifications

  • holdoff_frames: Number of frames to wait between notifications

  • holdoff_seconds: Time in seconds to wait between notifications

  • clip_save: Enable/disable video clip saving

  • storage_config: Storage configuration for clip saving (supports local and cloud storage)

  • show_overlay: Enable/disable visual annotations

Example

For local storage configuration:

```python
from degirum_tools import ObjectStorageConfig  # import path assumed from the package

clip_storage_config = ObjectStorageConfig(
    endpoint=".",  # path to a local folder
    access_key="",  # not needed for local storage
    secret_key="",  # not needed for local storage
    bucket="my_bucket_dir",  # subdirectory name for local storage
)
```
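A corresponding cloud configuration might look like the following fragment; the endpoint URL, credentials, and bucket name are all placeholders to be replaced with your S3-compatible service's values.

```python
clip_storage_config = ObjectStorageConfig(
    endpoint="https://s3.example.com",  # S3-compatible endpoint (placeholder)
    access_key="YOUR_ACCESS_KEY",       # placeholder credentials
    secret_key="YOUR_SECRET_KEY",
    bucket="my-clip-bucket",            # bucket that will receive the clips
)
```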

Classes

EventNotifier

Analyzer for event-based notifications.

Works in conjunction with an EventDetector analyzer by examining the events_detected set in the inference results. Generates notifications when user-defined event conditions are met.

Features

  • Message formatting using Python format strings (e.g., {result} for inference results)

  • Holdoff to suppress repeat notifications within a specified time/frame window

  • Optional video clip saving upon notification trigger with local or cloud storage

  • Records triggered notifications in the result object's notifications dictionary

  • Overlay annotation of active notification status on images

Functions

__init__(name, ...)

__init__(name, condition, *, message='', holdoff=0, notification_config=None, notification_tags=None, show_overlay=True, annotation_color=None, annotation_font_scale=None, annotation_pos=AnchorPoint.BOTTOM_LEFT, annotation_cool_down=3.0, clip_save=False, clip_sub_dir='', clip_duration=0, clip_pre_trigger_delay=0, clip_embed_ai_annotations=True, clip_target_fps=30.0, storage_config=None)

Constructor.

Parameters:

| Name | Type | Description | Default |
|------|------|-------------|---------|
| name | str | Name of the notification. | required |
| condition | str | Python expression defining the condition that triggers the notification (references event names from EventDetector). | required |
| message | str | Notification message format string. If empty, "Notification triggered: {name}" is used. | '' |
| holdoff | int \| float \| Tuple[float, str] | Holdoff duration to suppress repeated notifications: an int is interpreted as frames, a float as seconds, and a (value, "seconds"/"frames") tuple uses the specified unit. | 0 |
| notification_config | str | Notification service config file path, Apprise URL, or "json://console" for stdout output. If None, notifications are not sent to any external service. | None |
| notification_tags | str | Tags to attach to notifications for filtering. Separate multiple tags with commas (logical AND) or spaces (logical OR). | None |
| show_overlay | bool | Whether to overlay notification text on images. | True |
| annotation_color | tuple | RGB color for the annotation text background. If None, a color complementary to the result overlay is used. | None |
| annotation_font_scale | float | Font scale for the annotation text. If None, the default model font scale is used. | None |
| annotation_pos | AnchorPoint \| Tuple[int, int] \| List[int] | Position of the annotation text (an AnchorPoint or an (x, y) coordinate). | AnchorPoint.BOTTOM_LEFT |
| annotation_cool_down | float | Time in seconds to display the notification text on the image. | 3.0 |
| clip_save | bool | If True, save a video clip when the notification triggers. | False |
| clip_sub_dir | str | Subdirectory name in the storage bucket for saved clips. | '' |
| clip_duration | int | Length of the saved video clip in frames. If 0, the available frames around the event are used. | 0 |
| clip_pre_trigger_delay | int | Number of frames to include before the trigger event in the saved clip. | 0 |
| clip_embed_ai_annotations | bool | If True, embed AI annotations in the saved clip. | True |
| clip_target_fps | float | Frame rate (FPS) of the saved video clip. | 30.0 |
| storage_config | ObjectStorageConfig | Storage configuration for clip saving. For local storage, set endpoint to a local directory path and bucket to a subdirectory name; for cloud storage, use an S3-compatible endpoint and credentials. If None, clips are only saved locally. | None |

Raises:

| Type | Description |
|------|-------------|
| ValueError | If the holdoff unit is not "seconds" or "frames". |
| ImportError | If required optional packages are not installed. |
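The three accepted holdoff forms, and the ValueError case, can be illustrated with a small helper. This is a sketch of the documented interpretation, not the library's actual parsing code.

```python
def parse_holdoff(holdoff):
    """Map the documented holdoff forms to a (value, unit) pair (illustrative only)."""
    if isinstance(holdoff, tuple):
        value, unit = holdoff
        if unit not in ("seconds", "frames"):
            raise ValueError(f"invalid holdoff unit: {unit!r}")
        return float(value), unit
    if isinstance(holdoff, int):
        return float(holdoff), "frames"  # plain int -> frames
    return float(holdoff), "seconds"     # plain float -> seconds


print(parse_holdoff(10))              # (10.0, 'frames')
print(parse_holdoff(2.5))             # (2.5, 'seconds')
print(parse_holdoff((3, "seconds")))  # (3.0, 'seconds')
```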

analyze(result)

Evaluate the notification condition on the given inference result.

If the condition is satisfied (and not within a holdoff period), generates a notification message and stores it in result.notifications. Optionally saves a video clip when a notification is triggered, and schedules that clip for upload if storage is configured. This method modifies the input result object in-place.

Parameters:

| Name | Type | Description | Default |
|------|------|-------------|---------|
| result | InferenceResults | The inference result to analyze; it should include events detected by an EventDetector. | required |

Returns:

| Type | Description |
|------|-------------|
| None | This method modifies the input result object in-place. |

Raises:

| Type | Description |
|------|-------------|
| AttributeError | If the result does not contain events_detected (EventDetector is not in the chain). |

finalize

finalize()

Finalize and clean up resources.

Waits for all background clip-saving threads to finish, stops the notification server, and removes the temporary clip directory if it was used.

Bases: ResultAnalyzerBase