Overview
We provide the DeGirum Tools Python module to aid development of AI applications with PySDK. In this group, we'll outline main concepts of DeGirum Tools and provide the API Reference Guide.
DeGirum Tools extends PySDK with a kit for building multi-threaded, low-latency media pipelines. Where PySDK focuses on running a single model well, DeGirum Tools focuses on everything around it: video ingest, pre- and post-processing, multi-model fusion, result annotation, stream routing, and more.
In one sentence:
DeGirum Tools is a flow-based mini-framework that lets you prototype complex AI applications in a few dozen lines of Python.
The flow behind DeGirum Tools is powered by the Streams subsystem, which is made up of three Python submodules. The two most important concepts to understand in this subsystem are gizmos and compositions.
A gizmo is a worker that:

- Consumes StreamData from one or more input streams.
- Runs its custom run() loop (decode, resize, infer, etc.).
- Pushes new StreamData to any number of output streams. StreamData is described in more detail below.
Because every gizmo lives in its own thread, pipelines scale across CPU cores with almost no user code.
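The consume/process/push loop described above can be sketched with nothing but the standard library. This is a conceptual illustration of the gizmo pattern, not the actual degirum_tools API: each worker thread pulls items from an input queue, transforms them, and forwards results downstream, propagating a poison-pill sentinel to shut down.

```python
# Conceptual sketch of the gizmo pattern (stdlib only, not degirum_tools):
# each worker thread consumes from an input queue, transforms items, and
# pushes results to an output queue, forwarding a poison pill on shutdown.
import queue
import threading

STOP = object()  # poison-pill sentinel that shuts the pipeline down


def gizmo(work, inp, out):
    """Run `work` on every item from `inp`, forwarding results to `out`."""
    def run():
        while True:
            item = inp.get()
            if item is STOP:          # propagate the sentinel downstream
                out.put(STOP)
                return
            out.put(work(item))
    t = threading.Thread(target=run)
    t.start()
    return t


# Two-stage pipeline, each stage in its own thread.
q_src, q_mid, q_out = queue.Queue(), queue.Queue(), queue.Queue()
threads = [
    gizmo(lambda x: x * 2, q_src, q_mid),  # stand-in for a decode step
    gizmo(lambda x: x + 1, q_mid, q_out),  # stand-in for an inference step
]

for frame in [1, 2, 3]:
    q_src.put(frame)
q_src.put(STOP)

results = []
while (item := q_out.get()) is not STOP:
    results.append(item)
for t in threads:
    t.join()
print(results)  # [3, 5, 7]
```

Because each stage blocks only on its own queue, adding more stages spreads work across threads without changing the calling code, which is the property the real gizmo classes provide out of the box.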
Gizmo families built into DeGirum Tools include:

- Video IO – capture, live preview, archival
- Transform – pre-process frames (letterbox, crop, pad)
- AI Inference – run models, cascade detectors and classifiers
- Post-fusion – merge multi-crop or multi-model outputs
- Utility – collect results in the main thread
A Composition manages the life-cycle of its connected gizmos:

- start() – spawn threads
- stop() – signal abort and join
- wait() – block until completion
- get_bottlenecks() – diagnose dropped-frame hotspots

Use a Composition as a context manager so everything shuts down cleanly even on exceptions.
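The context-manager guarantee is worth seeing concretely. Below is an illustrative stdlib-only sketch (MiniComposition is a hypothetical name, not the real Composition class) showing why wrapping the pipeline in `with` ensures worker threads are signaled and joined even when the body raises:

```python
# Sketch of the context-manager life-cycle pattern (stdlib only, not the
# real Composition class): threads are joined even if the body raises.
import threading


class MiniComposition:
    def __init__(self):
        self.threads = []
        self.abort = threading.Event()

    def add(self, target):
        self.threads.append(
            threading.Thread(target=target, args=(self.abort,))
        )

    def start(self):
        for t in self.threads:
            t.start()

    def stop(self):
        self.abort.set()             # signal abort...
        for t in self.threads:
            t.join()                 # ...and join every worker

    def __enter__(self):
        self.start()
        return self

    def __exit__(self, *exc_info):
        self.stop()                  # runs on normal exit AND on exceptions
        return False                 # do not swallow exceptions


def worker(abort):
    abort.wait()                     # pretend to process frames until aborted


comp = MiniComposition()
comp.add(worker)
with comp:
    pass                             # pipeline runs here
# At this point all threads have been signaled and joined.
```

The same `__exit__`-always-runs mechanics are what make the real Composition safe to use around code that may throw.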
The built-in compound model classes cover patterns such as:

- Running two models in parallel on the same image and concatenating their results.
- Detector → crops → classifier (adds labels back to the detections).
- Detector → crops → refined detector (with optional NMS).
Use compound models exactly as you would use normal models.
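The point of that uniform interface can be shown with a toy stand-in. The class names below are illustrative, not the actual DeGirum Tools classes: a compound model exposes the same predict() / predict_batch() surface as a single model, so calling code never knows how many models sit behind it.

```python
# Hedged sketch (names are illustrative, not real DeGirum Tools classes):
# a compound model presents the same predict()/predict_batch() surface
# as a single model.
class FakeModel:
    def __init__(self, fn):
        self.fn = fn

    def predict(self, x):
        return self.fn(x)


class ParallelCompound:
    """Run two models on the same input and concatenate their results."""

    def __init__(self, a, b):
        self.a, self.b = a, b

    def predict(self, x):
        return self.a.predict(x) + self.b.predict(x)

    def predict_batch(self, xs):
        for x in xs:                 # same iterator-style surface as one model
            yield self.predict(x)


detector = FakeModel(lambda x: [f"box:{x}"])
classifier = FakeModel(lambda x: [f"label:{x}"])
model = ParallelCompound(detector, classifier)

print(model.predict("img0"))         # ['box:img0', 'label:img0']
print(list(model.predict_batch(["img1", "img2"])))
```

Swapping `model` for a single PySDK model (or a deeper cascade) requires no change to the code that consumes results, which is the design goal of the compound model classes.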
Analyzers allow for advanced processing of inference results such as object tracking, line cross counting, in-zone counting, and more.
An analyzer implements up to three methods:

- analyze() – append extra fields, run business logic
- annotate() – draw overlays on the image
- finalize() – clean up resources
Any number of analyzers can be attached to regular PySDK models and to compound models.
When used inside a gizmo pipeline, an analyzer can filter or decorate results in-flight.
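A minimal sketch makes the analyze()/finalize() contract concrete. The class and field names below are hypothetical, stdlib-only stand-ins (not the actual analyzer base class): analyze() decorates a result dict with an extra field, and finalize() releases resources.

```python
# Minimal stdlib sketch of the analyzer idea (hypothetical names, not the
# actual DeGirum Tools analyzer API): analyze() appends extra fields to a
# result; finalize() releases resources.
class ZoneCountAnalyzer:
    def __init__(self, zone):
        self.zone = zone             # (x_min, x_max) span counted as "in zone"

    def analyze(self, result):
        lo, hi = self.zone
        # Decorate the result in place with a derived field.
        result["in_zone"] = sum(lo <= x <= hi for x in result["centers"])
        return result

    def finalize(self):
        pass                         # nothing to clean up in this sketch


analyzer = ZoneCountAnalyzer(zone=(10, 20))
result = {"centers": [5, 12, 18, 40]}
analyzer.analyze(result)
print(result["in_zone"])  # 2
analyzer.finalize()
```

Chaining several such objects over each result, in order, is the essence of attaching multiple analyzers to a model.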
Gizmos pass data around using the Stream class. A stream is an iterable queue that moves StreamData objects between threads. Each queue may be bounded (with an optional drop policy) to prevent bottlenecks, and it automatically propagates a poison-pill sentinel to shut the pipeline down cleanly.
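The bounded-queue-with-drop-policy idea can be sketched in a few lines. This is a stdlib-only illustration (not the real Stream class): when the queue is full, the incoming frame is discarded so a slow consumer never stalls the producer.

```python
# Sketch of a bounded stream with a drop policy (stdlib only, not the real
# Stream class): a full queue drops the incoming frame instead of blocking.
import queue


class DroppingStream:
    def __init__(self, maxsize):
        self.q = queue.Queue(maxsize=maxsize)
        self.dropped = 0

    def put(self, item):
        try:
            self.q.put_nowait(item)
        except queue.Full:
            self.dropped += 1        # drop policy: discard instead of blocking

    def get(self):
        return self.q.get()


s = DroppingStream(maxsize=2)
for frame in range(5):               # fast producer, no consumer yet
    s.put(frame)
print(s.dropped)          # 3 frames dropped
print(s.get(), s.get())   # 0 1  (the frames that fit)
```

Counting drops per stream is also what makes bottleneck diagnostics like get_bottlenecks() possible: the queue that drops the most frames marks the slowest stage.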
A Composition collects any connected gizmos and controls their life-cycle.
Compound models wrap two PySDK models into a single predict() / predict_batch() interface. DeGirum Tools provides several compound model classes.
A subclass can combine its constituent models in any of the patterns listed above.
The Inference Support helpers smooth the edges between PySDK and your application:

- A connection helper – a one-liner to pick AI Hub, AI Server, or local inference.
- Video-loop helpers – quick inference-over-video loops when a full gizmo graph is overkill.
- A profiling helper – benchmark a model in fewer than 10 lines of code.
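To show what "benchmark a model in fewer than 10 lines" amounts to, here is a hedged, plain-Python sketch (time_model is an illustrative name, not a DeGirum Tools function): time a batch of predictions and report throughput.

```python
# Hedged sketch of a tiny model benchmark in plain Python
# (time_model is illustrative, not a DeGirum Tools function).
import time


def time_model(predict, inputs):
    start = time.perf_counter()
    for x in inputs:
        predict(x)                   # run one inference per input
    elapsed = time.perf_counter() - start
    return len(inputs) / elapsed     # throughput in inferences per second


fps = time_model(lambda x: x ** 2, list(range(1000)))
print(f"{fps:.0f} inferences/sec")
```

A real profiling helper would additionally warm the model up and separate pre-/post-processing time, but the core measurement is this simple.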