# Videos

Learn how to run real-time inference on video streams using `predict_stream`. This page covers video files, webcams, and RTSP sources, all with minimal setup.
## Common setup (used in all cases)
```python
import degirum_tools
from degirum_tools import ModelSpec, Display, remote_assets

# Configure & load once
model_spec = ModelSpec(
    model_name="yolov8n_coco--640x640_quant_hailort_multidevice_1",
    zoo_url="degirum/hailo",
    inference_host_address="@local",
    model_properties={"device_type": ["HAILORT/HAILO8", "HAILORT/HAILO8L"]},
)
model = model_spec.load_model()
```

## Video file
```python
video_source = "path/to/video.mp4"  # or use a built-in sample:
# video_source = remote_assets.traffic

with Display("AI Camera — File") as disp:
    for result in degirum_tools.predict_stream(model, video_source):
        disp.show(result.image_overlay)
```

## Webcam
## RTSP stream