Discover Hailo models
Start with precompiled Hailo models that run out of the box—learn how to pick the right variant for your device.
Estimated read time: 3 minutes
Where the model comes from: Hailo Model Zoo
In the First inference guide, we specified the model by:
model_name="yolov8n_coco--640x640_quant_hailort_multidevice_1"
zoo_url="degirum/hailo"
inference_host_address="@local"
model_properties={"device_type": ["HAILORT/HAILO8", "HAILORT/HAILO8L"]}
That zoo_url points to the Hailo Model Zoo, a curated collection of models precompiled for HailoRT and tested to run on Hailo devices. We highlight this zoo because it lets you run examples immediately, without any conversion steps.
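Zoo model names encode their configuration. As an illustrative helper (not a PySDK function), a name like yolov8n_coco--640x640_quant_hailort_multidevice_1 can be split into network, input size, quantization, runtime, compile target, and version:

```python
# Illustrative helper (not part of PySDK): decode a Hailo Model Zoo name.
# Assumed pattern: <network>--<W>x<H>_<quantization>_<runtime>_<target>_<version>
def parse_model_name(model_name: str) -> dict:
    network, rest = model_name.split("--", 1)
    size, quant, runtime, target, version = rest.split("_")
    width, height = (int(v) for v in size.split("x"))
    return {
        "network": network,            # e.g. "yolov8n_coco"
        "input_size": (width, height), # model input resolution
        "quantization": quant,         # "quant" = quantized
        "runtime": runtime,            # "hailort"
        "target": target,              # "hailo8", "hailo8l", or "multidevice"
        "version": int(version),
    }

info = parse_model_name("yolov8n_coco--640x640_quant_hailort_multidevice_1")
print(info["target"], info["input_size"])  # multidevice (640, 640)
```

Reading the name this way makes it easy to see at a glance which device a model was compiled for and what input size it expects.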
Why we start with the Hailo Model Zoo
Instant results: models are precompiled and follow predictable I/O, so examples “just work”
Consistent setup: you focus on model_name, zoo_url, and device_type; everything else stays the same
Easy to swap: change only the model_name to try a different model
Example models
# ImageNet classification model
model_spec = ModelSpec(
model_name="yolov8s_silu_imagenet--224x224_quant_hailort_hailo8l_1",
zoo_url="degirum/hailo",
inference_host_address="@local",
model_properties={"device_type": ["HAILORT/HAILO8", "HAILORT/HAILO8L"]},
)
# COCO object detection model
model_spec = ModelSpec(
model_name="yolov8n_coco--640x640_quant_hailort_hailo8_1",
zoo_url="degirum/hailo",
inference_host_address="@local",
model_properties={"device_type": ["HAILORT/HAILO8", "HAILORT/HAILO8L"]},
)
# Face detection model
model_spec = ModelSpec(
model_name="yolov8n_relu6_face--640x640_quant_hailort_multidevice_1",
zoo_url="degirum/hailo",
inference_host_address="@local",
model_properties={"device_type": ["HAILORT/HAILO8", "HAILORT/HAILO8L"]},
)
Hailo-8 vs. Hailo-8L: picking the right device type
Compatibility rule of thumb
A model compiled for Hailo-8L will run on Hailo-8, but not the other way around.
That's why our starter examples use 8L-compatible or multidevice builds. We also specify device_type as a list—so the same code runs on both boards.
For examples: We prioritize plug-and-play. Your snippet should "just work," whether you use Hailo-8 or Hailo-8L.
For deployments: A model compiled for Hailo-8L will run on Hailo-8, but won't be optimized. For best performance (e.g., FPS, latency, power), use a model compiled specifically for your hardware.
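The rule of thumb above can be sketched as a small helper (illustrative, not a PySDK function) that maps a model's compile target, taken from its name, to the device_type list it can run on:

```python
# Sketch of the compatibility rule (assumption: the second-to-last "_" field
# of a zoo model name is the compile target: "hailo8", "hailo8l", or
# "multidevice"). Hailo-8L and multidevice builds run on both boards;
# Hailo-8 builds run only on Hailo-8.
def allowed_device_types(model_name: str) -> list:
    target = model_name.rsplit("_", 2)[-2]
    if target in ("hailo8l", "multidevice"):
        return ["HAILORT/HAILO8", "HAILORT/HAILO8L"]
    if target == "hailo8":
        return ["HAILORT/HAILO8"]
    raise ValueError(f"unknown compile target in {model_name!r}")

print(allowed_device_types("yolov8n_coco--640x640_quant_hailort_hailo8_1"))
# ['HAILORT/HAILO8']
```

Passing the resulting list as the device_type model property keeps the code portable while still respecting the one-way compatibility rule.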
Example: picking a YOLOv8n variant
Runs on both (portable):
yolov8n_coco--640x640_quant_hailort_multidevice_1
Great for docs and demos: this model works on Hailo-8 and Hailo-8L with the same code.
Optimized for Hailo-8:
yolov8n_coco--640x640_quant_hailort_hailo8_1
Best for production on Hailo-8; this model does not support Hailo-8L.
# Portable demo (both Hailo-8 and Hailo-8L)
model_spec = ModelSpec(
model_name="yolov8n_coco--640x640_quant_hailort_multidevice_1",
zoo_url="degirum/hailo",
inference_host_address="@local",
model_properties={"device_type": ["HAILORT/HAILO8", "HAILORT/HAILO8L"]},
)
# Optimized for Hailo-8 (recommended for real deployments on Hailo-8)
model_spec = ModelSpec(
model_name="yolov8n_coco--640x640_quant_hailort_hailo8_1",
zoo_url="degirum/hailo",
inference_host_address="@local",
model_properties={"device_type": ["HAILORT/HAILO8"]},
)
Bring your own Hailo model (advanced)
When you’re ready, the advanced guide shows how to:
Package your Hailo-compiled artifacts with a PySDK model.json that defines inputs, outputs, preprocessing, and postprocessing.
Point PySDK to your private or local model zoo.
Rescan the zoo as you add models.
Use the same ModelSpec pattern; only the zoo_url and model_name change.
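To illustrate that last point, compare the hosted-zoo spec from this guide with a hypothetical local-zoo spec: only two fields differ (the local path and custom model name below are placeholders, not real artifacts):

```python
# Keyword arguments matching the hosted-zoo ModelSpec examples in this guide.
hosted_kwargs = dict(
    model_name="yolov8n_coco--640x640_quant_hailort_multidevice_1",
    zoo_url="degirum/hailo",
    inference_host_address="@local",
    model_properties={"device_type": ["HAILORT/HAILO8", "HAILORT/HAILO8L"]},
)

# Hypothetical local-zoo spec: only model_name and zoo_url change.
local_kwargs = dict(
    hosted_kwargs,
    model_name="my_model--640x640_quant_hailort_hailo8_1",  # placeholder name
    zoo_url="/path/to/my/zoo",  # placeholder local zoo folder
)

changed = {k for k in hosted_kwargs if hosted_kwargs[k] != local_kwargs[k]}
print(sorted(changed))  # ['model_name', 'zoo_url']
```

Everything else, including the host address and device_type list, carries over unchanged, which is what makes the ModelSpec pattern easy to reuse for your own models.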