Inference setup
PySDK gives you flexibility in where models are stored and where inferences run. This page walks through common setups (cloud, local, and hybrid) so you can choose what fits your workflow.
Pick your scenario
Cloud inference
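For pure cloud inference, both the model zoo and the inference engine live in the DeGirum cloud. The sketch below mirrors the spec pattern used on this page, assuming `"@cloud"` as the host address and that a DeGirum cloud API token is configured (for example via the `DEGIRUM_CLOUD_TOKEN` environment variable); no local accelerator hardware is needed:

```python
from degirum_tools import ModelSpec

# Sketch, assuming "@cloud" as the inference host address; a DeGirum
# cloud API token must be configured (e.g. via the DEGIRUM_CLOUD_TOKEN
# environment variable) for cloud inference to work.
spec = ModelSpec(
    model_name="yolov8n_coco--640x640_quant_hailort_multidevice_1",
    zoo_url="degirum/hailo",          # cloud model zoo
    inference_host_address="@cloud",  # inference executes in the DeGirum cloud
)
model = spec.load_model()
```

This setup is convenient for prototyping: frames are sent to the cloud for inference, so results depend on network latency rather than local hardware.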
Local inference with cloud zoo

In this setup, the model is fetched from the cloud model zoo, but inference runs on your own machine and accelerator:

```python
from degirum_tools import ModelSpec

spec = ModelSpec(
    model_name="yolov8n_coco--640x640_quant_hailort_multidevice_1",
    zoo_url="degirum/hailo",          # cloud model zoo
    inference_host_address="@local",  # inference executes on your machine
    model_properties={"device_type": ["HAILORT/HAILO8L", "HAILORT/HAILO8"]},
)
model = spec.load_model()
```
Local inference with local zoo
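For the fully offline case, both the model files and the inference engine live on your machine. A minimal sketch, assuming the zoo URL may point to a local folder containing downloaded model assets; `"/path/to/local/zoo"` is a hypothetical path used for illustration:

```python
from degirum_tools import ModelSpec

# Sketch: "/path/to/local/zoo" is a hypothetical folder holding
# locally downloaded model assets; no cloud connection is required.
spec = ModelSpec(
    model_name="yolov8n_coco--640x640_quant_hailort_multidevice_1",
    zoo_url="/path/to/local/zoo",     # local model zoo folder
    inference_host_address="@local",  # inference executes on your machine
    model_properties={"device_type": ["HAILORT/HAILO8L", "HAILORT/HAILO8"]},
)
model = spec.load_model()
```

This is the setup to use for air-gapped or low-connectivity deployments, since nothing is fetched at load time.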