Zoo Manager Module
PySDK API Reference Guide. Load, list and authenticate against local, server or cloud zoos.
degirum.zoo_manager.ZooManager
Class that manages a model zoo.
In PySDK terminology, a model zoo is a collection of AI models together with an ML inference engine type and location.
Depending on the deployment location, there are several types of model zoos supported by PySDK:
Local model zoo: Deployed on the local file system of the PySDK installation host. Inferences are performed on the same host using AI accelerators installed on that host.
AI server model zoo: Deployed on remote host with DeGirum AI server running on that host. Inferences are performed by DeGirum AI server on that remote host.
Cloud Platform model zoo: Deployed on DeGirum Cloud Platform. Inferences are performed by DeGirum Cloud Platform servers.
The type of the model zoo is defined by the URL string which you pass as the zoo_url parameter into the constructor.
Zoo manager provides the following functionality:
List and search models available in the connected model zoo.
Create AI model handling objects to perform AI inferences.
Request various AI model parameters.
__init__(inference_host_address, ...)
degirum.zoo_manager.ZooManager.__init__(inference_host_address, zoo_url='', token='')
Constructor.
For the description of the arguments, see degirum.connect.
list_models(*args, ...)
degirum.zoo_manager.ZooManager.list_models(*args, **kwargs)
Get a list of names of AI models available in the connected model zoo which match specified filtering criteria.
Other Parameters:
model_family
str
Model family name filter.
When you pass a string, it will be used as a search substring in the model name, for example "yolo" or "mobilenet". You may also pass a re.Pattern object, in which case a regular expression pattern search is performed.
runtime
str
Runtime agent type – string or list of strings of runtime agent types.
device
str
Target inference device – string or list of strings of device names.
device_type
str
Target inference device(s) – string or list of strings of full device type names in "RUNTIME/DEVICE" format.
precision
str
Model calculation precision - string or list of strings of model precision labels.
Possible labels: "quant", "float".
pruned
str
Model density – string or list of strings of model density labels.
Possible labels: "dense", "pruned".
postprocess_type
str
Model output postprocess type – string or list of strings of postprocess type labels.
For example: "Classification", "Detection", "Segmentation".
Returns:
Union[List[str], Dict[str, ModelParams]]
The list of model name strings matching specified filtering criteria. Use a string from that list as a parameter of degirum.zoo_manager.ZooManager.load_model method.
Example
Find all models of the "yolo" family capable of running either on a CPU or on a DeGirum Orca AI accelerator from all registered model zoos:
yolo_model_list = zoo_manager.list_models("yolo", device=["cpu", "orca"])
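The model_family filter semantics above (substring search for a plain string, pattern search for a re.Pattern) can be sketched in plain Python; the helper name match_family is hypothetical and is not part of PySDK:

```python
import re
from typing import Union

def match_family(model_name: str, model_family: Union[str, re.Pattern]) -> bool:
    """Mimic the model_family filter: substring match for a plain string,
    regular-expression search for a compiled re.Pattern."""
    if isinstance(model_family, re.Pattern):
        return model_family.search(model_name) is not None
    return model_family in model_name

# Substring search: "yolo" matches anywhere in the model name
print(match_family("yolov5s_coco--512x512_quant_n2x_orca1_1", "yolo"))  # True

# Regex search: match only names starting with "yolov5"
pattern = re.compile(r"^yolov5")
print(match_family("yolov8n_coco--640x640_quant", pattern))  # False
```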
load_model(model_name, ...)
degirum.zoo_manager.ZooManager.load_model(model_name, **kwargs)
Create and return the model handling object for given model name.
Parameters:
model_name
str
Model name string identifying the model to load. It should exactly match the model name as it is returned by the degirum.zoo_manager.ZooManager.list_models method.
required
**kwargs
any
You may pass arbitrary model properties to be assigned to the model object in the form property=value.
{}
Returns:
Model handling object. Using this object, you can perform AI inferences with this model and configure various model properties, which define how to do input image preprocessing and inference result post-processing:
Call the degirum.model.Model.predict method to perform AI inference on a single frame. An inference result object is returned.
For more efficient pipelined batch predictions, call the degirum.model.Model.predict_batch or degirum.model.Model.predict_dir methods to perform AI inference on multiple frames.
Configure the following image pre-processing properties:
degirum.model.Model.input_resize_method -- to set input image resize method.
degirum.model.Model.input_pad_method -- to set input image padding method.
degirum.model.Model.input_letterbox_fill_color -- to set letterbox padding color.
degirum.model.Model.image_backend -- to select image processing library.
Configure the following model post-processing properties:
degirum.model.Model.output_confidence_threshold -- to set confidence threshold.
degirum.model.Model.output_nms_threshold -- to set non-max suppression threshold.
degirum.model.Model.output_top_k -- to set top-K limit for classification models.
degirum.model.Model.output_pose_threshold -- to set pose detection threshold for pose detection models.
Configure the following overlay image generation properties:
degirum.model.Model.overlay_color -- to set color for inference results drawing on overlay image.
degirum.model.Model.overlay_line_width -- to set line width for inference results drawing on overlay image.
degirum.model.Model.overlay_show_labels -- to set flag to enable/disable drawing class labels on overlay image.
degirum.model.Model.overlay_show_probabilities -- to set flag to enable/disable drawing class probabilities on overlay image.
degirum.model.Model.overlay_alpha -- to set alpha-blend weight for inference results drawing on overlay image.
degirum.model.Model.overlay_font_scale -- to set font scale for inference results drawing on overlay image.
The inference result object degirum.postprocessor.InferenceResults returned by the degirum.model.Model.predict method allows you to access AI inference results:
Use the degirum.postprocessor.InferenceResults.image property to access the original image.
Use degirum.postprocessor.InferenceResults.image_overlay property to access image with inference results drawn on top of it.
Use degirum.postprocessor.InferenceResults.results property to access the list of numeric inference results.
model_info(model_name)
degirum.zoo_manager.ZooManager.model_info(model_name)
Request model parameters for given model name.
Parameters:
model_name
str
Model name string. It should exactly match the model name as it is returned by the degirum.zoo_manager.ZooManager.list_models method.
required
Returns:
ModelParams
Model parameter object which provides read-only access to all model parameters.
supported_device_types
degirum.zoo_manager.ZooManager.supported_device_types()
Get runtime/device type names, which are available in the inference system.
Returns:
list
List of runtime/device type names; each element is a string in the "RUNTIME/DEVICE" format.
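Since each returned string encodes both the runtime agent and the device name, splitting on "/" recovers the two parts. The sample values below are illustrative placeholders, not the guaranteed output of any particular installation:

```python
# Illustrative sample of what supported_device_types() may return;
# actual values depend on the installed runtimes and accelerators.
device_types = ["N2X/ORCA1", "N2X/CPU", "OPENVINO/CPU"]

# Each element is "RUNTIME/DEVICE": split once to get the two parts
for entry in device_types:
    runtime, device = entry.split("/", 1)
    print(f"runtime={runtime}, device={device}")

# Collect the distinct runtime agent types
runtimes = sorted({entry.split("/", 1)[0] for entry in device_types})
print(runtimes)  # ['N2X', 'OPENVINO']
```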
system_info(update=False)
degirum.zoo_manager.ZooManager.system_info(update=False)
Return the host system information dictionary.
Parameters:
update
bool
Force an update of the system information; otherwise, it is taken from the cache.
False
Returns:
dict
Host system information dictionary. Format: {"Devices": {"<runtime>/<device>": {<device_info>}, ...}, ["Software Version": "<version>"]}