Zoo Manager Module

PySDK API Reference Guide. Load, list and authenticate against local, server or cloud zoos.

This API Reference is based on PySDK 0.17.0.

degirum.zoo_manager.ZooManager

Class that manages a model zoo.

In PySDK terminology, a model zoo is a collection of AI models that also defines an ML inference engine type and location.

Depending on the deployment location, there are several types of model zoos supported by PySDK:

  • Local model zoo: Deployed on the local file system of the PySDK installation host. Inferences are performed on the same host using AI accelerators installed on that host.

  • AI server model zoo: Deployed on remote host with DeGirum AI server running on that host. Inferences are performed by DeGirum AI server on that remote host.

  • Cloud Platform model zoo: Deployed on DeGirum Cloud Platform. Inferences are performed by DeGirum Cloud Platform servers.

The type of the model zoo is determined by the URL string you pass as the zoo_url parameter to the constructor.

Zoo manager provides the following functionality:

  • List and search models available in the connected model zoo.

  • Create AI model handling objects to perform AI inferences.

  • Request various AI model parameters.

__init__(inference_host_address, ...)

degirum.zoo_manager.ZooManager.__init__(inference_host_address, zoo_url='', token='')

Constructor.

Typically, you never construct ZooManager objects yourself; instead, you call the degirum.connect function, which creates ZooManager instances for you.

For the description of the arguments, see degirum.connect.

list_models(*args, ...)

degirum.zoo_manager.ZooManager.list_models(*args, **kwargs)

Get a list of names of AI models available in the connected model zoo that match the specified filtering criteria.

Other Parameters:

  • model_family (str or re.Pattern): Model family name filter.

      • When you pass a string, it is used as a search substring in the model name, for example "yolo" or "mobilenet".

      • When you pass a re.Pattern object, a regular expression pattern search is performed instead.

  • runtime (str or list of str): Runtime agent type, or a list of runtime agent types.

  • device (str or list of str): Target inference device name, or a list of device names.

  • device_type (str or list of str): Target inference device type(s), as full device type names in "RUNTIME/DEVICE" format.

  • precision (str or list of str): Model calculation precision label(s). Possible labels: "quant", "float".

  • pruned (str or list of str): Model density label(s). Possible labels: "dense", "pruned".

  • postprocess_type (str or list of str): Model output postprocess type label(s). For example: "Classification", "Detection", "Segmentation".

Returns:

  • Union[List[str], Dict[str, ModelParams]]: The list of model name strings matching the specified filtering criteria. Use a string from that list as a parameter of the degirum.zoo_manager.ZooManager.load_model method.

Example

Find all models of the "yolo" family capable of running either on the CPU or on the DeGirum Orca AI accelerator, across all registered model zoos:

    yolo_model_list = zoo_manager.list_models("yolo", device=["cpu", "orca"])
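As described above, the model_family filter accepts either a plain substring or a compiled re.Pattern. The sketch below illustrates those two matching semantics with the standard re module; it is an illustration of the documented behavior, not PySDK internals, and the model names are made up:

```python
import re
from typing import List, Union

def matches_family(model_name: str, model_family: Union[str, re.Pattern]) -> bool:
    """Illustrate the model_family filter: substring match for str,
    regular-expression search for re.Pattern."""
    if isinstance(model_family, re.Pattern):
        return model_family.search(model_name) is not None
    return model_family in model_name

# Hypothetical model names, for illustration only
names: List[str] = [
    "yolo_v5s_coco--512x512_quant_n2x_orca1_1",
    "mobilenet_v2_imagenet--224x224_float_openvino_cpu_1",
]

print([n for n in names if matches_family(n, "yolo")])
print([n for n in names if matches_family(n, re.compile(r"mobilenet_v\d"))])
```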

load_model(model_name, ...)

degirum.zoo_manager.ZooManager.load_model(model_name, **kwargs)

Create and return the model handling object for the given model name.

Parameters:

  • model_name (str, required): Model name string identifying the model to load. It should exactly match a model name as returned by the degirum.zoo_manager.ZooManager.list_models method.

  • **kwargs (any, optional): Arbitrary model properties to be assigned to the model object, in the form property=value.

Returns:

  • Model: Model handling object. Use this object to perform AI inferences on the model and to configure various model properties, which define how input image preprocessing and inference result post-processing are done.

The inference result object degirum.postprocessor.InferenceResults returned by the degirum.model.Model.predict method gives you access to the AI inference results.
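The property=value keyword arguments are assigned as attributes of the returned model object. The toy sketch below illustrates that assignment pattern; the DummyModel class and its property names are hypothetical stand-ins, not part of PySDK:

```python
class DummyModel:
    """Hypothetical stand-in for a model handling object."""
    def __init__(self):
        # Example default property values (illustrative only)
        self.input_pad_method = "letterbox"
        self.output_confidence_threshold = 0.5

def load_model_sketch(**kwargs):
    """Sketch of how property=value kwargs map onto model properties."""
    model = DummyModel()
    for prop, value in kwargs.items():
        if not hasattr(model, prop):
            raise AttributeError(f"unknown model property: {prop}")
        setattr(model, prop, value)
    return model

model = load_model_sketch(output_confidence_threshold=0.8)
print(model.output_confidence_threshold)  # 0.8
```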

model_info(model_name)

degirum.zoo_manager.ZooManager.model_info(model_name)

Request model parameters for the given model name.

Parameters:

  • model_name (str, required): Model name string. It should exactly match a model name as returned by the degirum.zoo_manager.ZooManager.list_models method.

Returns:

  • ModelParams: Model parameter object providing read-only access to all model parameters.

You cannot modify actual model parameters through this object: any changes made to the model parameter object returned by this method are not applied to the real model. To change parameters of a particular model instance on the fly, use the properties of the model handling object returned by the degirum.zoo_manager.ZooManager.load_model method.

supported_device_types

degirum.zoo_manager.ZooManager.supported_device_types()

Get the runtime/device type names available in the inference system.

Returns:

  • list: List of runtime/device type names; each element is a string in the "RUNTIME/DEVICE" format.
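Since each returned entry follows the "RUNTIME/DEVICE" format, the list can be split into a runtime-to-devices mapping. A small sketch of that parsing, where the device type strings are examples of the format rather than a guaranteed inventory:

```python
from collections import defaultdict
from typing import Dict, List

def group_by_runtime(device_types: List[str]) -> Dict[str, List[str]]:
    """Split "RUNTIME/DEVICE" strings and group device names by runtime."""
    grouped: Dict[str, List[str]] = defaultdict(list)
    for entry in device_types:
        runtime, device = entry.split("/", 1)
        grouped[runtime].append(device)
    return dict(grouped)

# Example entries in the documented "RUNTIME/DEVICE" format
print(group_by_runtime(["N2X/ORCA1", "N2X/CPU", "OPENVINO/CPU"]))
# {'N2X': ['ORCA1', 'CPU'], 'OPENVINO': ['CPU']}
```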

system_info(update=False)

degirum.zoo_manager.ZooManager.system_info(update=False)

Return the host system information dictionary.

Parameters:

  • update (bool, default False): When True, force an update of the system information; otherwise take it from the cache.

Returns:

  • dict: Host system information dictionary. Format: {"Devices": {"<runtime>/<device>": {<device_info>}, ...}, ["Software Version": "<version>"]}
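A dictionary in the documented format can be traversed to enumerate the available device types. In the sketch below, the sample dictionary merely follows the documented shape; its keys and values inside <device_info> are fabricated for illustration:

```python
from typing import Dict, List

def list_device_types(info: Dict) -> List[str]:
    """Collect the "<runtime>/<device>" keys from a system_info-style dict."""
    return sorted(info.get("Devices", {}).keys())

# Sample dictionary following the documented format (contents are made up)
sample_info = {
    "Devices": {
        "N2X/ORCA1": {"@Index": 0},
        "OPENVINO/CPU": {"@Index": 0},
    },
    "Software Version": "0.17.0",
}

print(list_device_types(sample_info))
# ['N2X/ORCA1', 'OPENVINO/CPU']
```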
