Zoo Manager Module

degirum.zoo_manager.ZooManager

Class that manages a model zoo.

A model zoo, in PySDK terminology, is a collection of AI models that also defines the ML inference engine type and location.

Depending on the deployment location, there are several types of model zoos supported by PySDK:

  • Local model zoo: Deployed on the local file system of the PySDK installation host. Inferences are performed on the same host using AI accelerators installed on that host.
  • AI server model zoo: Deployed on a remote host running the DeGirum AI server. Inferences are performed by the DeGirum AI server on that remote host.
  • Cloud Platform model zoo: Deployed on DeGirum Cloud Platform. Inferences are performed by DeGirum Cloud Platform servers.

The type of the model zoo is defined by the URL string you pass as the zoo_url parameter to the constructor.

Zoo manager provides the following functionality:

  • List and search models available in the connected model zoo.
  • Create AI model handling objects to perform AI inferences.
  • Request various AI model parameters.

degirum.zoo_manager.ZooManager.__init__(zoo_url=None, token='')

Constructor.

Note

Typically, you never construct ZooManager objects yourself -- instead you call the degirum.connect function to create ZooManager instances for you.

Parameters:

  • zoo_url (Union[None, str, Tuple[str, str]], default None): URL string or tuple which defines the model zoo to connect to and the inference engine to operate with.
  • token (str, default ''): Security token string passed to the model zoo manager for authentication and authorization.

The ZooManager hierarchy of classes serves a dual purpose (which can be a source of confusion):

  • A particular ZooManager implementation selects which engine performs the inference (Cloud Platform, AI Server, or local hardware).
  • It also selects the model zoo to take models from (cloud zoo, AI server local zoo, or a single model file).

Note 1

For DeGirum Cloud Platform connections you need a cloud API access token. To obtain this token, open a user account on the DeGirum Cloud Platform, log in, and go to the token generation page to generate an API access token.

Note 2

When dealing with cloud model zoos, you specify the cloud zoo URL in the following format: <cloud server URL>[/<zoo URL>]. The <cloud server URL> part is the cloud platform root URL, typically cs.degirum.com. The optional <zoo URL> part is the cloud zoo URL in the form <organization>/<model zoo name>. You can confirm the zoo URL by visiting your cloud user account and opening the model zoo management page. If <zoo URL> is not specified, the DeGirum public model zoo is used.

The following use cases are supported:

  1. You want to perform cloud inferences and take models from some cloud model zoo.

    In this case you specify the zoo_url parameter as "dgcps://<cloud server URL>[/<zoo URL>]". The dgcps:// prefix indicates that you want to use cloud inference. It is followed by the cloud zoo URL in the format described in Note 2 above. You also set the token parameter to your API access token.

  2. You want to perform inferences on some AI server and take models from its local model zoo.

    In this case you specify the zoo_url parameter as the hostname or IP address of the AI server machine you want to connect to. As a client of the AI server, you have no control over which models it serves: once the AI server model zoo is deployed, it cannot be changed from the client side unless the AI server administrator explicitly updates it. The token parameter is not needed in this use case.

  3. You want to perform inferences on some AI server and take models from some cloud model zoo.

    In this case you specify the zoo_url parameter as a tuple. The first element of the tuple contains the hostname or IP address of the AI server machine you want to connect to. The second element contains the cloud model zoo URL in the "https://<cloud server URL>[/<zoo URL>]" format described in Note 2 above. You also set the token parameter to your API access token.

  4. You want to perform inferences on local AI hardware and take models from some cloud model zoo.

    In this case you specify the zoo_url parameter as "https://<cloud server URL>[/<zoo URL>]". The https:// prefix indicates that you want to use local (not cloud) inference. It is followed by the cloud zoo URL in the format described in Note 2 above. You also set the token parameter to your API access token.

  5. You want to perform inferences on local AI hardware and use particular model from your local drive.

    In this case you specify the zoo_url parameter as the path to the model's .json configuration file. This option is mostly used for testing and debugging new models during development, before they are released in any model zoo. The token parameter is not needed in this use case.
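The five use cases above can be sketched as zoo_url values. All hostnames, organization and zoo names, paths, and the token below are placeholder assumptions, not real values; the commented-out call shows how a value would be passed to degirum.connect:

```python
# Placeholder security token (required for use cases 1, 3, and 4):
token = "<your cloud API access token>"

# 1. Cloud inference + cloud zoo: "dgcps://" prefix
cloud_zoo_url = "dgcps://cs.degirum.com/myorg/myzoo"

# 2. AI server inference + AI server local zoo: plain hostname or IP address
ai_server_url = "192.168.0.10"

# 3. AI server inference + cloud zoo: (host, cloud zoo URL) tuple
server_with_cloud_zoo = ("192.168.0.10", "https://cs.degirum.com/myorg/myzoo")

# 4. Local inference + cloud zoo: "https://" prefix
local_with_cloud_zoo = "https://cs.degirum.com/myorg/myzoo"

# 5. Local inference + single local model: path to the model .json file
local_model_path = "/path/to/model.json"

# Each value is then passed as the zoo_url parameter, e.g.:
# import degirum as dg
# zoo = dg.connect(cloud_zoo_url, token)
```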

When connecting to a model zoo, the list of AI models is requested and then stored inside the ZooManager object.

degirum.zoo_manager.ZooManager.list_models(*args, **kwargs)

Get a list of names of AI models available in the connected model zoo which match specified filtering criteria.

Other Parameters:

  • model_family (str): Model family name filter.
      • When you pass a string, it is used as a search substring in the model name, for example "yolo" or "mobilenet".
      • You may also pass a re.Pattern object; in this case a regular expression pattern search is performed.
  • device (str): Target inference device; a string or a list of strings of device names. Possible names: "orca", "orca1", "cpu", "gpu", "edgetpu", "dla", "dla_fallback", "myriad".
  • precision (str): Model calculation precision; a string or a list of strings of model precision labels. Possible labels: "quant", "float".
  • pruned (str): Model density; a string or a list of strings of model density labels. Possible labels: "dense", "pruned".
  • runtime (str): Runtime agent type; a string or a list of strings of runtime agent types. Possible types: "n2x", "tflite", "tensorrt", "openvino".

Returns:

  • List[str]: The list of model name strings matching the specified filtering criteria. Use a string from this list as a parameter of the degirum.zoo_manager.ZooManager.load_model method.

Example

Find all models of the "yolo" family capable of running either on CPU or on the DeGirum Orca AI accelerator, from all registered model zoos:

    yolo_model_list = zoo_manager.list_models("yolo", device=["cpu", "orca"])
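To illustrate the difference between the substring and re.Pattern styles of the model_family filter, the sketch below applies both to a hypothetical list of model names. The names are made-up examples, and the real filtering is performed by list_models itself, so this is only an analogy:

```python
import re

# Hypothetical model names; a real zoo returns names via zoo_manager.list_models().
names = [
    "yolo_v5s_coco--512x512_quant_n2x_orca1_1",
    "mobilenet_v2_ssd_coco--300x300_quant_n2x_orca1_1",
    "yolo_v5m_coco--640x640_float_openvino_cpu_1",
]

# Substring-style filter, as in list_models("yolo", ...):
substring_matches = [n for n in names if "yolo" in n]

# Regular-expression filter, as in list_models(re.compile(...), ...):
pattern = re.compile(r"yolo_v5[sm]")
regex_matches = [n for n in names if pattern.search(n)]
```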

degirum.zoo_manager.ZooManager.load_model(model_name)

Create and return the model handling object for given model name.

Parameters:

  • model_name (str, required): Model name string identifying the model to load. It must exactly match a model name as returned by the degirum.zoo_manager.ZooManager.list_models method.

Returns:

  • Model: Model handling object. Using this object you can perform AI inferences with the model and configure various model properties, which define how input image pre-processing and inference result post-processing are done. The inference result object degirum.postprocessor.InferenceResults, returned by the degirum.model.Model.predict method, gives you access to the AI inference results.
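A typical load-and-predict flow might look like the sketch below. The model name, image path, and property name are illustrative assumptions, and a zoo_manager created by degirum.connect is assumed:

```python
# Assumes: zoo_manager = degirum.connect(...)  -- see the constructor notes above.
# The model name and image path below are placeholders.
model = zoo_manager.load_model("yolo_v5s_coco--512x512_quant_n2x_orca1_1")

# Model properties can be adjusted before inference; the property name below
# is an assumption for illustration only:
model.output_confidence_threshold = 0.5

# predict returns a degirum.postprocessor.InferenceResults object:
result = model.predict("path/to/image.jpg")
print(result)
```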

degirum.zoo_manager.ZooManager.model_info(model_name)

Request model parameters for given model name.

Parameters:

  • model_name (str, required): Model name string. It must exactly match a model name as returned by the degirum.zoo_manager.ZooManager.list_models method.

Returns:

  • ModelParams: Model parameter object which provides read-only access to all model parameters.

Note

You cannot modify actual model parameters -- any changes made to the model parameter object returned by this method are not applied to the real model. To change parameters of a particular model instance on the fly, use the properties of the model handling object returned by the degirum.zoo_manager.ZooManager.load_model method.
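The distinction can be sketched as follows. The model name and property name are illustrative assumptions, and a connected zoo_manager is assumed:

```python
# Read-only inspection: mutating this object does NOT affect the real model.
params = zoo_manager.model_info("some_model_name")
print(params)  # read-only view of all model parameters

# To actually change behavior, set properties on a loaded model instance instead:
model = zoo_manager.load_model("some_model_name")
model.output_confidence_threshold = 0.5  # applied to this model instance only
```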