Zoo Manager Module
degirum.zoo_manager.ZooManager
Class that manages a model zoo.
In PySDK terminology, a model zoo is a collection of AI models together with an ML inference engine type and location.
Depending on the deployment location, there are several types of model zoos supported by PySDK:
Local model zoo: Deployed on the local file system of the PySDK installation host. Inferences are performed on the same host using AI accelerators installed on that host.
AI server model zoo: Deployed on a remote host running the DeGirum AI server. Inferences are performed by the DeGirum AI server on that remote host.
Cloud Platform model zoo: Deployed on the DeGirum Cloud Platform. Inferences are performed by DeGirum Cloud Platform servers.
The type of the model zoo is defined by the URL string you pass as the zoo_url parameter to the constructor.
Zoo manager provides the following functionality:
List and search models available in the connected model zoo.
Create AI model handling objects to perform AI inferences.
Request various AI model parameters.
degirum.zoo_manager.ZooManager._CLOUD = '@cloud'
Cloud inference designator.
degirum.zoo_manager.ZooManager._LOCAL = '@local'
Local inference designator.
degirum.zoo_manager.ZooManager._default_cloud_server = _CloudZooAccessorBase._default_cloud_server
DeGirum cloud server hostname.
degirum.zoo_manager.ZooManager._default_cloud_url = _CloudZooAccessorBase._default_cloud_url
DeGirum public zoo URL. You can freely use all models available in this public model zoo.
degirum.zoo_manager.ZooManager._default_cloud_zoo = _CloudZooAccessorBase._default_cloud_zoo
DeGirum public zoo name. You can freely use all models available in this public model zoo.
degirum.zoo_manager.ZooManager.__init__(inference_host_address, zoo_url, token)
Constructor.
Note
Typically, you never construct ZooManager objects yourself; instead, you call the degirum.connect function, which creates ZooManager instances for you.
For the description of the arguments, see degirum.connect.
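Since the constructor arguments mirror those of degirum.connect, the three connection flavors can be sketched as below. The host name, zoo path, and token are placeholders (the public zoo URL matches the default mentioned above), and the degirum import is guarded so the sketch degrades gracefully when PySDK is not installed:

```python
# Sketch of the three ways to obtain a ZooManager via degirum.connect().
# Host name, local zoo path, and token are placeholders, not real values.
CLOUD = "@cloud"  # cloud inference designator (ZooManager._CLOUD)
LOCAL = "@local"  # local inference designator (ZooManager._LOCAL)

# (inference_host_address, zoo_url) for each deployment type:
connections = {
    "cloud zoo, cloud inference": (CLOUD, "https://cs.degirum.com/degirum/public"),
    "cloud zoo, AI server inference": ("my-ai-server-hostname", "https://cs.degirum.com/degirum/public"),
    "local zoo, local inference": (LOCAL, "/path/to/local/model/zoo"),
}

try:
    import degirum as dg  # real call when PySDK is installed and reachable
    zoo = dg.connect(*connections["cloud zoo, cloud inference"], token="<your token>")
except Exception:  # PySDK missing, or no network/credentials in this sketch
    for kind, (host, zoo_url) in connections.items():
        print(f"{kind}: inference_host_address={host!r}, zoo_url={zoo_url!r}")
```
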
degirum.zoo_manager.ZooManager.list_models(*args, **kwargs)
Get a list of names of AI models available in the connected model zoo that match the specified filtering criteria.
Other Parameters:
model_family (str): Model family name filter. When you pass a string, it is used as a search substring in the model name, for example "yolo" or "mobilenet". You may also pass a re.Pattern object; in that case a regular expression pattern search is performed.
runtime (str): Runtime agent type: a string or list of strings of runtime agent types.
device (str): Target inference device: a string or list of strings of device names.
device_type (str): Target inference device(s): a string or list of strings of full device type names in "RUNTIME/DEVICE" format.
precision (str): Model calculation precision: a string or list of strings of model precision labels. Possible labels: "quant", "float".
pruned (str): Model density: a string or list of strings of model density labels. Possible labels: "dense", "pruned".
Returns:
List[str]: The list of model name strings matching the specified filtering criteria. Use a string from this list as a parameter of the load_model method.
Example
Find all models of the "yolo" family capable of running either on the CPU or on the DeGirum Orca AI accelerator, across all registered model zoos:
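The code sample for this example did not survive extraction. The following runnable sketch mimics the described filtering semantics (substring or re.Pattern match on the model name, string-or-list device filter) against made-up model names; the real list_models queries the connected zoo instead:

```python
import re

# Made-up model names following the PySDK naming convention (illustrative only).
mock_zoo = [
    "yolo_v5s_coco--512x512_quant_n2x_orca1_1",
    "yolo_v5s_coco--512x512_float_openvino_cpu_1",
    "mobilenet_v2_imagenet--224x224_quant_n2x_orca1_1",
]

def list_models(zoo, model_family=None, device=None):
    """Sketch of list_models() filtering; not the real implementation."""
    def family_ok(name):
        if model_family is None:
            return True
        if isinstance(model_family, re.Pattern):
            return bool(model_family.search(name))  # regex pattern search
        return model_family in name                 # substring search

    def device_ok(name):
        if device is None:
            return True
        devices = [device] if isinstance(device, str) else device
        return any(d.lower() in name for d in devices)

    return [n for n in zoo if family_ok(n) and device_ok(n)]

# All "yolo" models able to run either on CPU or on DeGirum Orca:
print(list_models(mock_zoo, model_family="yolo", device=["CPU", "ORCA1"]))
```
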
degirum.zoo_manager.ZooManager.load_model(model_name, **kwargs)
Create and return the model handling object for the given model name.
Parameters:
model_name (str): required. Model name string identifying the model to load. It should exactly match the model name as it is returned by the list_models method.
**kwargs (any): Arbitrary model properties to be assigned to the model object, in the form property=value. Default: {}.
Returns:
Model handling object. Use this object to perform AI inferences on this model and to configure various model properties, which define how input image pre-processing and inference result post-processing are done.
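The property-assignment behavior of **kwargs can be sketched with a mock model object; the property names below (confidence threshold, label drawing) are illustrative stand-ins, not necessarily the real PySDK names:

```python
# Sketch of load_model(model_name, **kwargs): keyword arguments are
# assigned to the returned model object as property values.
class MockModel:
    """Stand-in for the PySDK model handling object (illustrative only)."""
    def __init__(self, name):
        self.name = name
        self.confidence_threshold = 0.1  # hypothetical default
        self.show_labels = True          # hypothetical default

def load_model(model_name, **kwargs):
    model = MockModel(model_name)
    for prop, value in kwargs.items():
        if not hasattr(model, prop):
            raise AttributeError(f"unknown model property: {prop}")
        setattr(model, prop, value)
    return model

m = load_model("yolo_v5s_coco", confidence_threshold=0.5, show_labels=False)
print(m.confidence_threshold, m.show_labels)  # 0.5 False
```
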
degirum.zoo_manager.ZooManager.model_info(model_name)
Request model parameters for the given model name.
Parameters:
model_name (str): required. Model name string. It should exactly match the model name as it is returned by the list_models method.
Returns:
ModelParams
Model parameter object which provides read-only access to all model parameters.
Note
You cannot modify actual model parameters – any changes of model parameter object returned by this method are not applied to the real model. Use properties of model handling objects returned by degirum.zoo_manager.ZooManager.load_model method to change parameters of that particular model instance on the fly.
degirum.zoo_manager.ZooManager.supported_device_types()
Get runtime/device type names available in the inference system.
Returns:
list
list of runtime/device type names; each element is a string in the format "RUNTIME/DEVICE"
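Because each returned element follows the "RUNTIME/DEVICE" format, the list can be split on "/" to group devices by runtime agent. The type names below are examples, not a guaranteed inventory of any system:

```python
# Example return value (illustrative; the real list depends on the system):
supported = ["N2X/ORCA1", "N2X/CPU", "OPENVINO/CPU", "TFLITE/CPU"]

# Group devices by runtime agent by splitting the "RUNTIME/DEVICE" format:
by_runtime = {}
for type_name in supported:
    runtime, device = type_name.split("/", 1)
    by_runtime.setdefault(runtime, []).append(device)

print(by_runtime)  # {'N2X': ['ORCA1', 'CPU'], 'OPENVINO': ['CPU'], 'TFLITE': ['CPU']}
```
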
degirum.zoo_manager.ZooManager.system_info(update=False)
Return the host system information dictionary.
Parameters:
update (bool): Force an update of the system information; otherwise it is taken from the cache. Default: False.
Returns:
dict
host system information dictionary. Format: {"Devices": {"<runtime>/<device>": {<device_info>}, ...}, ["Software Version": "<version>"]}
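The update flag and the return format described above can be sketched as follows; the device information shown is fabricated for illustration:

```python
# Sketch of system_info(update=False) caching behavior and return format.
_cache = None

def _query_system():
    # Stand-in for the real (potentially slow) system probe.
    return {
        "Devices": {"N2X/CPU": {"info": "fabricated device info"}},
        "Software Version": "0.0.0",
    }

def system_info(update=False):
    global _cache
    if update or _cache is None:
        _cache = _query_system()  # force a fresh probe
    return _cache                 # otherwise serve the cached copy

first = system_info()
second = system_info()            # served from cache: same object
print(first is second, sorted(first))  # True ['Devices', 'Software Version']
```
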
Notes on the model handling object returned by the load_model method:
Call its prediction method to perform AI inference of a single frame; an inference result object is returned. For more efficient pipelined batch predictions, call its batch prediction methods to perform AI inference of multiple frames.
You can configure image pre-processing properties: the input image resize method, the input image padding method, the letterbox padding color, and the image processing library to use.
You can configure model post-processing properties: the confidence threshold, the non-max suppression threshold, the top-K limit for classification models, and the pose detection threshold for pose detection models.
You can configure overlay image generation properties: the color, line width, class label drawing flag, class probability drawing flag, alpha-blend weight, and font scale used when drawing inference results on the overlay image.
The inference result object returned by the prediction method gives access to the original image, to the image with inference results drawn on top of it, and to the list of numeric inference results.
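Putting these notes together, a typical configure-predict-inspect flow can be sketched with mock objects; every name here is a hypothetical stand-in for the corresponding PySDK property or method, not the library's real API:

```python
# Mock inference flow (illustrative only; names are hypothetical stand-ins).
class MockResult:
    """Stand-in for the inference result object."""
    def __init__(self, image, detections):
        self.image = image                        # original input frame
        self.image_overlay = f"overlay({image})"  # frame with results drawn
        self.results = detections                 # numeric inference results

class MockModel:
    """Stand-in for the model handling object."""
    confidence_threshold = 0.1  # hypothetical post-processing property

    def predict(self, frame):
        # Pretend inference: keep only detections above the threshold.
        detections = [{"label": "cat", "score": 0.9},
                      {"label": "dog", "score": 0.05}]
        kept = [d for d in detections if d["score"] >= self.confidence_threshold]
        return MockResult(frame, kept)

model = MockModel()
model.confidence_threshold = 0.5   # configure post-processing on the fly
result = model.predict("frame0")   # single-frame inference
print(result.results)              # [{'label': 'cat', 'score': 0.9}]
```
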