PySDK Package
degirum.LOCAL: str = ZooManager._LOCAL
module-attribute
Local inference designator: use it as the first argument of the degirum.connect function to specify inference on local AI hardware.
degirum.CLOUD: str = ZooManager._CLOUD
module-attribute
Cloud inference designator: use it as the first argument of the degirum.connect function to specify cloud-based inference.
degirum.connect(inference_host_address, zoo_url=None, token=None)
Connect to the AI inference host and model zoo of your choice.
This is the main PySDK entry point: you start your work with PySDK by calling this function.
The following use cases are supported:
- You want to perform cloud inferences and take models from some cloud model zoo.
- You want to perform inferences on some AI server and take models from some cloud model zoo.
- You want to perform inferences on some AI server and take models from its local model zoo.
- You want to perform inferences on local AI hardware and take models from some cloud model zoo.
- You want to perform inferences on local AI hardware and take models from the local model zoo directory on your local drive.
- You want to perform inferences on local AI hardware and use a particular model from your local drive.
Parameters:

Name | Type | Description | Default |
---|---|---|---|
inference_host_address | str | Inference engine designator; it defines which inference engine to use. | required |
zoo_url | Optional[str] | Model zoo URL string which defines the model zoo to operate with. | None |
token | Optional[str] | Cloud API access token used to access the cloud zoo. | None |

Returns:

Type | Description |
---|---|
ZooManager | An instance of the Model Zoo manager object configured to work with the AI inference host and model zoo of your choice. |
Once you have created the Model Zoo manager object, you may use the following methods:
- degirum.zoo_manager.ZooManager.list_models to list and search models available in the model zoo.
- degirum.zoo_manager.ZooManager.load_model to create degirum.model.Model model handling object to be used for AI inferences.
- degirum.zoo_manager.ZooManager.model_info to request model parameters.
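For illustration, a minimal sketch of three of the connection modes listed above. The zoo URL, host address, token, and directory path below are placeholders, not real values:

```python
import degirum as dg

# Cloud inference with a cloud model zoo
# ("myorg/myzoo" and the token are placeholder values)
cloud_zoo = dg.connect(
    dg.CLOUD, "https://cs.degirum.com/myorg/myzoo", token="<your token>"
)

# Inference on an AI server host, taking models from its local model zoo
# (placeholder host address; zoo_url is omitted)
server_zoo = dg.connect("192.168.0.10")

# Inference on local AI hardware with models from a local zoo directory
local_zoo = dg.connect(dg.LOCAL, "/path/to/zoo/dir")
```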
degirum.load_model(model_name, inference_host_address, zoo_url=None, token=None, **kwargs)
Load a model from the model zoo for inference.
Parameters:

Name | Type | Description | Default |
---|---|---|---|
model_name | str | Model name to load from the model zoo. | required |
inference_host_address | str | Inference engine designator; it defines which inference engine to use. | required |
zoo_url | Optional[str] | Model zoo URL string which defines the model zoo to operate with. | None |
token | Optional[str] | Cloud API access token used to access the cloud zoo. | None |
**kwargs | any | You may pass arbitrary model properties to be assigned to the model object in the form property=value. | {} |
Note

For a detailed description of the zoo_url, inference_host_address, and token parameters refer to the degirum.connect function.

Returns (degirum.model.Model): An instance of the degirum.model.Model model handling object to be used for AI inferences.
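As a sketch, loading a model in one call and running an inference. The model name, zoo URL, token, image path, and the model property passed via **kwargs are all placeholder values for illustration:

```python
import degirum as dg

# All names below are placeholders -- substitute your own zoo, token, and model
model = dg.load_model(
    model_name="some_model_name",  # hypothetical model name
    inference_host_address=dg.CLOUD,
    zoo_url="https://cs.degirum.com/myorg/myzoo",
    token="<your token>",
    image_backend="pil",  # example of a model property assigned via **kwargs
)

# A degirum.model.Model object is callable: pass it an image to run inference
result = model("path/to/image.jpg")
print(result)
```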
degirum.list_models(inference_host_address, zoo_url, token=None, **kwargs)
List models in the model zoo matching the specified filter values.
Parameters:

Name | Type | Description | Default |
---|---|---|---|
inference_host_address | str | Inference engine designator; it defines which inference engine to use. | required |
zoo_url | str | Model zoo URL string which defines the model zoo to operate with. | required |
token | Optional[str] | Cloud API access token used to access the cloud zoo. | None |
**kwargs | any | Filter parameters to narrow down the list of models. | {} |
Note

For a detailed description of the zoo_url, inference_host_address, and token parameters refer to the degirum.connect function. For a detailed description of the kwargs parameters refer to the degirum.ZooManager.list_models method.

Returns:

Type | Description |
---|---|
dict | A dictionary with model names as keys and model info as values. |
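A short usage sketch follows. The zoo URL and token are placeholders, and the filter keyword shown is only illustrative; consult degirum.ZooManager.list_models for the actual set of supported filters:

```python
import degirum as dg

# Placeholder zoo URL and token; the device filter below is an assumed
# example -- see ZooManager.list_models for the supported filter names
models = dg.list_models(
    inference_host_address=dg.CLOUD,
    zoo_url="https://cs.degirum.com/myorg/myzoo",
    token="<your token>",
    device="ORCA1",  # hypothetical filter narrowing results by device type
)
for name, info in models.items():
    print(name, info)
```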
degirum.get_supported_devices(inference_host_address, zoo_url='', token='')
Get the runtime/device type names available in the inference engine.
Parameters:

Name | Type | Description | Default |
---|---|---|---|
inference_host_address | str | Inference engine designator; it defines which inference engine to use. | required |
zoo_url | str | Optional model zoo URL string which defines the model zoo to operate with. Makes sense only for cloud inference engines, to specify another base URL. | '' |
token | str | Cloud API access token used to access the cloud zoo. | '' |
Note

For a detailed description of the inference_host_address and token parameters refer to the degirum.connect function.

Returns:

Type | Description |
---|---|
List[str] | List of runtime/device type names; each element is a string in the format "RUNTIME/DEVICE". |
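For example, querying the local inference engine (the values in the comment are illustrative; the actual list depends on the installed runtimes and hardware):

```python
import degirum as dg

# Ask the local inference engine which runtime/device pairs it supports
supported = dg.get_supported_devices(dg.LOCAL)
print(supported)  # e.g. ["N2X/ORCA1", "OPENVINO/CPU"] -- depends on your setup
```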
degirum.enable_default_logger(level=logging.DEBUG)
Helper function for adding a StreamHandler to the package logger. Removes any existing handlers. Useful for debugging.
Parameters:

Name | Type | Description | Default |
---|---|---|---|
level | int | Logging level as defined in the Python logging package; defaults to logging.DEBUG. | logging.DEBUG |

Returns:

Type | Description |
---|---|
logging.StreamHandler | An instance of the added StreamHandler. |
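The behavior described above can be sketched with the standard logging module. This is a minimal re-implementation for illustration, not the actual PySDK code, and the "degirum" logger name is an assumption:

```python
import logging

def enable_default_logger_sketch(level: int = logging.DEBUG) -> logging.StreamHandler:
    """Sketch of what degirum.enable_default_logger does: remove any existing
    handlers from the package logger, then attach a fresh StreamHandler."""
    logger = logging.getLogger("degirum")  # assumed package logger name
    for handler in list(logger.handlers):  # drop any previously added handlers
        logger.removeHandler(handler)
    handler = logging.StreamHandler()
    handler.setLevel(level)
    logger.addHandler(handler)
    logger.setLevel(level)
    return handler

h = enable_default_logger_sketch()
```

Calling the sketch twice leaves exactly one handler attached, mirroring the "removes any existing handlers" behavior documented above.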