# PySDK Package

{% hint style="info" %}
This API Reference is based on PySDK 0.20.0.
{% endhint %}

## \_LOCAL <a href="#degirum.localzoomanager._local" id="degirum.localzoomanager._local"></a>

`degirum.LOCAL = ZooManager._LOCAL`

`module-attribute`

Local inference designator: use it as the first argument of the [degirum.connect](#degirum.connect) function to specify inference on local AI hardware

## \_CLOUD <a href="#degirum.cloudzoomanager._cloud" id="degirum.cloudzoomanager._cloud"></a>

`degirum.CLOUD = ZooManager._CLOUD`

`module-attribute`

Cloud inference designator: use it as the first argument of the [degirum.connect](#degirum.connect) function to specify cloud-based inference

## connect(inference\_host\_address, ...) <a href="#degirum.connect" id="degirum.connect"></a>

`degirum.connect(inference_host_address, zoo_url='', token='')`

Connect to the AI inference host and model zoo of your choice.

This is the main PySDK entry point: you start your work with PySDK by calling this function.

The following use cases are supported:

1. You want to perform **cloud inferences** and take models from a **cloud model zoo**.
2. You want to perform inferences on an **AI server** and take models from a **cloud model zoo**.
3. You want to perform inferences on an **AI server** and take models from its **local model zoo**.
4. You want to perform inferences on **local AI hardware** and take models from a **cloud model zoo**.
5. You want to perform inferences on **local AI hardware** and take models from a **local model zoo** directory on your local drive.
6. You want to perform inferences on **local AI hardware** and use a **particular model** from your local drive.

Parameters:

| Name                     | Type  | Description                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                      | Default    |
| ------------------------ | ----- | ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | ---------- |
| `inference_host_address` | `str` | <p>Inference engine designator; it defines which inference engine to use.</p><ul><li>For AI Server-based inference it can be either the hostname or IP address of the AI Server host, optionally followed by the port number in the form <code>:port</code>.</li><li>For DeGirum Cloud Platform-based inference it is the string <code>"@cloud"</code> or <a href="#degirum.CLOUD">degirum.CLOUD</a> constant.</li><li>For local inference it is the string <code>"@local"</code> or <a href="#degirum.LOCAL">degirum.LOCAL</a> constant.</li></ul>                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                              | *required* |
| `zoo_url`                | `str` | <p>Model zoo URL string which defines the model zoo to operate with.</p><ul><li>For a cloud model zoo, it is specified in the following format: <code>&#x3C;cloud server prefix>[/&#x3C;zoo suffix>]</code>. The <code>&#x3C;cloud server prefix></code> part is the cloud platform root URL, typically <code>https://hub.degirum.com</code>. The optional <code>&#x3C;zoo suffix></code> part is the cloud zoo URL suffix in the form <code>&#x3C;organization>/&#x3C;model zoo name></code>. You can confirm the zoo URL suffix by visiting your cloud user account and opening the model zoo management page. If <code>&#x3C;zoo suffix></code> is not specified, then the DeGirum public model zoo <code>degirum/public</code> is used.</li><li>For AI Server-based inferences, you may omit both the <code>zoo_url</code> and <code>token</code> parameters. In this case, the locally deployed model zoo of the AI Server will be used.</li><li>For local AI hardware inferences, you specify the <code>zoo_url</code> parameter as either a path to a local model zoo directory, or a path to the model's .json configuration file. The <code>token</code> parameter is not needed in this case.</li></ul> | `''`       |
| `token`                  | `str` | <p>Cloud API access token used to access the cloud zoo.</p><ul><li>To obtain this token, open a user account on the <a href="https://hub.degirum.com/?utm_source=docs.degirum.com&#x26;utm_medium=site&#x26;utm_campaign=pysdk-pysdk-user-guide-api-reference-guide-pysdk-package">DeGirum cloud platform</a>. Log in to your account and go to the token generation page to generate an API access token.</li></ul> | `''`       |

Returns:

| Type                                                                                | Description                                                                                                     |
| ----------------------------------------------------------------------------------- | --------------------------------------------------------------------------------------------------------------- |
| [`ZooManager`](https://docs.degirum.com/pysdk/user-guide-pysdk/api-ref/zoo-manager) | An instance of Model Zoo manager object configured to work with AI inference host and model zoo of your choice. |

Once you have created a Model Zoo manager object, you may use the following methods:

* [degirum.zoo\_manager.ZooManager.list\_models](https://docs.degirum.com/pysdk/user-guide-pysdk/zoo-manager#degirum.zoo_manager.zoomanager.list_models) to list and search models available in the model zoo.
* [degirum.zoo\_manager.ZooManager.load\_model](https://docs.degirum.com/pysdk/user-guide-pysdk/zoo-manager#degirum.zoo_manager.zoomanager.load_model) to create [degirum.model.Model](https://docs.degirum.com/pysdk/user-guide-pysdk/model#degirum.model.model) model handling object to be used for AI inferences.
* [degirum.zoo\_manager.ZooManager.model\_info](https://docs.degirum.com/pysdk/user-guide-pysdk/zoo-manager#degirum.zoo_manager.zoomanager.model_info) to request model parameters.
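The use cases above can be sketched as follows. This is a minimal illustration, not a runnable recipe: `"<your token>"`, `"aiserver.local:8778"`, and `"/path/to/zoo"` are placeholders you must replace with your own token, host, and path.

```python
import degirum as dg

# Use case 1: cloud inference with the DeGirum public cloud zoo.
cloud_zoo = dg.connect(
    dg.CLOUD, "https://hub.degirum.com/degirum/public", token="<your token>"
)

# Use case 3: inference on an AI server using its local model zoo;
# zoo_url and token are omitted, so the server's own zoo is used.
server_zoo = dg.connect("aiserver.local:8778")

# Use case 5: local inference with a model zoo directory on the local drive.
local_zoo = dg.connect(dg.LOCAL, "/path/to/zoo")
```

Each call returns a `ZooManager` instance bound to the chosen inference engine and model zoo.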

## load\_model(model\_name, ...) <a href="#degirum.load_model" id="degirum.load_model"></a>

`degirum.load_model(model_name, inference_host_address, zoo_url='', token='', **kwargs)`

Load a model from the model zoo for inference.

Parameters:

| Name                     | Type  | Description                                                                                            | Default    |
| ------------------------ | ----- | ------------------------------------------------------------------------------------------------------ | ---------- |
| `model_name`             | `str` | Model name to load from the model zoo.                                                                 | *required* |
| `inference_host_address` | `str` | Inference engine designator; it defines which inference engine to use.                                 | *required* |
| `zoo_url`                | `str` | Model zoo URL string which defines the model zoo to operate with.                                      | `''`       |
| `token`                  | `str` | Cloud API access token used to access the cloud zoo.                                                   | `''`       |
| `**kwargs`               | `any` | Arbitrary model properties to be assigned to the model object, in the form `property=value`.           | `{}`       |

{% hint style="info" %}
For a detailed description of the `zoo_url`, `inference_host_address`, and `token` parameters, refer to the [degirum.connect](#degirum.connect) function.
{% endhint %}

Returns (degirum.model.Model): An instance of [degirum.model.Model](https://docs.degirum.com/pysdk/user-guide-pysdk/model#degirum.model.model) model handling object to be used for AI inferences.

## list\_models(inference\_host\_address, ...) <a href="#degirum.list_models" id="degirum.list_models"></a>

`degirum.list_models(inference_host_address, zoo_url, token='', **kwargs)`

List models in the model zoo that match the specified filter values.

Parameters:

| Name                     | Type  | Description                                                            | Default    |
| ------------------------ | ----- | ---------------------------------------------------------------------- | ---------- |
| `inference_host_address` | `str` | Inference engine designator; it defines which inference engine to use. | *required* |
| `zoo_url`                | `str` | Model zoo URL string which defines the model zoo to operate with.      | *required* |
| `token`                  | `str` | Cloud API access token used to access the cloud zoo.                   | `''`       |
| `**kwargs`               | `any` | Filter parameters to narrow down the list of models.                   | `{}`       |

{% hint style="info" %}
For a detailed description of the `zoo_url`, `inference_host_address`, and `token` parameters, refer to the [degirum.connect](#degirum.connect) function. For a detailed description of the `kwargs` parameters, refer to the [degirum.ZooManager.list\_models](https://docs.degirum.com/pysdk/user-guide-pysdk/zoo-manager#degirum.zoo_manager.zoomanager.list_models) method.
{% endhint %}

Returns:

| Type                                       | Description                                                     |
| ------------------------------------------ | --------------------------------------------------------------- |
| `Union[List[str], Dict[str, ModelParams]]` | A list of model names, or a dictionary with model names as keys and model info as values. |
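For example, listing models filtered by device might look like the sketch below. The `device` filter keyword is one example filter (see `ZooManager.list_models` for the full set), and `"<your token>"` is a placeholder:

```python
import degirum as dg

# List models in the DeGirum public cloud zoo that target a given device.
models = dg.list_models(
    dg.CLOUD,
    "https://hub.degirum.com/degirum/public",
    token="<your token>",
    device="ORCA1",  # example filter to narrow down the model list
)

# Iterate over the matching model names.
for name in models:
    print(name)
```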

## get\_supported\_devices(inference\_host\_address, ...) <a href="#degirum.get_supported_devices" id="degirum.get_supported_devices"></a>

`degirum.get_supported_devices(inference_host_address, zoo_url='', token='')`

Get the runtime/device type names available in the inference engine.

Parameters:

| Name                     | Type  | Description                                                            | Default    |
| ------------------------ | ----- | ---------------------------------------------------------------------- | ---------- |
| `inference_host_address` | `str` | Inference engine designator; it defines which inference engine to use. | *required* |
| `zoo_url`                | `str` | Not used; kept for backward compatibility.                             | `''`       |
| `token`                  | `str` | Not used; kept for backward compatibility.                             | `''`       |

{% hint style="info" %}
For a detailed description of the `inference_host_address` parameter, refer to the [degirum.connect](#degirum.connect) function.
{% endhint %}

Returns:

| Type        | Description                                                                              |
| ----------- | ---------------------------------------------------------------------------------------- |
| `List[str]` | List of runtime/device type names; each element is a string in the format `"RUNTIME/DEVICE"`. |
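A minimal sketch of querying the local inference engine; the actual list depends on the runtimes and hardware installed on your system:

```python
import degirum as dg

# Query which runtime/device pairs the local inference engine provides.
devices = dg.get_supported_devices(dg.LOCAL)

# Each entry is a "RUNTIME/DEVICE" string, for example "N2X/ORCA1".
print(devices)
```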

## enable\_default\_logger(level=logging.DEBUG) <a href="#degirum.enable_default_logger" id="degirum.enable_default_logger"></a>

`degirum.enable_default_logger(level=logging.DEBUG)`

Helper function that adds a StreamHandler to the package logger, removing any existing handlers. Useful for debugging.

Parameters:

| Name    | Type  | Description                                                                    | Default |
| ------- | ----- | ------------------------------------------------------------------------------ | ------- |
| `level` | `int` | Logging level as defined in the Python `logging` package; defaults to `logging.DEBUG`. | `DEBUG` |

Returns:

| Type            | Description                                 |
| --------------- | ------------------------------------------- |
| `StreamHandler` | The added StreamHandler instance.           |
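A brief sketch of enabling PySDK logging during debugging; the formatter line only illustrates that the returned handler is a regular `logging.StreamHandler` you may configure further:

```python
import logging
import degirum as dg

# Route PySDK log messages to the console at INFO level.
handler = dg.enable_default_logger(logging.INFO)

# The returned StreamHandler can be customized like any logging handler.
handler.setFormatter(logging.Formatter("%(asctime)s %(name)s: %(message)s"))
```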
