PySDK Package

degirum.LOCAL: str = ZooManager._LOCAL module-attribute

Local inference designator: use it as the first argument of the degirum.connect function to specify inference on local AI hardware

degirum.CLOUD: str = ZooManager._CLOUD module-attribute

Cloud inference designator: use it as the first argument of the degirum.connect function to specify cloud-based inference

degirum.connect(inference_host_address, zoo_url=None, token=None)

Connect to the AI inference host and model zoo of your choice.

This is the main PySDK entry point: you start your work with PySDK by calling this function.

The following use cases are supported:

  1. You want to perform cloud inferences and take models from some cloud model zoo.
  2. You want to perform inferences on some AI server and take models from some cloud model zoo.
  3. You want to perform inferences on some AI server and take models from its local model zoo.
  4. You want to perform inferences on local AI hardware and take models from some cloud model zoo.
  5. You want to perform inferences on local AI hardware and use a particular model from your local drive.

Parameters:

Name Type Description Default
inference_host_address str

Inference engine designator; it defines which inference engine to use.

  • For AI Server-based inference it can be either the hostname or IP address of the AI Server host, optionally followed by the port number in the form :port.

  • For DeGirum Cloud Platform-based inference it is the string "@cloud" or degirum.CLOUD constant.

  • For local inference it is the string "@local" or degirum.LOCAL constant.

required
zoo_url Optional[str]

Model zoo URL string which defines the model zoo to operate with.

  • For a cloud model zoo, it is specified in the following format: <cloud server prefix>[/<zoo suffix>]. The <cloud server prefix> part is the cloud platform root URL, typically https://cs.degirum.com. The optional <zoo suffix> part is the cloud zoo URL suffix in the form <organization>/<model zoo name>. You can confirm the zoo URL suffix by opening the model zoo management page in your cloud user account. If <zoo suffix> is not specified, the DeGirum public model zoo degirum/public is used.

  • For AI Server-based inferences, you may omit both the zoo_url and token parameters. In this case, the locally deployed model zoo of the AI Server is used.

  • For local AI hardware inferences, if you want to use a particular AI model from your local drive, set the zoo_url parameter to the path of that model's .json configuration file. The token parameter is not needed in this case.

None
token Optional[str]

Cloud API access token used to access the cloud zoo.

  • To obtain this token, open a user account on the DeGirum cloud platform, log in, and go to the token generation page to generate an API access token.
None

Returns:

Type Description
ZooManager

An instance of the Model Zoo manager object configured to work with the AI inference host and model zoo of your choice.

Once you have created a Model Zoo manager object, you may use its methods to work with models from the connected model zoo.

degirum.enable_default_logger(level=logging.DEBUG)

Helper function for adding a StreamHandler to the package logger. Removes any existing handlers. Useful for debugging.

Parameters:

Name Type Description Default
level int

Logging level as defined in the Python logging package. Defaults to logging.DEBUG.

logging.DEBUG

Returns:

Type Description
logging.StreamHandler

An instance of the added StreamHandler.