PySDK Package


_LOCAL

degirum.LOCAL = ZooManager._LOCAL

module-attribute

Local inference designator: use it as the first argument of the degirum.connect function to specify inference on local AI hardware.

_CLOUD

degirum.CLOUD = ZooManager._CLOUD

module-attribute

Cloud inference designator: use it as the first argument of the degirum.connect function to specify cloud-based inference.

connect(inference_host_address, ...)

degirum.connect(inference_host_address, zoo_url=None, token=None)

Connect to the AI inference host and model zoo of your choice.

This is the main PySDK entry point: you start your work with PySDK by calling this function.

The following use cases are supported:

  1. You want to perform cloud inferences and take models from some cloud model zoo.

  2. You want to perform inferences on some AI server and take models from some cloud model zoo.

  3. You want to perform inferences on some AI server and take models from its local model zoo.

  4. You want to perform inferences on local AI hardware and take models from some cloud model zoo.

  5. You want to perform inferences on local AI hardware and take models from the local model zoo directory on your local drive.

  6. You want to perform inferences on local AI hardware and use a particular model from your local drive.

Parameters:

  • inference_host_address (str, required): Inference engine designator; it defines which inference engine to use.

    • For DeGirum Cloud Platform-based inference it is the string "@cloud" or the degirum.CLOUD constant.

    • For AI Server-based inference it is either the hostname or IP address of the AI Server host, optionally followed by the port number in the form :port.

    • For local inference it is the string "@local" or the degirum.LOCAL constant.

  • zoo_url (Optional[str], default None): Model zoo URL string which defines the model zoo to operate with.

    • For a cloud model zoo, it is specified in the following format: <cloud server prefix>[/<zoo suffix>]. The <cloud server prefix> part is the cloud platform root URL, typically https://hub.degirum.com. The optional <zoo suffix> part is the cloud zoo URL suffix in the form <organization>/<model zoo name>. You can confirm the zoo URL suffix by visiting your cloud user account and opening the model zoo management page. If <zoo suffix> is not specified, the DeGirum public model zoo degirum/public is used.

    • For AI Server-based inferences, you may omit both the zoo_url and token parameters. In this case, the locally-deployed model zoo of the AI Server will be used.

    • For local AI hardware inferences, you specify the zoo_url parameter as either a path to a local model zoo directory or a path to a model's .json configuration file. The token parameter is not needed in this case.

  • token (Optional[str], default None): Cloud API access token used to access the cloud zoo. To obtain this token, open a user account on the DeGirum cloud platform, log in to your account, and go to the token generation page to generate an API access token.

Returns:

  • ZooManager: An instance of the Model Zoo manager object configured to work with the AI inference host and model zoo of your choice.

Once you have created the Model Zoo manager object, you may use the following methods:

  • degirum.zoo_manager.ZooManager.list_models to list and search models available in the model zoo.

  • degirum.zoo_manager.ZooManager.load_model to create a model handling object to be used for AI inferences.

  • degirum.zoo_manager.ZooManager.model_info to request model parameters.
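As a brief illustration, here is a minimal sketch of the three most common connection patterns; the token, server address, and zoo directory path below are placeholders, not working values:

```python
import degirum as dg

# Cloud inference with the DeGirum public cloud model zoo.
cloud_zoo = dg.connect(
    dg.CLOUD, "https://hub.degirum.com/degirum/public", token="<your token>"
)

# AI Server inference using the server's locally-deployed model zoo;
# "localhost:8778" is an example host:port, not a required value.
server_zoo = dg.connect("localhost:8778")

# Local hardware inference with a model zoo directory on the local drive.
local_zoo = dg.connect(dg.LOCAL, "/path/to/local/zoo/dir")
```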

load_model(model_name, ...)

degirum.load_model(model_name, inference_host_address, zoo_url=None, token=None, **kwargs)

Load a model from the model zoo for the inference.

Parameters:

  • model_name (str, required): Model name to load from the model zoo.

  • inference_host_address (str, required): Inference engine designator; it defines which inference engine to use.

  • zoo_url (Optional[str], default None): Model zoo URL string which defines the model zoo to operate with.

  • token (Optional[str], default None): Cloud API access token used to access the cloud zoo.

  • **kwargs (any, default {}): Arbitrary model properties to be assigned to the model object, in the form property=value.

Note

For a detailed description of the zoo_url, inference_host_address, and token parameters, refer to the degirum.connect function.

Returns:

  • degirum.model.Model: An instance of a model handling object to be used for AI inferences.
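For example, a short sketch of loading a model from the public cloud zoo and setting a model property at load time; the model name, token, and property value are illustrative placeholders:

```python
import degirum as dg

# Load a model from the DeGirum public cloud zoo; substitute your own
# model name and token.
model = dg.load_model(
    model_name="mobilenet_v2_ssd_coco--300x300_quant_n2x_orca1_1",
    inference_host_address=dg.CLOUD,
    zoo_url="degirum/public",
    token="<your token>",
    output_confidence_threshold=0.5,  # example model property passed via kwargs
)

# The returned degirum.model.Model object is callable: run inference on an image.
result = model("path/to/image.jpg")
print(result)
```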

list_models(inference_host_address, ...)

degirum.list_models(inference_host_address, zoo_url, token=None, **kwargs)

List models in the model zoo matching the specified filter values.

Parameters:

  • inference_host_address (str, required): Inference engine designator; it defines which inference engine to use.

  • zoo_url (str, required): Model zoo URL string which defines the model zoo to operate with.

  • token (Optional[str], default None): Cloud API access token used to access the cloud zoo.

  • **kwargs (any, default {}): Filter parameters to narrow down the list of models.

Note

For a detailed description of the zoo_url, inference_host_address, and token parameters, refer to the degirum.connect function. For a detailed description of the kwargs parameters, refer to the degirum.zoo_manager.ZooManager.list_models method.

Returns:

  • dict: A dictionary with model names as keys and model info as values.
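As an illustration, a minimal sketch of listing models narrowed by a filter; the device filter name follows ZooManager.list_models, and the token and filter value are placeholders:

```python
import degirum as dg

# List models in the public cloud zoo, narrowed to one device type;
# the "device" filter value below is illustrative.
models = dg.list_models(
    inference_host_address=dg.CLOUD,
    zoo_url="degirum/public",
    token="<your token>",
    device="ORCA1",
)
for name in models:
    print(name)
```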

get_supported_devices(inference_host_address, ...)

degirum.get_supported_devices(inference_host_address, zoo_url='', token='')

Get the runtime/device type names available in the inference engine.

Parameters:

  • inference_host_address (str, required): Inference engine designator; it defines which inference engine to use.

  • zoo_url (str, default ''): Optional model zoo URL string which defines the model zoo to operate with. Relevant only for cloud inference engines, to specify another base URL.

  • token (str, default ''): Cloud API access token used to access the cloud zoo.

Note

For a detailed description of the inference_host_address and token parameters, refer to the degirum.connect function.

Returns:

  • List[str]: A list of runtime/device type names; each element is a string in the format "RUNTIME/DEVICE".
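For instance, a quick sketch of querying the runtime/device pairs available locally; the output shown in the comment is illustrative, not guaranteed:

```python
import degirum as dg

# Query which runtime/device combinations the local inference engine supports.
supported = dg.get_supported_devices(dg.LOCAL)
print(supported)  # e.g. ['N2X/ORCA1', 'OPENVINO/CPU'] (illustrative output)
```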

enable_default_logger(level=logging.DEBUG)

degirum.enable_default_logger(level=logging.DEBUG)

Helper function for adding a StreamHandler to the package logger. Removes any existing handlers. Useful for debugging.

Parameters:

  • level (int, default logging.DEBUG): Logging level, as defined in the Python logging package.

Returns:

  • StreamHandler: The added logging.StreamHandler instance.
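For example, a one-line sketch of enabling console logging at INFO level:

```python
import logging
import degirum as dg

# Attach a console StreamHandler to the package logger at INFO level.
handler = dg.enable_default_logger(logging.INFO)
```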


This API Reference is based on PySDK 0.16.1.