Command Line Interface

Learn how to use the PySDK command line interface to manage AI models, control your AI server, and streamline model downloads.

During PySDK installation, the degirum executable is added to the system path. It provides a command-line interface (CLI) for PySDK management tasks and extends PySDK functionality through entry points.

The PySDK CLI supports the following commands:

| Command | Description |
| --- | --- |
| download-zoo | Download AI models from the cloud model zoo |
| server | Control operation of the AI server |
| sys-info | Get system information dump |
| trace | Manage AI server tracing |
| version | Print PySDK version |

Invoke the console script with one of the commands above, followed by its parameters.

degirum <command> <arguments>
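
For example, to print the installed PySDK version:

degirum version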

Download Model Zoo

Command: download-zoo

Using this command you can download ML models from the cloud model zoo of your choice specified by URL. The command has the following parameters:

| Parameter | Description | Possible Values | Default |
| --- | --- | --- | --- |
| --path | Local filesystem path to store models downloaded from a model zoo repo | Valid local directory path | Current directory |
| --url | Cloud model zoo URL | "https://hub.degirum.com[/<zoo URL>]" | "https://hub.degirum.com" |
| --token | Cloud API access token | Valid token obtained at hub.degirum.com | Empty |
| --model_family | Model family name filter: model name substring or regular expression | Any | Empty |
| --device | Target inference device filter | ORCA, CPU, GPU, EDGETPU, MYRIAD, DLA, DLA_FALLBACK, NPU, RK3588, RK3566, RK3568, NXP_VX, NXP_ETHOSU, ARMNN, VITIS_NPU | Empty |
| --runtime | Runtime agent type filter | N2X, TFLITE, TENSORRT, OPENVINO, ONNX, RKNN | Empty |
| --precision | Model calculation precision filter | QUANT, FLOAT | None |
| --pruned | Model density filter | PRUNED, DENSE | None |

The URL parameter uses the form "https://hub.degirum.com/<zoo URL>", where <zoo URL> is <organization>/<zoo>. To find this suffix, go to the AI Hub, select the desired zoo, and click the copy button next to its name.

Example.

Download models for the ORCA device type from the DeGirum Public cloud model zoo into the ./my-zoo directory:

degirum download-zoo --path ./my-zoo --token <your cloud API access token> --device ORCA

Here <your cloud API access token> is your cloud API access token, which you can generate on the AI Hub.
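
Filters can be combined in a single command. For example, to download only quantized models for the OPENVINO runtime whose names contain "yolo" (the model family substring here is just an illustration):

degirum download-zoo --path ./my-zoo --token <your cloud API access token> --model_family yolo --runtime OPENVINO --precision QUANT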

Server Control

Command: server

Use this command to start the AI server, shut it down, or ask it to rescan its local model zoo.

You can control only an AI server running on the same host as this command. Remote control is disabled for security reasons.

This command has the following subcommands, which are passed just after the command:

| Sub-command | Description |
| --- | --- |
| start | Start the AI server |
| rescan-zoo | Request the AI server to rescan its model zoo |
| shutdown | Request the AI server to shut down |
| cache-dump | Dump AI server inference agent cache info |

The command has the following parameters:

| Parameter | Applicable To | Description | Possible Values | Default |
| --- | --- | --- | --- | --- |
| --zoo | start | Local model zoo directory to serve models from | Any valid path | Current directory |
| --quiet | start | Do not display any output | N/A | Disabled |
| --port | start | TCP port to bind the AI server to | 1...65535 | 8778 |
| --protocol | start | AI server protocol to use | asio, http, both | asio |

Starting with PySDK 0.10.0, the AI server supports two protocols: asio and http. asio is DeGirum's custom socket-based protocol used in earlier versions. The new http protocol relies on REST HTTP and WebSockets, so you can use the AI server from any language that supports these standards. Browser-based JavaScript, for example, requires the http protocol because it lacks native socket support.

The asio protocol is selected by default. Use --protocol http to enable the http protocol or --protocol both to enable both. When both are enabled, the AI server listens on two consecutive ports: the first for asio and the second for http.

Examples.

Start the AI server to serve models from the ./my-zoo directory, bind it to the default port, and use the asio protocol:

degirum server start --zoo ./my-zoo

Start the AI server to serve models from the ./my-zoo directory, using the asio protocol on port 12345 and the http protocol on port 12346:

degirum server start --zoo ./my-zoo --port 12345 --protocol both
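
To ask the AI server running on the same host to rescan its model zoo, or to shut it down, invoke the corresponding subcommands:

degirum server rescan-zoo
degirum server shutdown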

System Info

Command: sys-info

This command displays system information for the local host or a remote AI server.

The command has the following parameters:

| Parameter | Description | Possible Values | Default |
| --- | --- | --- | --- |
| --host | Remote AI server hostname or IP address; omit to query the local system | Valid hostname, IP address, or empty | Empty |

Example.

Query system info from the remote AI server at IP address 192.168.0.101:

degirum sys-info --host 192.168.0.101
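
To dump information about the local system, omit the --host parameter:

degirum sys-info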

Manage Tracing

Command: trace

Use this command to manage the AI server tracing feature.

The tracing feature is primarily for debugging and profiling. It is mainly intended for DeGirum customer support and isn't typically used directly by end users.

This command has the following subcommands, which are passed just after the command:

| Sub-command | Description |
| --- | --- |
| list | List all available trace groups |
| configure | Configure trace levels for trace groups |
| read | Read trace data to file |

The command has the following parameters:

| Parameter | Applicable To | Description | Possible Values | Default |
| --- | --- | --- | --- | --- |
| --host | All subcommands | Remote AI server hostname or IP address | Valid hostname or IP address | localhost |
| --file | read | Filename to save trace data into; omit to print to console | Valid local filename | Empty |
| --filesize | read | Maximum trace data size to read | Any integer number | 10000000 |
| --basic | configure | Set Basic trace level for a given list of trace groups | One or multiple trace group names as returned by the list sub-command | Empty |
| --detailed | configure | Set Detailed trace level for a given list of trace groups | One or multiple trace group names as returned by the list sub-command | Empty |
| --full | configure | Set Full trace level for a given list of trace groups | One or multiple trace group names as returned by the list sub-command | Empty |

Examples.

Query the AI server at address 192.168.0.101 for the list of available trace groups and print it to the console:

degirum trace list --host 192.168.0.101

Configure tracing for the AI server on localhost by setting trace levels for specific groups:

degirum trace configure --basic CoreTaskServer --detailed OrcaDMA OrcaRPC --full CoreRuntime

Read trace data from the AI server on localhost and save it to the ./my-trace-1.txt file:

degirum trace read --file ./my-trace-1.txt
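
Read trace data from the remote AI server at 192.168.0.101, limit the amount of data with --filesize, and print it to the console by omitting --file (the size limit here is an arbitrary example):

degirum trace read --host 192.168.0.101 --filesize 1000000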

The filter parameters of the download-zoo command work the same way as the corresponding arguments of degirum.zoo_manager.ZooManager.list_models and let you download only the models that satisfy the filter conditions.

Once models are downloaded into the directory specified by the --path parameter, you can use that directory as the model zoo served by an AI server (see the Server Control section).
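
For example, a minimal end-to-end flow that downloads ORCA models and then serves them from the same directory (reusing the placeholders from the examples above):

degirum download-zoo --path ./my-zoo --token <your cloud API access token> --device ORCA
degirum server start --zoo ./my-zoo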
