
PySDK Command-Line Interface

As part of the PySDK installation, the degirum executable console script is installed into the system path. It implements a command-line interface (CLI) to various PySDK management facilities. This console script extends PySDK functionality through the mechanism of entry points.

The PySDK CLI supports the following commands:

| Command | Description |
|---|---|
| download-zoo | Command to download AI models from the cloud model zoo |
| server | Command to control the operation of AI server |
| sys-info | Command to obtain system information dump |
| trace | Command to manage AI server tracing facility |
| version | Command to print PySDK version |

You invoke the console script by passing a command from the table above as its first argument, followed by command-specific parameters described in the following sections.

degirum <command> <arguments>
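
For example, to print the installed PySDK version:

degirum version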

Download Model Zoo Command

Command: download-zoo

Using this command, you can download ML models from a cloud model zoo of your choice, specified by its URL. The command has the following parameters:

| Parameter | Description | Possible Values | Default |
|---|---|---|---|
| --path | Local filesystem path to store models downloaded from a model zoo repo | Valid local directory path | Current directory |
| --url | Cloud model zoo URL | "https://cs.degirum.com[/<zoo URL>]" | "https://cs.degirum.com" |
| --token | Cloud API access token | Valid token obtained at cs.degirum.com | Empty |
| --model_family | Model family name filter: model name substring or regular expression | Any | Empty |
| --device | Target inference device filter | ORCA, CPU, GPU, EDGETPU, MYRIAD, DLA, DLA_FALLBACK, NPU, RK3588, RK3566, RK3568, NXP_VX, NXP_ETHOSU, ARMNN, VITIS_NPU | Empty |
| --runtime | Runtime agent type filter | N2X, TFLITE, TENSORRT, OPENVINO, ONNX, RKNN | Empty |
| --precision | Model calculation precision filter | QUANT, FLOAT | None |
| --pruned | Model density filter | PRUNED, DENSE | None |

The --url parameter is specified in the "https://cs.degirum.com/<zoo URL>" format, where the <zoo URL> suffix is the name of the cloud model zoo in the form <organization>/<zoo>. You can obtain the model zoo URL suffix on the DeGirum Cloud Portal site https://cs.degirum.com under the Management | Models main menu item: select the model zoo you want to access to open the model zoo page, then click the copy button near the model zoo name to copy the model zoo URL suffix to the clipboard.
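
For example, to download all models from a specific cloud model zoo into the ./my-zoo directory (here <organization>/<zoo> is a placeholder for your own zoo URL suffix, and <your cloud API access token> is a placeholder for your token):

degirum download-zoo --path ./my-zoo --token <your cloud API access token> --url https://cs.degirum.com/<organization>/<zoo>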

Filter parameters work the same way as in the degirum.zoo_manager.ZooManager.list_models method: they let you download only the models that satisfy the filter conditions.
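
Filters can be combined. For instance, the following invocation should download only quantized TFLITE models for the CPU device whose names contain the substring mobilenet (the mobilenet family name is used here purely as an illustration):

degirum download-zoo --path ./my-zoo --token <your cloud API access token> --device CPU --runtime TFLITE --precision QUANT --model_family mobilenet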

Once the models are downloaded into the directory specified by the --path parameter, you may use this directory as the local model zoo to be served by the AI server (see the Server Control Command section).

Example.

Download models for the ORCA device type from the DeGirum Public cloud model zoo into the ./my-zoo directory:

degirum download-zoo --path ./my-zoo --token <your cloud API access token> --device ORCA

Here <your cloud API access token> is your cloud API access token, which you can generate on the DeGirum Cloud Portal site https://cs.degirum.com under the Management | My Tokens main menu item.

Server Control Command

Command: server

Using this command, you can start the AI server, shut it down, or request the AI server to rescan its local model zoo.

You can control only the AI server that runs on the same host where you execute this command; control of remote AI servers is disabled for security reasons.

This command has the following subcommands, which are passed just after the command:

| Sub-command | Description |
|---|---|
| start | Start AI server |
| rescan-zoo | Request AI server to rescan its model zoo |
| shutdown | Request AI server to shut down |
| cache-dump | Dump AI server inference agent cache info |

The command has the following parameters:

| Parameter | Applicable To | Description | Possible Values | Default |
|---|---|---|---|---|
| --zoo | start | Local model zoo directory to serve models from | Any valid path | Current directory |
| --quiet | start | Do not display any output | N/A | Disabled |
| --port | start | TCP port to bind AI server to | 1...65535 | 8778 |
| --protocol | start | AI server protocol to use | asio, http, both | asio |

Starting from PySDK version 0.10.0, the AI server supports two protocols: asio and http. The asio protocol is DeGirum's custom socket-based AI server protocol, supported by all previous PySDK versions. The http protocol is based on REST HTTP requests and WebSockets streaming; it allows the AI server to be used from any programming language that supports HTTP requests and WebSockets, such as browser-based JavaScript, which does not support native sockets and thus cannot use the asio protocol.

The asio protocol is selected by default. You can select the http protocol by specifying --protocol http, or both protocols by specifying --protocol both. In the latter case, the AI server listens on two consecutive TCP ports: the port specified by --port is used for the asio protocol, and the next port is used for the http protocol.

Examples.

Start the AI server to serve models from the ./my-zoo directory, bind it to the default port, and use the asio protocol:

degirum server start --zoo ./my-zoo

Start the AI server to serve models from the ./my-zoo directory, using the asio protocol on port 12345 and the http protocol on port 12346:

degirum server start --zoo ./my-zoo --port 12345 --protocol both
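
Start the AI server to serve models from the ./my-zoo directory using only the http protocol on the default port:

degirum server start --zoo ./my-zoo --protocol http

To request the AI server running on the same host to rescan its model zoo or to shut down, use the corresponding subcommands without any additional parameters (all parameters listed above apply to the start subcommand only):

degirum server rescan-zoo

degirum server shutdown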

System Info Command

Command: sys-info

Using this command, you can query system information either for the local system or for a remote AI server host. The system info dump is printed to the console.

The command has the following parameters:

| Parameter | Description | Possible Values | Default |
|---|---|---|---|
| --host | Remote AI server hostname or IP address; omit to query local system | Valid hostname, IP address, or empty | Empty |

Example.

Query system info from remote AI server at IP address 192.168.0.101:

degirum sys-info --host 192.168.0.101
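
To query system information for the local system, simply omit the --host parameter:

degirum sys-info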

Trace Management Command

Command: trace

Using this command, you can manage the AI server tracing facility.

The AI server tracing facility is used for AI server debugging and time profiling. It is designed mostly for DeGirum customer support and is not intended to be used directly by end users.

This command has the following subcommands, which are passed just after the command:

| Sub-command | Description |
|---|---|
| list | List all available trace groups |
| configure | Configure trace levels for trace groups |
| read | Read trace data to file |

The command has the following parameters:

| Parameter | Applicable To | Description | Possible Values | Default |
|---|---|---|---|---|
| --host | All subcommands | Remote AI server hostname or IP address | Valid hostname or IP address | localhost |
| --file | read | Filename to save trace data into; omit to print to console | Valid local filename | Empty |
| --filesize | read | Maximum trace data size to read | Any integer number | 10000000 |
| --basic | configure | Set Basic trace level for a given list of trace groups | One or multiple trace group names as returned by list sub-command | Empty |
| --detailed | configure | Set Detailed trace level for a given list of trace groups | One or multiple trace group names as returned by list sub-command | Empty |
| --full | configure | Set Full trace level for a given list of trace groups | One or multiple trace group names as returned by list sub-command | Empty |

Examples.

Query the AI server at address 192.168.0.101 for the list of available trace groups and print it to the console:

degirum trace list --host 192.168.0.101

Configure tracing for the AI server on localhost by setting various trace levels for various trace groups:

degirum trace configure --basic CoreTaskServer --detailed OrcaDMA OrcaRPC --full CoreRuntime

Read trace data from the AI server on localhost and save it to the file ./my-trace-1.txt:

degirum trace read --file ./my-trace-1.txt
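
For example, read at most 1000000 bytes of trace data from the AI server at 192.168.0.101 and save it to the file ./my-trace-2.txt (the size limit and filename here are arbitrary illustrations):

degirum trace read --host 192.168.0.101 --filesize 1000000 --file ./my-trace-2.txt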