Setting Up an AI Server
Read this page if you'll host an AI server or perform inference with a local server.
Use PySDK to configure and launch an AI server. The server runs on your host and processes inference requests from remote clients.
You can start the AI server in several ways:
Running the AI server from a terminal directly on the host OS.
Running the AI server as a Linux system service.
Running the AI server in a Docker container.
To run the PySDK AI server from a terminal, perform the following steps:
Create or select a user with administrative rights
Choose a user with administrative rights on the host.
Set up a Python virtual environment
For convenience and future maintenance, install PySDK in a Python virtual environment; make sure both Python and PySDK are installed inside that environment, as in the sketch below.
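A minimal sketch, assuming Python 3 is already installed on the host and that PySDK is published on PyPI as the degirum package; the environment name and location are illustrative:

```bash
# Create and activate a virtual environment, then install PySDK into it.
python3 -m venv ~/degirum_env
source ~/degirum_env/bin/activate
pip install degirum   # installs PySDK and the degirum command-line tool
```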
Create a directory for the local model zoo
Create a directory for hosting a local model zoo, for example:
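The path below is only an example; any directory writable by the chosen user works:

```bash
# Directory that will hold the locally hosted model zoo.
mkdir -p /home/<user>/zoo
```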
Download models to the local model zoo
Download the models from the DeGirum AI Hub Model Zoo to the directory created earlier, supplying the following parameters:
token: Your AI Hub access token, obtained from your AI Hub account.
model_zoo_url (optional): The URL for the model zoo in the format "https://hub.degirum.com/<organization>/<zoo>". If omitted, the public model zoo will be used.
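A hedged sketch of the download step, assuming the degirum download-zoo sub-command with --path, --token, and --url options; confirm the exact options with degirum download-zoo --help:

```bash
# Pull models from the AI Hub into the local model zoo directory.
# Sub-command and option names are assumptions; verify with --help.
degirum download-zoo \
    --path /home/<user>/zoo \
    --token "<your AI Hub token>" \
    --url "https://hub.degirum.com/<organization>/<zoo>"
```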
Start the AI server
Launch the server from the command line, as in the sketch below. The server runs until you press ENTER in the terminal. By default, it listens on TCP port 8778. To specify a different port, use the --port argument.
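A minimal sketch, assuming the server is started with the degirum server start sub-command and a --zoo option pointing at the local model zoo directory; check degirum server --help for the exact invocation:

```bash
# Start the AI server, serving models from the local zoo (default port 8778).
degirum server start --zoo /home/<user>/zoo

# Listen on a different TCP port instead of the default 8778.
degirum server start --zoo /home/<user>/zoo --port 12345
```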
To automatically start the server on boot, configure it as a Linux service:
Complete the terminal setup steps
Follow all of the terminal setup steps above except for launching the server.
Create a systemd service configuration file
Create a file named degirum.service in the /etc/systemd/system directory, using a template along the lines of the sketch below.
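A minimal sketch of such a unit file, written here via a shell heredoc; it assumes PySDK lives in the virtual environment created earlier and that the server is started with degirum server start. Adjust the user, paths, and start command to match your installation:

```bash
# Write a minimal degirum.service unit file (paths and command are assumptions).
sudo tee /etc/systemd/system/degirum.service > /dev/null <<'EOF'
[Unit]
Description=DeGirum AI Server
After=network.target

[Service]
User=<user>
ExecStart=/home/<user>/degirum_env/bin/degirum server start --zoo /home/<user>/zoo
Restart=always

[Install]
WantedBy=multi-user.target
EOF
```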
Start the service
Start the service using systemctl:
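For example (the service name matches the degirum.service file created above; daemon-reload makes systemd pick up the new unit file):

```bash
sudo systemctl daemon-reload
sudo systemctl start degirum
```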
Check the service status
Check the service status using systemctl:
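For example:

```bash
sudo systemctl status degirum
```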
Enable the service on startup
Use systemctl to automatically enable the degirum service on startup:
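For example:

```bash
sudo systemctl enable degirum
```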
To run the AI server as a Docker container, follow these steps:
Ensure Docker is installed
Refer to the Docker documentation for installation instructions.
Prepare the local model zoo
If hosting models locally, create and populate the model zoo directory as described in the terminal setup steps above.
Run the Docker container
When hosting models locally, mount the model zoo directory into the container; when serving models only from the AI Hub, no mount is needed. A sketch of both commands follows.
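A hedged sketch, assuming the AI server image is published on Docker Hub as degirum/aiserver and that the container serves a model zoo mounted at /zoo; confirm the actual image name, mount point, and options in the DeGirum Docker documentation:

```bash
# Image name, mount point, and container name below are assumptions.
# Hosting models locally: mount the local zoo into the container.
docker run -d -p 8778:8778 -v /home/<user>/zoo:/zoo --name degirum-ai-server degirum/aiserver

# Serving models only from the AI Hub: no volume mount needed.
docker run -d -p 8778:8778 --name degirum-ai-server degirum/aiserver
```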
If you started your AI server in a terminal or as a Linux service, you can tell the AI server to rescan the local model zoo directory by executing the following command on the same host: degirum server rescan-zoo
If you started your AI server in the Docker container, then you should rescan the model zoo directory by restarting the container: docker restart <your_server_name>
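For reference, the two commands mentioned above:

```bash
# Terminal or Linux service: ask the running server to rescan its local zoo.
degirum server rescan-zoo

# Docker: restart the container so it re-reads the zoo directory.
docker restart <your_server_name>
```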