DeGirum Cloud Platform solves the Edge AI development problem by providing the necessary toolchains to design, run, and evaluate AI models across different hardware platforms in the cloud.
The DeGirum Cloud Platform includes the following components:
- DeGirum Cloud Device Farm, accessible through the Cloud Application Server and PySDK
- Cloud Application Server
- Cloud model zoos hosted on DeGirum cloud infrastructure
The DeGirum Cloud Device Farm is a set of computing nodes equipped with various AI hardware accelerators, including DeGirum ORCA™, Google Edge TPU, and Intel® Movidius™ Myriad™ VPU. The farm nodes are hosted by DeGirum.
The Cloud Application Server provides web APIs to perform AI inference on Cloud Device Farm devices. Starting with version 0.3.0, PySDK supports the Application Server web API and provides the same level of functionality transparently to the end user: you can run exactly the same code against the Cloud Platform as you would for traditional use cases such as local inference or AI server inference.
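The "same code, different target" idea can be sketched with PySDK's `degirum.connect()` call, where only the first argument selects the inference target. This is a minimal sketch, assuming `pip install degirum`; the token, zoo URL, model name, and server address below are placeholders, not values from this document.

```python
import importlib.util

# A minimal sketch, assuming `pip install degirum` and a cloud API token
# obtained from the DeGirum Cloud Portal. TOKEN, the model name, and the
# AI server address are placeholders.
TOKEN = "<your cloud API token>"
ZOO_URL = "https://cs.degirum.com/degirum/public"  # DeGirum public cloud zoo

# Guard so the sketch is a no-op until PySDK is installed and a real
# token is filled in.
if importlib.util.find_spec("degirum") and not TOKEN.startswith("<"):
    import degirum as dg

    # The same inference code runs against any target; only the first
    # argument of connect() changes:
    zoo = dg.connect(dg.CLOUD, ZOO_URL, TOKEN)          # cloud device farm
    # zoo = dg.connect("192.168.0.10", ZOO_URL, TOKEN)  # AI server on the LAN
    # zoo = dg.connect(dg.LOCAL, ZOO_URL, TOKEN)        # local AI hardware

    model = zoo.load_model("<model name>")  # any model from the connected zoo
    print(model("image.jpg"))               # inference result
```

Switching between cloud, AI-server, and local inference is thus a one-line change; the rest of the application code stays identical.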
A cloud model zoo is a collection of AI models stored on DeGirum cloud infrastructure. A registered DeGirum Cloud Platform user can create and maintain multiple cloud zoos with either private or public access. A model zoo with private access is visible only to users belonging to the same organization; a model zoo with public access is visible to all registered users of the DeGirum Cloud Platform. DeGirum maintains a public model zoo with an extensive set of AI models, available free of charge to all registered users.
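Browsing a cloud zoo from PySDK can be sketched as follows. This is a minimal sketch, assuming `pip install degirum` and a valid token; `myorg/myzoo` is a hypothetical organization/zoo path used only for illustration, while `degirum/public` refers to the DeGirum public zoo mentioned above.

```python
import importlib.util

# Sketch of browsing cloud model zoos with PySDK. TOKEN is a placeholder,
# and "myorg/myzoo" is a hypothetical private zoo path for illustration.
TOKEN = "<your cloud API token>"
PUBLIC_ZOO = "https://cs.degirum.com/degirum/public"  # DeGirum public zoo
PRIVATE_ZOO = "https://cs.degirum.com/myorg/myzoo"    # hypothetical org zoo

# Guard so the sketch is a no-op until PySDK is installed and a real
# token is filled in.
if importlib.util.find_spec("degirum") and not TOKEN.startswith("<"):
    import degirum as dg

    zoo = dg.connect(dg.CLOUD, PUBLIC_ZOO, TOKEN)  # or PRIVATE_ZOO
    # list_models() returns the names of all models in the connected zoo
    for name in zoo.list_models():
        print(name)
```

Private zoos use the same access pattern; visibility is enforced server-side based on the organization associated with the token.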
The DeGirum Cloud Portal provides a GUI for accessing various Cloud Platform assets, such as:
- cloud API access tokens, which are required to access the Cloud Application Server through PySDK;
- cloud model zoos belonging to the user's organization;
- PySDK documentation;
- Cloud Model Compiler and Model Parameters Wizard.