Release Date: 07/11/2023
New Features and Modifications
Initial support for the OpenVINO™ runtime plugin. This plugin allows performing inference of ONNX AI models on devices supported by the OpenVINO runtime, including:
- Intel® Movidius™ Myriad™
The plugin supports just-in-time compilation of AI models from ONNX format to OpenVINO format. Compiled models are then saved in a local cache for reuse.
Bug Fixes
- The TensorRT runtime and TensorRT-supported devices were not recognized by the degirum.zoo_manager.ZooManager.list_models method, so model filtering by TensorRT runtime or TensorRT-supported devices was not possible.