Release Date: 10/02/2023
New Features and Modifications
Initial support is added for inference on the Intel® Arc™ family of GPUs with the OpenVINO™ runtime.
Pretty-printing is implemented for model inference statistics. If `model` is an instance of the PySDK model class, then `print(model.time_stats())` will print inference statistics in tabulated form, similar to the text below:

    Statistic                   ,    Min,    Avg,    Max, Cnt
    PythonPreprocessDuration_ms ,   5.00,   5.00,   5.00,   1
    CoreInferenceDuration_ms    , 349.94, 349.94, 349.94,   1
    CoreLoadResultDuration_ms   ,   0.02,   0.02,   0.02,   1
    CorePostprocessDuration_ms  ,   0.09,   0.09,   0.09,   1
    CorePreprocessDuration_ms   ,   2.78,   2.78,   2.78,   1
    DeviceInferenceDuration_ms  ,   0.00,   0.00,   0.00,   1
    FrameTotalDuration_ms       , 610.34, 610.34, 610.34,   1

The `str(model.time_stats())` expression returns the same text.
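To illustrate the tabulated format, below is a minimal self-contained sketch of how such per-statistic timing data can be rendered. The `format_time_stats` helper and the `dict` input shape are hypothetical, for illustration only; they are not PySDK's actual implementation of `time_stats()`.

```python
# Hypothetical sketch: tabulate per-statistic timing data in the same
# aligned, comma-separated style as the example output above.

def format_time_stats(stats: dict) -> str:
    """Render {name: (min, avg, max, cnt)} as an aligned CSV-like table."""
    header = ("Statistic", "Min", "Avg", "Max", "Cnt")
    rows = [(name, f"{mn:.2f}", f"{avg:.2f}", f"{mx:.2f}", str(cnt))
            for name, (mn, avg, mx, cnt) in stats.items()]
    # Column widths: widest cell in each column, header included.
    widths = [max(len(cell) for cell in col) for col in zip(header, *rows)]

    def line(cells):
        # Left-align the statistic name, right-align the numeric columns.
        return ", ".join(
            cell.ljust(w) if i == 0 else cell.rjust(w)
            for i, (cell, w) in enumerate(zip(cells, widths)))

    return "\n".join([line(header)] + [line(r) for r in rows])

stats = {
    "PythonPreprocessDuration_ms": (5.00, 5.00, 5.00, 1),
    "CoreInferenceDuration_ms": (349.94, 349.94, 349.94, 1),
    "FrameTotalDuration_ms": (610.34, 610.34, 610.34, 1),
}
print(format_time_stats(stats))
```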
Python post-processor support was broken: an attempt to specify a Python post-processor for a model led to the following error: "Model postprocessor type is not known: Python". Starting from this version, Python post-processor support is restored in the following way. If you want to use a Python post-processor, then:
- you need to specify the name of the Python file with your Python post-processor implementation in the `PythonFile` parameter of the model configuration;
- you need to specify one of the supported PySDK post-processor types in the `OutputPostprocessType` parameter of the model configuration;
- the result format generated by your Python post-processor must be compatible with the PySDK post-processor type specified in the `OutputPostprocessType` parameter.
The currently supported post-processor types and their corresponding result formats are described in the PySDK User's Guide.
For security reasons, at the time of this release the DeLight cloud platform does not allow regular accounts to upload models with Python post-processors to cloud model zoos: only cloud platform administrators can do so.
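As an illustration of the steps above, a Python post-processor file might look like the sketch below. The class and method names here are hypothetical, not the actual interface PySDK requires in the `PythonFile` script (consult the PySDK User's Guide for that); the sketch only shows the idea of converting raw model output into a result list whose format matches a classification-style post-processor type.

```python
# Hypothetical sketch of a Python post-processor: converts raw
# classification scores into a list of result dicts intended to be
# compatible with a classification-style post-processor type.
# The class and method names are illustrative, not the PySDK interface.

class MyPostProcessor:
    def __init__(self, labels):
        self.labels = labels  # index -> class label

    def process(self, scores, top_k=3):
        """Return the top-k classes as a list of result dicts."""
        ranked = sorted(enumerate(scores), key=lambda p: p[1], reverse=True)
        return [
            {"category_id": idx, "label": self.labels[idx], "score": round(s, 4)}
            for idx, s in ranked[:top_k]
        ]

pp = MyPostProcessor(["cat", "dog", "bird"])
results = pp.process([0.1, 0.7, 0.2])
```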
The TFLite runtime plugin, previously missing from the PySDK package for Windows, is now included.