Constructor
new AIServerModel(options [, additionalParams])
Do not call the constructor directly. Use the `loadModel` method of an AIServerZoo instance to create an AIServerModel.
Parameters:
| Name | Type | Attributes | Description |
| --- | --- | --- | --- |
| `options` | Object | | Options for initializing the model. |
| `additionalParams` | Object | optional | Additional parameters for the model. |
Example

- Create an instance with the required model details and server URL:

      let model = zoo.loadModel('some_model_name', {});

- Use the `predict` method for inference with individual data items or `predict_batch` for multiple items:

      let result = await model.predict(someImage);
      for await (let result of model.predict_batch(someDataGeneratorFn)) { ... }

- Access processed results directly or set up a callback function for custom result handling (see the sketch after this list).

- You can display results to a canvas to view drawn overlays:

      await model.displayResultToCanvas(result, canvas);
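The sketch below shows callback-based result handling. It assumes the callback is supplied through the `loadModel` options object under a `callback` key; that option name is an assumption of this sketch, not a documented key, so check the AIServerZoo documentation for the exact option.

    // Hypothetical: the 'callback' option name is an assumption of this sketch.
    let callbackModel = zoo.loadModel('some_model_name', {
      callback: (result) => {
        // Invoked when each result arrives over the WebSocket.
        console.log('Received result:', result);
      }
    });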
Methods
(async) cleanup()
Cleans up resources and closes the WebSocket connection.
It follows a destructor-like pattern that must be called explicitly by the user: it closes the WebSocket connection, stops all inferences, removes listeners, clears the async queues, and nullifies all references.
Call this whenever switching models or when the model instance is no longer needed.
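Example

A minimal model-switching sketch; the model name is a placeholder:

    // Release the current model's WebSocket connection and queues.
    await model.cleanup();

    // Then load the next model from the same zoo.
    model = zoo.loadModel('another_model_name', {});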
(async) displayResultToCanvas(combinedResult, outputCanvasName [, justResults])
Overlays the result onto the image frame and displays it on the canvas.
Parameters:
| Name | Type | Attributes | Default | Description |
| --- | --- | --- | --- | --- |
| `combinedResult` | Object | | | The result object combined with the original image frame. This is received directly from `predict` or `predict_batch`. |
| `outputCanvasName` | string \| HTMLCanvasElement | | | The canvas to draw the image onto. Either the canvas element or the ID of the canvas element. |
| `justResults` | boolean | optional | false | Whether to show only the result overlay without the image frame. |
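Example

A short usage sketch, assuming the page contains a `<canvas id="outputCanvas">` element and `result` was returned by `predict`:

    // Pass the canvas by its element ID; the frame is drawn with the overlay on top.
    await model.displayResultToCanvas(result, 'outputCanvas');

    // Or pass the element itself and draw only the overlay (justResults = true).
    const canvas = document.getElementById('outputCanvas');
    await model.displayResultToCanvas(result, canvas, true);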
labelDictionary() → {Object}
Returns the label dictionary for this AIServerModel instance.
Returns:
The label dictionary.
Type: Object
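Example

A brief usage sketch; the exact shape of the dictionary (for example, class ID to label name) depends on the loaded model:

    // Fetch and inspect the model's label dictionary.
    const labels = model.labelDictionary();
    console.log(labels);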
modelInfo() → {Object}
Returns a read-only copy of the model parameters.
Returns:
The model parameters.
Type: Object
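Example

A brief usage sketch:

    // Read-only copy of the model parameters; mutating it does not affect the model.
    const info = model.modelInfo();
    console.log(info);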
(async) predict(imageFile [, info] [, bypassPreprocessing]) → {Promise.<Object>}
Predicts the result for a given image.
Parameters:
| Name | Type | Attributes | Default | Description |
| --- | --- | --- | --- | --- |
| `imageFile` | Blob \| File \| string \| HTMLImageElement \| HTMLVideoElement \| HTMLCanvasElement \| ArrayBuffer \| TypedArray \| ImageBitmap | | | The image to run inference on. |
| `info` | string | optional | performance.now() | Unique frame information provided by the user (such as a frame number). Used for matching results back to input images within the callback. |
| `bypassPreprocessing` | boolean | optional | false | Whether to bypass preprocessing. Used to send Blob data directly to the socket without any preprocessing. |
Returns:
The prediction result.
Type: Promise.<Object>
Examples
If a callback is provided, the WebSocket `onmessage` handler invokes the callback directly when the result arrives.

If a callback is not provided, the function waits for the internal result queue (`resultQ`) to receive a result, then returns it:

    let result = await model.predict(someImage);
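The optional parameters can be used as in the following sketch; `frameNumber` and `someBlob` are placeholder variables:

    // Tag the frame with caller-supplied info so the result can be matched back to it.
    let tagged = await model.predict(someImage, `frame-${frameNumber}`);

    // Send an already-encoded Blob directly to the socket, skipping preprocessing.
    let raw = await model.predict(someBlob, `frame-${frameNumber}`, true);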
(async, generator) predict_batch(data_source [, bypassPreprocessing]) → {Object}
Predicts results for a batch of data. Will yield results if a callback is not provided.
Parameters:
| Name | Type | Attributes | Default | Description |
| --- | --- | --- | --- | --- |
| `data_source` | AsyncIterable | | | An async iterable data source. |
| `bypassPreprocessing` | boolean | optional | false | Whether to bypass preprocessing. |
Yields:
The prediction result.
Type: Object
Example

The function processes results asynchronously. If a callback is not provided, it yields results:

    for await (let result of model.predict_batch(data_source)) { console.log(result); }
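A sketch of one possible `data_source`: an async generator yielding values that `predict` accepts (here, `ImageBitmap`s captured from a playing `<video>` element). The generator and variable names are placeholders:

    // Hypothetical async generator that captures a fixed number of frames from a video element.
    async function* frameSource(video, frameCount) {
      for (let i = 0; i < frameCount; i++) {
        yield await createImageBitmap(video);
      }
    }

    for await (let result of model.predict_batch(frameSource(videoElement, 100))) {
      console.log(result);
    }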
(async) processImageFile(combinedResult) → {Promise.<Blob>}
Processes the original image and draws the results on it, returning a PNG image with the overlaid results.
Parameters:
| Name | Type | Description |
| --- | --- | --- |
| `combinedResult` | Object | The result object combined with the original image frame. |
Returns:
The processed image file, as a Blob containing a PNG image.

Type: Promise.<Blob>
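Example

A brief usage sketch that renders the returned PNG Blob in an `<img>` element:

    // Draw the results onto the original frame and get the composite back as a PNG Blob.
    const pngBlob = await model.processImageFile(result);

    // Display the PNG by converting the Blob to an object URL.
    const img = document.createElement('img');
    img.src = URL.createObjectURL(pngBlob);
    document.body.appendChild(img);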