Batch Processing and Callbacks
This guide covers batch inference with model.predict_batch(), handling results asynchronously with callbacks, and managing the inference queue with max_q_len.
Batch Processing with predict_batch()
predict_batch() accepts an iterable or async iterable of [input, id] pairs (an async generator, a plain array, or a ReadableStream) and yields results as they become available, so you can consume them with a for await...of loop.

Example 1: Camera Inference Using an Async Generator Function

This example streams webcam frames through the model: an async generator captures frames from the camera, tags each one with an id, and the results are drawn to a canvas as they arrive.
// Create video element and give it access to the webcam
const video = document.createElement('video');
video.autoplay = true;
video.style.display = 'none';
document.body.appendChild(video);
const stream = await navigator.mediaDevices.getUserMedia({ video: true });
video.srcObject = stream;
// Wait for video to be ready
await new Promise(resolve => video.onloadedmetadata = resolve);
// Frame generator yielding camera frames + frameId
async function* frameGenerator() {
  let frameId = 0;
  while (true) {
    // Wait for the next animation frame so the loop doesn't spin
    // synchronously and block the page
    await new Promise(resolve => requestAnimationFrame(resolve));
    if (!video.videoWidth || !video.videoHeight) continue;
    const bitmap = await createImageBitmap(video);
    yield [bitmap, `frame_${frameId++}`];
  }
}
// Run inference on the webcam frames
for await (const result of model.predict_batch(frameGenerator())) {
  model.displayResultToCanvas(result, 'outputCanvas');
}

Example 2: Using an Array as an Async Iterable
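When the inputs are already in memory, you can pass a plain array of [input, id] pairs instead of a generator. The sketch below is an illustration only: the image URLs and the loadBitmap() helper are placeholders, and the pair format follows Example 1.

// Build a fixed, in-memory batch of [input, id] pairs from image files.
// The URLs and the loadBitmap() helper are placeholders for illustration.
async function loadBitmap(url) {
  const response = await fetch(url);
  const blob = await response.blob();
  return createImageBitmap(blob);
}

const urls = ['image_0.png', 'image_1.png', 'image_2.png'];
const inputs = await Promise.all(
  urls.map(async (url, i) => [await loadBitmap(url), `image_${i}`])
);

// An array works as the input source just like an async generator
for await (const result of model.predict_batch(inputs)) {
  model.displayResultToCanvas(result, 'outputCanvas');
}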
Example 3: Using a ReadableStream
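ReadableStream implements the async iteration protocol in recent browsers and runtimes, so a stream of [input, id] pairs can be handed to predict_batch() as well. The sketch below reuses the video element and model from Example 1; MAX_FRAMES is an illustrative limit.

// Produce a ReadableStream of [frame, id] pairs from the video element
// created in Example 1, stopping after a fixed number of frames.
const MAX_FRAMES = 100;
let frameId = 0;
const frameStream = new ReadableStream({
  async pull(controller) {
    if (frameId >= MAX_FRAMES) {
      controller.close();
      return;
    }
    const bitmap = await createImageBitmap(video);
    controller.enqueue([bitmap, `frame_${frameId++}`]);
  }
});

// The stream is consumed just like the async generator in Example 1
for await (const result of model.predict_batch(frameStream)) {
  model.displayResultToCanvas(result, 'outputCanvas');
}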
Asynchronous Flow with Callbacks
Example: Using a Callback
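The exact callback hook exposed by the library is not shown here, so the sketch below stays on the documented async-iterator surface: a small helper drains predict_batch() and forwards each result to a caller-supplied callback. The helper name runWithCallback is purely illustrative; if the library offers a built-in callback option, prefer that.

// Illustrative helper (not part of the library API): it consumes
// predict_batch() internally and invokes a callback for every result,
// so the call site does not need its own for await...of loop.
async function runWithCallback(source, onResult) {
  for await (const result of model.predict_batch(source)) {
    onResult(result);
  }
}

// Reuse the frame generator from Example 1 and handle results in a callback
await runWithCallback(frameGenerator(), result => {
  model.displayResultToCanvas(result, 'outputCanvas');
});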
Controlling Back-pressure with max_q_len
max_q_len caps the number of inputs that can be waiting in the inference queue at once. When the cap is reached, predict_batch() applies back-pressure to the input source instead of buffering more items, which keeps memory bounded when the producer (for example, the webcam generator above) is faster than the model.
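How max_q_len is passed is an assumption in the sketch below: it is shown as an option on the predict_batch() call, so verify the exact placement against the API reference.

// Assumed placement: max_q_len passed as an option to predict_batch().
// With a limit of 4, the webcam generator is paused whenever four frames
// are already queued and waiting for inference.
for await (const result of model.predict_batch(frameGenerator(), { max_q_len: 4 })) {
  model.displayResultToCanvas(result, 'outputCanvas');
}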

