# Advanced guides

- [Custom video source](/axelera/advanced-guides/custom-video-source.md): Plug custom video sources into PySDK using `predict_batch`, ideal for cameras, SDKs, GStreamer, and advanced use cases needing per-frame control or metadata.
- [AI Server](/axelera/advanced-guides/ai-server.md): Learn how to use the DeGirum AI Server to serve inference on your hardware using local or cloud models.
- [Inference with cloud models](/axelera/advanced-guides/ai-server/ai-server-inference-with-cloud-models.md): Run inference on a local AI server while fetching models from DeGirum’s public cloud zoo—ideal for hybrid setups where compute is local, but model access is remote.
- [Inference with local models](/axelera/advanced-guides/ai-server/ai-server-inference-with-local-models.md): Learn how to run inference using locally stored models on a DeGirum AI Server, whether the server runs on the same machine as the client or remotely over the network.
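To illustrate the custom video source pattern mentioned above, here is a minimal sketch of a frame generator that could be passed to PySDK's `predict_batch`. The generator body is a placeholder (real code would yield frames from a camera or decoder), and the `degirum` calls are shown only as commented assumptions; see the linked guide for the actual API usage.

```python
def frame_source(num_frames):
    """Yield frames from a custom source.

    Placeholder frames are used here; in practice each item would be
    an image/frame produced by your camera, SDK, or GStreamer pipeline.
    """
    for i in range(num_frames):
        yield f"frame-{i}"  # hypothetical stand-in for a real video frame

# Assumed usage with a loaded PySDK model (not run here):
#   for result in model.predict_batch(frame_source(100)):
#       ...  # process per-frame results as they stream back

frames = list(frame_source(3))
```

Because `predict_batch` consumes any iterable, the same pattern extends to sources that attach per-frame metadata or apply custom pre-processing before handing frames to the model.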
