In this case we are talking about https://coral.ai/products/accelerator/. Basically, it's a coprocessor for neural networks. You can train your models on a beefy computer with a proper GPU for speed, but the trained network can then be run on low-power edge devices. To run an already-trained network at anything resembling real time, though, you can't use the CPU (too slow even on big PCs) or a GPU (a graphics card won't fit on a Raspberry Pi or smaller edge devices). Hence the TPU: a USB-dongle-like device that takes the AI-processing part of a graphics card (on a smaller scale) and lets you run inference directly on that hardware with near-real-time performance.
I would start here: https://github.com/grinco/HASS-coral-rest-api
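To give a feel for how such a setup is typically used: a REST server in front of the Coral stick lets any client POST an image and get detections back as JSON. Below is a minimal Python sketch of that client side. The endpoint path (`/v1/vision/detection`) and the response shape (`{"predictions": [...]}` with `label`/`confidence` fields) are assumptions modelled on coral-pi-rest-server's DeepStack-style API, not guaranteed to match your server version.

```python
def parse_detections(response_json, min_confidence=0.5):
    """Extract (label, confidence) pairs above a threshold from a
    detection response assumed to look like {"predictions": [...]}."""
    predictions = response_json.get("predictions", [])
    return [
        (p["label"], p["confidence"])
        for p in predictions
        if p.get("confidence", 0.0) >= min_confidence
    ]

def detect(image_path, url="http://localhost:5000/v1/vision/detection"):
    """POST an image to a hypothetical Coral REST server and return
    filtered detections. Needs `requests` and a running server."""
    import requests  # third-party: pip install requests
    with open(image_path, "rb") as f:
        resp = requests.post(url, files={"image": f}, timeout=10)
    resp.raise_for_status()
    return parse_detections(resp.json())

if __name__ == "__main__":
    # Offline demo with a canned response, since no server/TPU is assumed:
    sample = {"predictions": [
        {"label": "person", "confidence": 0.92},
        {"label": "cat", "confidence": 0.31},
    ]}
    print(parse_detections(sample))  # the low-confidence cat is dropped
```

The parsing is kept separate from the HTTP call so you can reuse it from, say, a Home Assistant automation that already fetches the JSON.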