TFLiteBEAM: Neural Network Inference for BEAM Languages on Edge Devices
Source: youtube.com
In this Code BEAM Europe 2023 talk, Cocoa Xu presents TFLiteBEAM, a project that brings lightweight neural network inference to the BEAM by wrapping TensorFlow Lite (TFLite) for mobile and embedded devices. Key points: the library's small footprint of around 10 MB makes it suitable for resource-constrained devices such as the Raspberry Pi; it is optimized for fast, low-latency inference on mobile and embedded hardware; and it can run models on Edge TPU accelerators such as Coral. Because it is packaged as an Erlang NIF library, TFLiteBEAM is usable from any BEAM language that supports NIFs, with the talk's examples given in Elixir. The presentation includes demonstrations of building an image classifier and running inference on an Edge TPU, and closes with planned future work such as automatic conversion of models to the TFLite format and support for multiple TPUs.
© HashMerge 2024