TFLiteBEAM: Neural Network Inference for BEAM Languages on Edge Devices

Source: youtube.com

Type: Video

In this talk from Code BEAM Europe 2023, Cocoa Xu presents the TFLiteBEAM project, which aims to provide a lightweight neural network inference library built on TensorFlow Lite (TFLite) for mobile and embedded devices. Key points include TFLiteBEAM's small footprint of roughly 10 MB, which makes it suitable for resource-constrained hardware such as the Raspberry Pi; its optimization for fast, low-latency inference on mobile and embedded devices; and its ability to run models on Edge TPU hardware accelerators such as Coral. Because TFLiteBEAM is packaged as an Erlang NIF library, it benefits all BEAM languages that support NIFs, with the examples given in Elixir. The presentation includes demonstrations of building an image classifier and running on an Edge TPU, and closes with future work such as automatic conversion of models to TFLite and support for multiple TPUs.
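The image-classifier demo mentioned above can be sketched in Elixir. This is a hedged illustration, not the talk's actual code: the module and function names below (`TFLiteBEAM.FlatBufferModel`, `TFLiteBEAM.Interpreter`, and their functions) are assumptions modeled on the TensorFlow Lite C++ API that the NIF wraps, and the model and input values are placeholders; consult the package documentation for the real API.

```elixir
# Hedged sketch of the TFLite inference flow from the talk:
# load a model, allocate tensors, feed input, invoke, read output.
# All module/function names are assumptions patterned on the TFLite
# C++ API (FlatBufferModel, Interpreter) -- the actual tflite_beam
# API may differ.

defmodule ClassifierDemo do
  # `model_path` points at a placeholder .tflite file; `input` is a
  # binary holding a preprocessed image in the model's input shape.
  def classify(model_path, input) do
    model = TFLiteBEAM.FlatBufferModel.build_from_file(model_path)
    interpreter = TFLiteBEAM.Interpreter.new(model)

    # TFLite requires tensors to be allocated before inference.
    :ok = TFLiteBEAM.Interpreter.allocate_tensors(interpreter)

    # Copy the image into input tensor 0, run the model, and read
    # the class scores back from output tensor 0.
    :ok = TFLiteBEAM.Interpreter.input_tensor(interpreter, 0, input)
    :ok = TFLiteBEAM.Interpreter.invoke(interpreter)
    TFLiteBEAM.Interpreter.output_tensor(interpreter, 0)
  end
end
```

Running a model on a Coral Edge TPU follows the same flow, with the extra step of loading the Edge TPU delegate before creating the interpreter, so the delegate can take over the supported operations.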

© HashMerge 2024