Understanding Stochastic Gradient Descent Using Elixir
Source: youtube.com
Eric Iacutone presented at ElixirConf 2023 on understanding and implementing stochastic gradient descent (SGD) in Elixir, working through an interactive Livebook example. SGD is an optimization algorithm commonly used to train neural networks; it is typically the algorithm that drives the learning process in machine learning. Eric referenced micrograd by Andrej Karpathy, a small automatic differentiation library written in Python, and explained how its concepts and implementation can be adapted to Elixir. The key to SGD is understanding derivatives and the process of backpropagation. Eric's intent is for participants to gain a strong fundamental understanding of SGD and how it minimizes a loss function to train a neural network. His talk paired theoretical insights with practical examples of the SGD process, tracing the progression from using SGD to fit a linear function to applying it to update neural network weights through backpropagation. In conclusion, the talk aimed to convey a comprehensive understanding of SGD, its importance in machine learning, and its practical implementation using Elixir's Livebook interactive notebook and the Nx numerical library.
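The core update behind SGD is compact: each parameter is nudged against the gradient of the loss, w ← w − lr · ∂L/∂w. Below is a minimal sketch of that step in Elixir using Nx's `defn` and `grad`, fitting a linear function as in the first part of the talk. This is not the talk's actual code: the module name, toy data, and learning rate are illustrative assumptions, and for brevity it uses the full batch on each step, where true SGD would sample a minibatch.

```elixir
# A minimal sketch (not the presenter's code): one SGD step on a
# linear model w * x + b, using Nx's automatic differentiation.
# Assumes Nx is available, e.g. Mix.install([{:nx, "~> 0.6"}]) in Livebook.
defmodule LinearSGD do
  import Nx.Defn

  # Mean squared error between predictions and targets.
  defn loss({w, b}, x, y) do
    pred = w * x + b
    Nx.mean(Nx.pow(pred - y, 2))
  end

  # One update: move each parameter against its gradient, scaled by
  # the learning rate lr. Full batch here for brevity; true SGD would
  # sample a random minibatch of {x, y} on each step.
  defn step({w, b} = params, x, y, lr) do
    {gw, gb} = grad(params, &loss(&1, x, y))
    {w - lr * gw, b - lr * gb}
  end
end
```

Running a few hundred steps on toy data recovers the underlying line (the data below assumes y = 2x + 1):

```elixir
x = Nx.tensor([1.0, 2.0, 3.0, 4.0])
y = Nx.tensor([3.0, 5.0, 7.0, 9.0])

{w, b} =
  Enum.reduce(1..300, {Nx.tensor(0.0), Nx.tensor(0.0)}, fn _i, params ->
    LinearSGD.step(params, x, y, 0.05)
  end)
# w converges toward 2.0 and b toward 1.0
```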