Lior Sinai 8/17/2024

MicroGrad.jl: Part 4 Extensions


This technical article, part 4 of a series on automatic differentiation in Julia, describes extending the MicroGrad.jl library. It explains how to implement pullbacks for core functions such as `map`, `getproperty`, and anonymous functions, which lack formal mathematical derivatives, in order to build a generic gradient descent optimizer. The tutorial demonstrates this by fitting a polynomial model, addressing cases that ChainRules.jl does not cover.
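To make the end goal concrete, the sketch below fits a polynomial by gradient descent with the gradient written out by hand, in the form a reverse-mode pullback would produce. This is a minimal illustration, not MicroGrad.jl's actual API; the function names (`poly`, `loss_grad`, `fit!`) and hyperparameters are assumptions for the example.

```julia
# Hedged sketch (not MicroGrad.jl's API): fit y = w1 + w2*x + w3*x^2
# by plain gradient descent, with the gradient of the loss written
# out by hand in the style a reverse-mode pullback would compute.

# Evaluate the polynomial with coefficients w at x
poly(w, x) = sum(w[i] * x^(i - 1) for i in eachindex(w))

# Mean squared error over the dataset
loss(w, xs, ys) = sum((poly(w, x) - y)^2 for (x, y) in zip(xs, ys)) / length(xs)

# Gradient of the loss with respect to w:
# dL/dw[i] = (2/N) * sum((y_hat - y) * x^(i-1))
function loss_grad(w, xs, ys)
    N = length(xs)
    g = zeros(length(w))
    for (x, y) in zip(xs, ys)
        err = poly(w, x) - y
        for i in eachindex(w)
            g[i] += 2 * err * x^(i - 1) / N
        end
    end
    return g
end

# Plain gradient descent on the coefficient vector
function fit!(w, xs, ys; lr=0.1, steps=5000)
    for _ in 1:steps
        w .-= lr .* loss_grad(w, xs, ys)
    end
    return w
end

xs = range(-1, 1; length=50)
ys = [1 + 2x - 3x^2 for x in xs]   # data from a known quadratic
w = fit!(zeros(3), xs, ys)          # w should recover [1, 2, -3]
```

A generic autodiff engine replaces the hand-written `loss_grad` with a pullback composed automatically from rules for the operations inside `loss`, which is where rules for `map`, `getproperty`, and anonymous functions become necessary.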

