
Synthax - An Experiment

Synthax is an effort to explore less popular, lesser-known optimizer algorithms used for training neural networks. The experiment centers on comparing two optimizers: Gradient Descent and the Newton Second Moment Update.
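
To make the comparison concrete, here is a minimal sketch of one update step for each optimizer family. This is not the project's actual code: the `loss`, `gd_step`, and `newton_step` names are illustrative, and the Newton-style step is written as a classic inverse-Hessian update, which is one common reading of a second-order "second moment" method.

```python
import jax
import jax.numpy as jnp

def loss(w):
    # Toy quadratic-plus-quartic loss standing in for a network loss.
    return jnp.sum((w - 3.0) ** 2) + 0.1 * jnp.sum(w ** 4)

def gd_step(w, lr=0.1):
    # Gradient descent: step opposite the gradient, scaled by a learning rate.
    return w - lr * jax.grad(loss)(w)

def newton_step(w):
    # Newton-style update: precondition the gradient with the inverse Hessian,
    # so local curvature (second-order information) sets the step size.
    g = jax.grad(loss)(w)
    H = jax.hessian(loss)(w)
    return w - jnp.linalg.solve(H, g)

w = jnp.array([0.0, 1.0])
print(gd_step(w))      # small step along the negative gradient
print(newton_step(w))  # curvature-scaled step toward the minimum
```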

This project leverages JAX for stable automatic differentiation and SymPy for the symbolic notation of the various mathematical equations.
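
The sketch below shows one plausible way the two libraries divide the work; it is an assumed workflow, not the project's actual pipeline. SymPy holds the symbolic form of an equation and its exact derivative, while JAX differentiates a numeric version of the same function, and the two results can be cross-checked.

```python
import sympy as sp
import jax
import jax.numpy as jnp

# Symbolic side: define f(x) = x^2 * sin(x) and take its exact derivative.
x = sp.symbols("x")
f_sym = x**2 * sp.sin(x)
df_sym = sp.diff(f_sym, x)           # 2*x*sin(x) + x**2*cos(x)

# Numeric side: the same function written for JAX, differentiated by autodiff.
f_num = lambda v: v**2 * jnp.sin(v)
df_num = jax.grad(f_num)

# The symbolic and autodiff derivatives agree at a sample point.
print(df_sym.evalf(subs={x: 1.5}))   # exact symbolic value at x = 1.5
print(df_num(1.5))                   # JAX autodiff value at the same point
```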

For more information, visit GitHub.