
Optimizer Visualization

This simulation visualizes gradient descent methods in a variety of scenarios. The main motivation is to build a better intuition for the optimizers used by deep learning frameworks. The following optimizers are implemented: SGD, SGD with Momentum, Nesterov Momentum, AdaGrad, RMSProp, and Adam.
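As a rough illustration of what such a visualization traces, here is a minimal sketch of the textbook update rules for two of the listed optimizers, SGD with momentum and Adam, run on a simple elongated quadratic bowl. The function names, hyperparameters, and test function are illustrative assumptions, not taken from the project itself.

```python
import numpy as np

def sgd_momentum(grad_fn, x0, lr=0.1, beta=0.9, steps=200):
    """Textbook SGD with momentum: v <- beta*v - lr*grad; x <- x + v."""
    x = np.asarray(x0, dtype=float)
    v = np.zeros_like(x)
    for _ in range(steps):
        v = beta * v - lr * grad_fn(x)
        x = x + v
    return x

def adam(grad_fn, x0, lr=0.1, beta1=0.9, beta2=0.999, eps=1e-8, steps=200):
    """Textbook Adam: bias-corrected first and second moment estimates."""
    x = np.asarray(x0, dtype=float)
    m = np.zeros_like(x)   # first moment (mean of gradients)
    v = np.zeros_like(x)   # second moment (mean of squared gradients)
    for t in range(1, steps + 1):
        g = grad_fn(x)
        m = beta1 * m + (1 - beta1) * g
        v = beta2 * v + (1 - beta2) * g * g
        m_hat = m / (1 - beta1 ** t)   # bias correction
        v_hat = v / (1 - beta2 ** t)
        x = x - lr * m_hat / (np.sqrt(v_hat) + eps)
    return x

# Illustrative test surface: f(x, y) = x^2 + 10*y^2, an elongated bowl
# where momentum helps damp oscillations along the steep axis.
grad = lambda p: np.array([2 * p[0], 20 * p[1]])
print(sgd_momentum(grad, [3.0, 2.0]))
print(adam(grad, [3.0, 2.0]))
```

Running both optimizers from the same start point and recording each iterate would yield exactly the kind of trajectory such a visualization plots.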

2022 Christian Waldmann