Optimizers

Optimizers are algorithms that adjust the attributes of a neural network, such as its weights and the learning rate, in order to reduce the loss. They solve the optimization problem of minimizing the loss function.
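To make this concrete, the following is a minimal, library-independent sketch of the basic parameter update that the optimizers below are built around; it uses plain NumPy and is not part of the NeuralNetPy API.

    import numpy as np

    # Illustrative only: the generic update an optimizer applies each step,
    # w <- w - learning_rate * gradient_of_loss_wrt_w
    def gradient_descent_step(weights, gradients, learning_rate=0.001):
        return weights - learning_rate * gradients

    weights = np.array([0.5, -1.2, 0.3])
    gradients = np.array([0.1, -0.4, 0.05])
    weights = gradient_descent_step(weights, gradients)  # weights move against the gradient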

class NeuralNetPy.optimizers.Adam

Bases: Optimizer

For more information on the Adam optimizer, see <https://arxiv.org/abs/1412.6980>. A usage sketch follows the parameter list below.

Parameters:
  • alpha (float) – The learning rate, defaults to 0.001

  • beta1 (float) – The exponential decay rate for the first moment estimates, defaults to 0.9

  • beta2 (float) – The exponential decay rate for the second moment estimates, defaults to 0.999

  • epsilon (float) – A small constant for numerical stability, defaults to 10E-8
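The snippet below is a hedged usage sketch: it assumes the documented parameters are accepted as keyword arguments to the constructor, and the NumPy function restates the Adam moment updates from the paper purely for illustration (it is not part of NeuralNetPy).

    import numpy as np
    import NeuralNetPy as NNP

    # Assumed keyword-argument construction, mirroring the documented parameters.
    adam = NNP.optimizers.Adam(alpha=0.001, beta1=0.9, beta2=0.999)

    # Illustrative restatement of a single Adam update (not a NeuralNetPy API):
    def adam_step(w, grad, m, v, t, alpha=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
        m = beta1 * m + (1 - beta1) * grad        # first moment estimate
        v = beta2 * v + (1 - beta2) * grad ** 2   # second moment estimate
        m_hat = m / (1 - beta1 ** t)              # bias-corrected first moment
        v_hat = v / (1 - beta2 ** t)              # bias-corrected second moment
        w = w - alpha * m_hat / (np.sqrt(v_hat) + eps)
        return w, m, v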

class NeuralNetPy.optimizers.SGD

Bases: Optimizer

For more information on Stochastic Gradient Descent, see <https://en.wikipedia.org/wiki/Stochastic_gradient_descent>. A usage sketch follows the parameter list below.

Parameters:
  • alpha (float) – The learning rate, defaults to 0.001
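A similarly hedged sketch, assuming alpha is accepted as a keyword argument; the update it applies each step is the plain gradient step shown at the top of this section.

    import NeuralNetPy as NNP

    # Assumed keyword-argument construction, mirroring the documented parameter.
    sgd = NNP.optimizers.SGD(alpha=0.01)

    # Each training step, SGD moves every weight directly against its gradient:
    # w <- w - alpha * gradient (compare gradient_descent_step above).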