NeuralNetPy

Neural Network Library

class NeuralNetPy.ACTIVATION

Bases: pybind11_object

Members:

RELU : Rectified Linear Unit Activation Function

SIGMOID : Sigmoid Activation Function

SOFTMAX : Softmax Activation Function

RELU = <ACTIVATION.RELU: 0>
SIGMOID = <ACTIVATION.SIGMOID: 1>
SOFTMAX = <ACTIVATION.SOFTMAX: 2>
property name
property value
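The activations named by this enum are computed inside the C++ core; as a point of reference, the standard formulas these names denote can be sketched in plain Python (this is an illustration of the math, not the library's implementation):

```python
import math

def relu(x):
    # max(0, x), applied element-wise
    return [max(0.0, v) for v in x]

def sigmoid(x):
    # 1 / (1 + e^-x), squashes each value into (0, 1)
    return [1.0 / (1.0 + math.exp(-v)) for v in x]

def softmax(x):
    # e^x_i / sum_j e^x_j, shifted by max(x) for numerical stability
    m = max(x)
    exps = [math.exp(v - m) for v in x]
    s = sum(exps)
    return [e / s for e in exps]
```

Note that softmax normalizes a whole vector (its outputs sum to 1), while RELU and SIGMOID act on each value independently.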
class NeuralNetPy.LOSS

Bases: pybind11_object

Members:

QUADRATIC : Quadratic (mean squared error) Loss Function

MCE : Multi-class Cross-Entropy Loss Function

BCE : Binary Cross-Entropy Loss Function

BCE = <LOSS.BCE: 2>
MCE = <LOSS.MCE: 0>
QUADRATIC = <LOSS.QUADRATIC: 1>
property name
property value
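The enum only selects which loss the C++ core applies during training. Assuming the conventional readings of these names (MCE as multi-class cross-entropy, BCE as binary cross-entropy; the docstrings do not expand the acronyms), the formulas can be sketched as:

```python
import math

def quadratic(y_true, y_pred):
    # mean squared error: (1/n) * sum (y - o)^2
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def bce(y_true, y_pred, eps=1e-12):
    # binary cross-entropy; eps guards against log(0)
    return -sum(
        t * math.log(p + eps) + (1 - t) * math.log(1 - p + eps)
        for t, p in zip(y_true, y_pred)
    ) / len(y_true)

def mce(y_true, y_pred, eps=1e-12):
    # multi-class cross-entropy over a one-hot label vector
    return -sum(t * math.log(p + eps) for t, p in zip(y_true, y_pred))
```

BCE expects a single probability per sample (typically a SIGMOID output), while MCE expects a probability distribution over classes (typically a SOFTMAX output).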
class NeuralNetPy.TrainingData2dI

Bases: pybind11_object

batch(self: NeuralNetPy.TrainingData2dI, batchSize: int, stratified: bool = False, shuffle: bool = False, dropLast: bool = False, verbose: bool = False) -> None

Splits the stored inputs and labels into mini-batches of the given size.

getMiniBatches(self: NeuralNetPy.TrainingData2dI) -> List[Tuple[List[List[float]], List[float]]]

Returns the mini-batches produced by a prior call to batch().
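The batching itself happens in the C++ core; a plain-Python sketch of what the shuffle and dropLast options typically do is below (the stratified and verbose options are omitted, and the exact behavior of the real method may differ):

```python
import random

def make_mini_batches(inputs, labels, batch_size, shuffle=False, drop_last=False, seed=None):
    """Pair up inputs and labels, then split them into fixed-size batches."""
    pairs = list(zip(inputs, labels))
    if shuffle:
        # shuffle samples before batching, keeping inputs aligned with labels
        random.Random(seed).shuffle(pairs)
    batches = []
    for i in range(0, len(pairs), batch_size):
        chunk = pairs[i:i + batch_size]
        if drop_last and len(chunk) < batch_size:
            break  # discard an incomplete trailing batch
        xs, ys = zip(*chunk)
        batches.append((list(xs), list(ys)))
    return batches
```

With 5 samples and a batch size of 2, this yields three batches of sizes 2, 2, and 1; passing drop_last=True discards the final short batch.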
class NeuralNetPy.TrainingData3dI

Bases: pybind11_object

batch(self: NeuralNetPy.TrainingData3dI, batchSize: int, stratified: bool = False, shuffle: bool = False, dropLast: bool = False, verbose: bool = False) -> None

Splits the stored inputs and labels into mini-batches of the given size.

getMiniBatches(self: NeuralNetPy.TrainingData3dI) -> List[Tuple[List[List[List[float]]], List[float]]]

Returns the mini-batches produced by a prior call to batch().
class NeuralNetPy.WEIGHT_INIT

Bases: pybind11_object

Members:

RANDOM : Initialize weights with random values

GLOROT

Initialize weights with Glorot initialization.

Tip

Best when combined with SIGMOID or SOFTMAX

HE

Initialize weights with He initialization.

Tip

Best when combined with RELU

LECUN

Initialize weights with LeCun initialization.

Tip

Best when combined with SOFTMAX

GLOROT = <WEIGHT_INIT.GLOROT: 1>
HE = <WEIGHT_INIT.HE: 2>
LECUN = <WEIGHT_INIT.LECUN: 3>
RANDOM = <WEIGHT_INIT.RANDOM: 0>
property name
property value
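The initializers run inside the C++ core; the scales below are the standard uniform-distribution formulas these scheme names usually denote, sketched in plain Python as an illustration (the library's exact distributions and bounds are an assumption):

```python
import math
import random

def init_weights(scheme, fan_in, fan_out, rng=None):
    """Build a fan_in x fan_out weight matrix using the named scheme."""
    rng = rng or random.Random(0)
    if scheme == "RANDOM":
        # uniform in [-1, 1), no fan-based scaling
        return [[rng.uniform(-1, 1) for _ in range(fan_out)] for _ in range(fan_in)]
    if scheme == "GLOROT":
        # Glorot/Xavier uniform: limit = sqrt(6 / (fan_in + fan_out))
        limit = math.sqrt(6.0 / (fan_in + fan_out))
    elif scheme == "HE":
        # He uniform: limit = sqrt(6 / fan_in)
        limit = math.sqrt(6.0 / fan_in)
    elif scheme == "LECUN":
        # LeCun uniform: limit = sqrt(3 / fan_in)
        limit = math.sqrt(3.0 / fan_in)
    else:
        raise ValueError(f"unknown scheme: {scheme}")
    return [[rng.uniform(-limit, limit) for _ in range(fan_out)] for _ in range(fan_in)]
```

All three scaled schemes shrink the initial weights as the layer gets wider, which keeps activation variance roughly constant across layers; they differ only in the fan terms used in the denominator.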