Layers
Layers are the building blocks of a Neural Network. Each layer is a collection of neurons, and the neurons of adjacent layers are connected to one another to form the network. Each layer has a specific number of neurons and an activation function.
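As a rough mental model only (this is a conceptual sketch, not NeuralNetPy code; the sigmoid function, shapes, and variable names below are assumptions for illustration), a dense layer can be pictured as a weight matrix, a bias vector, and an activation function applied to the weighted sum of its inputs:

import numpy as np

# Conceptual sketch only (not NeuralNetPy's implementation): a layer viewed as
# a weight matrix, a bias vector, and an activation function.
def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
n_inputs, n_neurons = 4, 3
W = rng.standard_normal((n_neurons, n_inputs))  # one row of weights per neuron
b = np.zeros(n_neurons)                         # bias, initialized to 0 here
x = rng.standard_normal(n_inputs)               # an example input vector
y = sigmoid(W @ x + b)                          # one output value per neuron
print(y.shape)                                  # (3,)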
- class NeuralNetPy.layers.Dense
Bases: Layer
Initializes a Dense layer, which is the backbone of a Neural Network.
- Parameters:
nNeurons (int) – The number of neurons in the layer
activationFunc (ACTIVATION) – The activation function to be used, defaults to SIGMOID
weightInit (WEIGHT_INIT) – The weight initialization method to be used, defaults to RANDOM
bias (int) – The bias to be used, defaults to 0
import NeuralNetPy as NNP
layer = NNP.layers.Dense(3, NNP.ACTIVATION.RELU, NNP.WEIGHT_INIT.HE)
- typeStr(self: NeuralNetPy.layers.Dense) → str
Returns the type of the layer.
- class NeuralNetPy.layers.Dropout
Bases: Layer
Initializes a Dropout layer, which applies dropout to its input.
- Parameters:
rate (float32) – A float between 0 and 1, representing the fraction of the inputs to drop.
seed (int) – An integer used as the random seed. If not provided, a random seed will be generated.
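For reference, constructing a Dropout layer mirrors the Dense example above; the rate and seed values below are arbitrary, and passing them positionally in the documented order (rate, then seed) is an assumption:

import NeuralNetPy as NNP
layer = NNP.layers.Dropout(0.2, 42)  # drop 20% of the inputs, fixed seed for reproducibility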
- typeStr(self: NeuralNetPy.layers.Dropout) → str
Returns the type of the layer.
- class NeuralNetPy.layers.Flatten
Bases: Layer
Initializes a Flatten layer. Its sole purpose is to vectorize matrix inputs such as images.
- Parameters:
inputShape (tuple) – The shape of the input matrix as (rows, cols), i.e. the number of pixels per row and column in the case of images
import NeuralNetPy as NNP
layer = NNP.layers.Flatten((3, 3))
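For intuition about the shape change (a conceptual NumPy illustration, not the layer's internal implementation), a 3x3 matrix is turned into a 9-element vector:

import numpy as np

image = np.arange(9).reshape(3, 3)  # a 3x3 "image"
flat = image.reshape(-1)            # the 9-element vector a Flatten((3, 3)) layer conceptually produces
print(flat.shape)                   # (9,)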
- typeStr(self: NeuralNetPy.layers.Flatten) → str
Returns the type of the layer.