Intro to Functions
The Functions module contains the activation functions that can be passed to a Layer.
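As a rough sketch of what that might look like: the `Layer` constructor shown here is an assumption made for illustration (this section only says activation functions can be passed to a Layer), and `Sigmoid` is stubbed inline so the snippet is self-contained.

```javascript
// Illustrative stub of one of the listed activation functions.
class Sigmoid {
  calculate(x) {
    return 1 / (1 + Math.exp(-x));
  }
}

// Hypothetical Layer: the real constructor signature is not shown
// in this section, so this is only a sketch.
class Layer {
  constructor(nodes, activation) {
    this.nodes = nodes;
    this.activation = activation;
  }
}

const layer = new Layer(4, new Sigmoid());
console.log(layer.activation.calculate(0)); // 0.5
```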
Available Activation Functions
- Sigmoid
- ReLU
- LeakyReLU
- Softmax
- Tanh
- Linear
You can use the formula and gradient getters to retrieve a function's formula and its gradient formula.
Getting Function formula
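A minimal sketch of the formula getter, assuming a Sigmoid class as listed above; the class is stubbed inline, and the exact formula string the library returns is an assumption.

```javascript
// Illustrative stub: the formula getter returns the function's
// formula as a string.
class Sigmoid {
  get formula() {
    return "1 / (1 + e^-x)";
  }
}

console.log(new Sigmoid().formula); // "1 / (1 + e^-x)"
```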
Getting Function gradient formula
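Likewise for the gradient getter; this stubbed Sigmoid and the exact string returned are illustrative assumptions, not the library's verified output.

```javascript
// Illustrative stub: the gradient getter returns the derivative's
// formula as a string.
class Sigmoid {
  get gradient() {
    return "f(x) * (1 - f(x))";
  }
}

console.log(new Sigmoid().gradient); // "f(x) * (1 - f(x))"
```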
Calculate Result
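A sketch of computing a result with calculate, again with Sigmoid stubbed inline so the snippet runs on its own.

```javascript
class Sigmoid {
  // calculate returns the activation value for the provided x.
  calculate(x) {
    return 1 / (1 + Math.exp(-x));
  }
}

console.log(new Sigmoid().calculate(0)); // 0.5
```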
Calculate Gradient
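And a sketch of calcGradient, using the sigmoid derivative f(x) * (1 - f(x)); the stubbed class is illustrative.

```javascript
class Sigmoid {
  calculate(x) {
    return 1 / (1 + Math.exp(-x));
  }
  // calcGradient returns the derivative at x: f(x) * (1 - f(x)).
  calcGradient(x) {
    const fx = this.calculate(x);
    return fx * (1 - fx);
  }
}

console.log(new Sigmoid().calcGradient(0)); // 0.25
```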
Custom Function
You can create your own custom activation function by creating a new class that extends the ActivationFunction class.
Your activation function should have the following attributes and functions:
get formula
This should return your activation function's formula.
get gradient
This should return your activation function's gradient formula.
calcGradient
This should return the gradient for the provided x.
calculate
This should return the value of your activation function for the provided x.
toString
This should return the string representation of how the object should be initialized.
This is used while saving and loading the model.
Example
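A hedged sketch of a custom activation function implementing the attributes listed above. The ActivationFunction base class is stubbed here only so the snippet runs standalone; in practice you would extend the class the library exports. Swish (x * sigmoid(x)) is chosen purely as an illustration.

```javascript
// Stub of the library's base class, for illustration only.
class ActivationFunction {}

class Swish extends ActivationFunction {
  // The function's formula as a string.
  get formula() {
    return "x / (1 + e^-x)";
  }
  // The derivative's formula as a string.
  get gradient() {
    return "f(x) + sigmoid(x) * (1 - f(x))";
  }
  // Value of the activation at x.
  calculate(x) {
    return x / (1 + Math.exp(-x));
  }
  // Derivative at x, using swish'(x) = f(x) + sigmoid(x) * (1 - f(x)).
  calcGradient(x) {
    const sig = 1 / (1 + Math.exp(-x));
    const fx = this.calculate(x);
    return fx + sig * (1 - fx);
  }
  // Used while saving and loading the model.
  toString() {
    return "new Swish()";
  }
}

const swish = new Swish();
console.log(swish.calculate(0));    // 0
console.log(swish.calcGradient(0)); // 0.5
console.log(swish.toString());      // "new Swish()"
```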
References
Some of the functionality is implemented with the help of awesome resources from around the internet.