What is a derivative?
The derivative of a function is the ratio of the change in the output to a small change in the input, taken in the limit as that small change approaches 0.
Derivative = (change in output) / (small change in input), as the small change approaches 0
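Written in standard mathematical notation (the notation is assumed here, matching the definition above), this is:

$$f'(x) = \lim_{h \to 0} \frac{f(x + h) - f(x)}{h}$$

where h is the small change in the input.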
For example, consider the input 0.5.
Increase 0.5 by a small amount, 0.001. The output for 0.5 is func(0.5), and the output for 0.5 + 0.001 is func(0.5 + 0.001).
The small change in the input is 0.001, and the corresponding change in the output is func(0.5 + 0.001) - func(0.5).
The derivative is (func(0.5 + 0.001) - func(0.5)) / 0.001, with 0.001 then brought close to 0 in the limit.
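As a minimal sketch, this approximation can be written in Python (func and the step size h are placeholders; a small fixed h stands in for the limit):

```python
def numerical_diff(func, x, h=0.001):
    """Approximate the derivative of func at x.

    h plays the role of the small input change; shrinking h
    brings the ratio closer to the true derivative.
    """
    return (func(x + h) - func(x)) / h

# Example: derivative of y = x**2 at x = 0.5 (exact value: 1.0)
print(numerical_diff(lambda x: x**2, 0.5))  # about 1.001
```

In practice, a centered difference, (func(x + h) - func(x - h)) / (2 * h), is often used instead because it is more accurate for the same h.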
The derivative at a particular input is also called the slope. If the output changes by a large amount for a small change in the input, the slope is large.
Where derivatives are used in deep learning
In deep learning, when a derivative appears, think of it as a slope.
The weight and bias parameters of the hidden layers are the inputs, and the value of the loss function is the output.
The loss function is an indicator of the error; training adjusts the weight and bias parameters so that its value decreases.
Using an algorithm called backpropagation, compute the set of weight slopes (gradients) and the set of bias slopes that tell you how to decrease the value of the loss function, multiply these slopes by the learning rate, and subtract them from the set of current weights and the set of current biases. When a technique called regularization (for example, weight decay) is used, it is also factored into this subtraction.
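A minimal sketch of this update step, assuming params and grads are dictionaries of NumPy arrays with matching keys, where grads came from backpropagation (the names and the optional weight-decay term are illustrative, not from the original text):

```python
import numpy as np

def sgd_update(params, grads, lr=0.01, weight_decay=0.0):
    """One gradient-descent step.

    params: dict of parameter arrays, e.g. {"W1": ..., "b1": ...}
    grads:  dict of slope (gradient) arrays with the same keys,
            as produced by backpropagation
    lr:     learning rate
    weight_decay: optional regularization folded into the update
    """
    for key in params:
        # Subtract the scaled slope: a positive slope means increasing
        # the parameter would increase the loss, so we move the other way.
        params[key] -= lr * (grads[key] + weight_decay * params[key])

# Usage with made-up values:
params = {"W1": np.array([0.2, -0.5]), "b1": np.array([0.1])}
grads  = {"W1": np.array([0.4,  0.1]), "b1": np.array([0.3])}
sgd_update(params, grads, lr=0.1)
print(params["W1"])  # [0.16, -0.51]
```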