Calculus in Action: Neural Networks
- Andromeda AI
- Sep 4, 2021
- 1 min read
An artificial neural network is a computational model that approximates a mapping between inputs and outputs.
It is inspired by the structure of the human brain: like the brain, it is composed of a network of interconnected neurons that propagate information when they receive stimuli from neighbouring neurons.
Training a neural network involves a process that employs the backpropagation and gradient descent algorithms in tandem. As we will see, both of these algorithms make extensive use of calculus.
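To make this concrete, here is a minimal sketch (not from the article) of that training loop for a tiny two-layer network on the XOR problem, written in NumPy. The forward pass computes activations, the backward pass applies the chain rule layer by layer (this is backpropagation), and gradient descent nudges each weight opposite its gradient. All names and hyperparameters here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: learn XOR of two binary inputs (illustrative example).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Parameters for a 2-4-1 network.
W1 = rng.normal(0, 1, (2, 4)); b1 = np.zeros((1, 4))
W2 = rng.normal(0, 1, (4, 1)); b2 = np.zeros((1, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(X):
    a1 = sigmoid(X @ W1 + b1)   # hidden layer activation
    a2 = sigmoid(a1 @ W2 + b2)  # output activation
    return a1, a2

def mse(a2):
    return float(np.mean((a2 - y) ** 2))

loss_initial = mse(forward(X)[1])

lr = 1.0
for step in range(5000):
    # Forward pass.
    a1, a2 = forward(X)

    # Backward pass: chain rule, layer by layer.
    d_a2 = 2 * (a2 - y) / len(X)            # dL/da2 for MSE loss
    d_z2 = d_a2 * a2 * (1 - a2)             # sigmoid'(z) = a * (1 - a)
    d_W2 = a1.T @ d_z2
    d_b2 = d_z2.sum(axis=0, keepdims=True)
    d_a1 = d_z2 @ W2.T
    d_z1 = d_a1 * a1 * (1 - a1)
    d_W1 = X.T @ d_z1
    d_b1 = d_z1.sum(axis=0, keepdims=True)

    # Gradient descent: step each parameter against its gradient.
    W1 -= lr * d_W1; b1 -= lr * d_b1
    W2 -= lr * d_W2; b2 -= lr * d_b2

loss_final = mse(forward(X)[1])
print(loss_initial, loss_final)
```

The derivative terms `a * (1 - a)` are where calculus enters directly: they are the derivatives of the sigmoid activation, composed via the chain rule as the error signal flows backwards through the layers.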
In this post, you will discover how aspects of calculus are applied in neural networks.
Read on to learn:
An artificial neural network is organized into layers of neurons and connections, where each connection is attributed a weight value.
Each neuron implements a nonlinear function that maps a set of inputs to an output activation.
In training a neural network, calculus is used extensively by the backpropagation and gradient descent algorithms.
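The second point above — a neuron as a nonlinear function of its inputs — can be sketched in a few lines. This is an illustrative assumption of one common formulation: a weighted sum of the inputs plus a bias, passed through a sigmoid activation; the weights `w`, bias `b`, and inputs `x` below are made-up values.

```python
import numpy as np

def neuron(x, w, b):
    """One artificial neuron: weighted sum of inputs, then a nonlinearity."""
    z = np.dot(w, x) + b             # linear combination of inputs
    return 1.0 / (1.0 + np.exp(-z))  # sigmoid squashes z into (0, 1)

x = np.array([0.5, -1.0, 2.0])   # inputs from neighbouring neurons
w = np.array([0.4, 0.3, -0.2])   # one weight per connection
b = 0.1                          # bias term
a = neuron(x, w, b)
print(a)
```

Here `z = 0.4*0.5 + 0.3*(-1.0) + (-0.2)*2.0 + 0.1 = -0.4`, so the output activation is `sigmoid(-0.4)`, roughly `0.401`. During training, it is the derivative of this nonlinearity that backpropagation uses.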