Math concepts used in training a model

  1. Derivatives

  2. Partial derivatives and Gradients

  3. Optimization

  4. Loss and Cost functions (the loss measures how far a single example's prediction is from its target value, while the cost aggregates the losses over the whole training set)

  5. Gradient descent

  6. Back-propagation
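The loss/cost distinction in item 4 can be sketched in a few lines. This is a minimal illustration using squared error; the function names are ours, not from any particular library.

```python
def loss(prediction, target):
    """Loss: the difference of a single example's prediction to its target value."""
    return (prediction - target) ** 2

def cost(predictions, targets):
    """Cost: the average of the losses over the whole training set."""
    losses = [loss(p, t) for p, t in zip(predictions, targets)]
    return sum(losses) / len(losses)

print(loss(2.5, 3.0))                 # 0.25
print(cost([2.5, 1.0], [3.0, 1.0]))   # (0.25 + 0.0) / 2 = 0.125
```

Averaging (rather than summing) the losses is a common convention because it keeps the cost on the same scale regardless of dataset size.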

Derivatives:

Why are derivatives and calculus so important in machine learning? Because training a model means repeatedly adjusting its parameters to reduce the cost, and derivatives tell us in which direction, and by how much, each parameter should change.

$$ \text{slope} = \frac{\Delta y}{\Delta x} = \frac{\text{rise}}{\text{run}} $$

Δx and Δy represent small but finite changes in x and y. This value represents the steepness and direction of the line. A positive slope indicates that the line rises as it moves from left to right, while a negative slope indicates that it falls. A slope of zero means the line is horizontal.
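The three cases above (positive, negative, and zero slope) are easy to check numerically. A small sketch, with an illustrative helper function of our own:

```python
def slope(x1, y1, x2, y2):
    """Slope between two points: rise over run, i.e. Δy / Δx."""
    return (y2 - y1) / (x2 - x1)

print(slope(0, 0, 2, 4))   # 2.0  -> positive: the line rises left to right
print(slope(0, 3, 3, 0))   # -1.0 -> negative: the line falls
print(slope(0, 5, 4, 5))   # 0.0  -> horizontal line
```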


$$ \text{slope at a point} = \frac{dy}{dx} $$

When the rise over run becomes infinitesimally small, you are moving toward the concept of the derivative in calculus.

In calculus, the instantaneous rate of change at a point on a curve describes how quickly the function's value y is changing at precisely that point x. This is given by the derivative of the function at that point, which is the slope of the tangent line to the curve at that point.

The symbols 𝑑𝑥 and 𝑑𝑦 represent infinitesimally small changes in the 𝑥 and 𝑦 directions: so small that they approach zero without ever equaling it. When the rise (change in y) and the run (change in x) become infinitesimal, the resulting slope is the instantaneous rate of change of y with respect to x.
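This limiting process can be watched numerically: as the run h shrinks, rise over run approaches the derivative. The sketch below uses f(x) = x², whose derivative at x = 3 is 2·3 = 6; the central-difference helper is an illustrative approximation, not an exact derivative.

```python
def derivative(f, x, h=1e-6):
    """Approximate dy/dx with a central difference over a tiny run h."""
    return (f(x + h) - f(x - h)) / (2 * h)

f = lambda x: x ** 2   # d/dx of x^2 is 2x

# Forward rise-over-run with shrinking h: the slope approaches 6 at x = 3.
for h in (1.0, 0.1, 0.001):
    print(h, (f(3 + h) - f(3)) / h)   # 7.0, then ~6.1, then ~6.001

print(derivative(f, 3))   # ~6.0
```

Note that h is never set to zero; division by zero is exactly why calculus needs the limit concept rather than a literal "zero run".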