According to Wikipedia, “Gradient descent is a first-order iterative optimization algorithm for finding local minima of a differentiable function.” Sounds like a lot, right? In this article, let’s get acquainted with the Gradient descent algorithm in the most straightforward (and ‘simplest’) way. Before we continue with understanding the ABCs of Gradient descent (and dig into the…
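To make the quoted definition concrete, here is a minimal gradient descent sketch in Python. The objective f(x) = x², the learning rate, and the starting point are illustrative assumptions, not values from the article:

```python
# Minimal gradient descent sketch: minimize f(x) = x**2.
# The objective, learning rate, and starting point are illustrative choices.

def f(x):
    return x ** 2

def df(x):
    # Derivative of f(x) = x**2 is 2x.
    return 2 * x

x = 5.0              # illustrative starting point
learning_rate = 0.1  # illustrative step size
for step in range(50):
    x -= learning_rate * df(x)  # step against the gradient

print(f"x after 50 steps: {x:.6f}")  # approaches the minimum at x = 0
```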

# Category: Neural Networks

## Derivative of Sigmoid Function

In this article, we’ll find the derivative of the Sigmoid Function. The Sigmoid Function is a non-linear function commonly used as an activation function in neural networks.
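As a preview of the result, the sigmoid σ(x) = 1 / (1 + e⁻ˣ) has the standard derivative σ′(x) = σ(x)(1 − σ(x)). A minimal sketch in Python, assuming NumPy; the spot-check against a finite-difference derivative is our own addition:

```python
import numpy as np

def sigmoid(x):
    # sigma(x) = 1 / (1 + e^(-x))
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_derivative(x):
    # Standard identity: sigma'(x) = sigma(x) * (1 - sigma(x))
    s = sigmoid(x)
    return s * (1.0 - s)

# Spot-check against a numerical (central finite-difference) derivative.
x = np.array([-2.0, 0.0, 2.0])
h = 1e-6
numeric = (sigmoid(x + h) - sigmoid(x - h)) / (2 * h)
print(sigmoid_derivative(x))  # e.g. 0.25 at x = 0
print(numeric)                # should match closely
```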

## A step-by-step forward pass and backpropagation example

In this article, we’ll see a step-by-step forward pass (forward propagation) and backward pass (backpropagation) example. We’ll take a neural network with a single hidden layer and work through one complete cycle of forward propagation and backpropagation.
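For orientation, here is a minimal sketch of one such cycle for a tiny single-hidden-layer network. The layer sizes, random initial weights, learning rate, and squared-error loss are illustrative assumptions (biases omitted for brevity), not the article’s worked numbers:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Illustrative shapes: 2 inputs -> 2 hidden units -> 1 output.
rng = np.random.default_rng(0)
W1 = rng.normal(size=(2, 2))   # input -> hidden weights
W2 = rng.normal(size=(2, 1))   # hidden -> output weights

x = np.array([[0.5, 0.1]])     # one training example (1 x 2)
y = np.array([[1.0]])          # its target

# Forward pass
h = sigmoid(x @ W1)            # hidden activations (1 x 2)
y_hat = sigmoid(h @ W2)        # network output (1 x 1)
loss = 0.5 * np.sum((y_hat - y) ** 2)  # squared-error loss

# Backward pass (chain rule); sigmoid'(z) appears as a*(1-a)
d_out = (y_hat - y) * y_hat * (1 - y_hat)   # dLoss/d(output pre-activation)
dW2 = h.T @ d_out                           # gradient for W2
d_hidden = (d_out @ W2.T) * h * (1 - h)     # dLoss/d(hidden pre-activation)
dW1 = x.T @ d_hidden                        # gradient for W1

# Gradient descent update
lr = 0.5
W2 -= lr * dW2
W1 -= lr * dW1
print(f"loss before update: {loss:.4f}")
```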