Understanding the Gradient Descent Algorithm: The Simplest Way

According to Wikipedia, “Gradient descent is a first-order iterative optimization algorithm for finding local minima of a differentiable function.” Sounds like a lot, right? In this article, let’s get acquainted with the Gradient descent algorithm in the most straightforward (and ‘simplest’) way. Before we continue with understanding the ABCDs of Gradient descent (and dig into the…
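To make the definition concrete, here is a minimal sketch of the core update rule applied to f(x) = x²; the function, learning rate, and starting point are illustrative assumptions, not values from the article.

```python
# Minimal gradient descent on f(x) = x**2, whose gradient is 2*x.
# The learning rate and starting point are illustrative assumptions.

def gradient(x):
    return 2 * x  # df/dx for f(x) = x**2

x = 5.0              # arbitrary starting point
learning_rate = 0.1

for step in range(50):
    x -= learning_rate * gradient(x)  # the core update: x <- x - lr * grad

print(x)  # converges toward 0.0, the minimizer of f
```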

A step-by-step forward pass and backpropagation example

The neural network that we'll be solving in this article.

In this article, we’ll see a step-by-step forward pass (forward propagation) and backward pass (backpropagation) example. We’ll be taking a single-hidden-layer neural network and solving one complete cycle of forward propagation and backpropagation.
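For readers who want to check the hand-worked numbers against code, here is a hedged sketch of one such cycle in PyTorch; the layer sizes, input, and target below are illustrative assumptions, not the values used in the article.

```python
import torch

torch.manual_seed(0)

x = torch.tensor([[0.5, 0.3]])   # one sample with 2 features (illustrative)
y = torch.tensor([[1.0]])        # target output (illustrative)

hidden = torch.nn.Linear(2, 2)   # single hidden layer
output = torch.nn.Linear(2, 1)   # output layer

# Forward pass: input -> hidden -> sigmoid -> output -> sigmoid
h = torch.sigmoid(hidden(x))
y_hat = torch.sigmoid(output(h))
loss = torch.nn.functional.mse_loss(y_hat, y)

# Backward pass: autograd computes the same gradients that
# backpropagation derives by hand.
loss.backward()

print(hidden.weight.grad)  # dLoss/dW for the hidden layer
print(output.weight.grad)  # dLoss/dW for the output layer
```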

PyTorch and Tensors fundamentals

PyTorch is a deep learning framework that significantly simplifies writing and training deep neural networks. It supports a wide range of architectures, from shallow networks to deep ones like transformers. I mean, any neural network architecture you can think of. Tensors, in turn, are the fundamental data structures in PyTorch; they…
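As a quick taste of those data structures, here is a short, hedged example of creating tensors and inspecting their properties; the values are arbitrary.

```python
import torch

a = torch.tensor([[1.0, 2.0], [3.0, 4.0]])  # from a nested Python list
b = torch.zeros(2, 3)                        # 2x3 tensor of zeros
c = torch.randn(3)                           # 1-D tensor of random normals

print(a.shape, a.dtype)  # torch.Size([2, 2]) torch.float32
print(b.shape)           # torch.Size([2, 3])
print(c.ndim)            # 1
```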

Derivative of Sigmoid Function

Sigmoid Function

In this article, we’ll find the derivative of the Sigmoid Function. The Sigmoid Function is one of the non-linear functions used as activation functions in neural networks.
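For reference, the result the article derives step by step is the well-known identity:

```latex
\sigma(x) = \frac{1}{1 + e^{-x}}, \qquad
\sigma'(x) = \frac{e^{-x}}{\left(1 + e^{-x}\right)^{2}}
           = \sigma(x)\,\bigl(1 - \sigma(x)\bigr)
```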

Basic Operations on Tensors

In this article, we’ll see the basic operations (Addition, Broadcasting, Multiplication, Transpose, Inverse) that can be performed on tensors.
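Here is a compact, hedged preview of those five operations in PyTorch; the matrices are arbitrary examples, not the ones worked through in the article.

```python
import torch

A = torch.tensor([[1.0, 2.0], [3.0, 4.0]])
B = torch.tensor([[5.0, 6.0], [7.0, 8.0]])
v = torch.tensor([10.0, 20.0])

print(A + B)             # addition (elementwise)
print(A + v)             # broadcasting: v is stretched across A's rows
print(A * B)             # elementwise multiplication
print(A @ B)             # matrix multiplication
print(A.T)               # transpose
print(torch.inverse(A))  # inverse (A must be square and non-singular)
```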

Hydrating tweet IDs

You can check any tweet ID using this URL, with tweet_id replaced by the actual ID: https://twitter.com/check/status/tweet_id.

This article discusses in detail how you can extract complete Twitter data by hydrating tweet IDs.
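As one hedged example of hydration in practice (an assumption on my part; the article may use a different tool), the twarc library exposes a hydrate helper that turns a file of tweet IDs into full tweet objects. The credentials and ids.txt file below are placeholders you must supply yourself.

```python
from twarc import Twarc

# Placeholder credentials: substitute your own Twitter API keys.
t = Twarc(
    "CONSUMER_KEY",
    "CONSUMER_SECRET",
    "ACCESS_TOKEN",
    "ACCESS_TOKEN_SECRET",
)

# ids.txt holds one tweet ID per line; hydrate() yields full tweet objects.
with open("ids.txt") as ids:
    for tweet in t.hydrate(ids):
        print(tweet["id_str"], tweet["full_text"])
```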