
Intro to Generative Adversarial Networks | GANs 001

   GAN is made up of three terms: Generative, Adversarial, Network. Let's understand these three terms first. Generative: a generative model takes training samples from some distribution and learns to represent that distribution. Adversarial: this simply means conflicting or opposing. Network: these are neural networks. So, a Generative Adversarial Network is a deep neural network architecture comprising two neural networks that compete with each other to form a generative model. A GAN consists of two models. Discriminative model: the one that discriminates between two different classes of data; it tries to identify real data from the fakes created by the generator. Generative model: the generator turns noise into an imitation of the data to try to trick the discriminator. Mathematically, a generative model 'G' to be trained on training data 'X' sampled from some true distribution 'D' is the one which, given some standard random distrib...
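
As a rough sketch of this two-player setup (not code from the post), the snippet below pits a tiny generator against a tiny discriminator in PyTorch; the layer sizes, the 1-D Gaussian "real" data, and the training hyper-parameters are all arbitrary choices for illustration:

```python
import torch
import torch.nn as nn

# Generator: turns random noise into fake samples (here, single numbers).
generator = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 1))

# Discriminator: outputs the probability that a sample is real.
discriminator = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1), nn.Sigmoid())

opt_g = torch.optim.Adam(generator.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=1e-3)
bce = nn.BCELoss()

for step in range(2000):
    # "Real" data: a toy distribution (Gaussian with mean 4).
    real = torch.randn(64, 1) + 4.0
    noise = torch.randn(64, 8)
    fake = generator(noise)

    # Train the discriminator to tell real samples from generated ones.
    opt_d.zero_grad()
    loss_d = (bce(discriminator(real), torch.ones(64, 1))
              + bce(discriminator(fake.detach()), torch.zeros(64, 1)))
    loss_d.backward()
    opt_d.step()

    # Train the generator to fool the discriminator into predicting "real".
    opt_g.zero_grad()
    loss_g = bce(discriminator(fake), torch.ones(64, 1))
    loss_g.backward()
    opt_g.step()
```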

Linear Regression for Machine Learning | Intro On Linear Regression 2021

 What is Linear Regression?

This algorithm is used to find the relationship between two continuous variables (one independent variable and one dependent variable). It is a linear model which assumes a linear relationship between the input and output variables. If we have a single input variable we call it simple linear regression; if we have multiple input variables we call it multiple linear regression. It is both a statistical algorithm and a machine learning algorithm.
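
To make the simple vs. multiple distinction concrete, here is a small illustrative sketch using scikit-learn's LinearRegression (the toy numbers are made up for the example): with one input column we get simple linear regression, with several columns we get multiple linear regression.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Simple linear regression: one input variable.
X_simple = np.array([[1.0], [2.0], [3.0], [4.0]])   # single feature
y = np.array([3.1, 4.9, 7.2, 8.8])
simple_model = LinearRegression().fit(X_simple, y)
print(simple_model.coef_, simple_model.intercept_)   # one slope and one bias

# Multiple linear regression: several input variables.
X_multi = np.array([[1.0, 0.5], [2.0, 1.5], [3.0, 2.0], [4.0, 3.5]])
multi_model = LinearRegression().fit(X_multi, y)
print(multi_model.coef_, multi_model.intercept_)     # one weight per feature
```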

The Equation is 'Y = M * X + C'

Y = Dependent value (the output)

M = Slope/Weight

X = Independent value (the input)

C = Bias/Intercept

The core idea is to obtain a line that best fits the data. 'Y' is the output variable we want to predict, 'X' is the input variable, and M & C are the coefficients we need to estimate.

To find the values of M and C we have methods such as ordinary least squares (a statistical method) or gradient descent.
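
For example, the ordinary least squares estimates for a single input variable can be computed directly from the data. The sketch below uses toy numbers made up for illustration:

```python
import numpy as np

# Toy data for illustration.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.9, 5.1, 7.0, 9.2, 10.8])

# Ordinary least squares estimates for Y = M*X + C.
m = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
c = y.mean() - m * x.mean()
print("M =", m, "C =", c)
```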

How does it Work?

The goal is to find the best-fit line, i.e. the one that minimizes the error (the distance between the line and the data points). The values of M and C must be chosen so that they minimize this error. The algorithm tries multiple M and C values, calculates the error for each, and finally keeps the M and C that give the lowest error.

We can calculate the error in multiple ways, for example by using the mean squared error formula as the loss function. This also helps us evaluate the performance of the model.

Mean Squared Error = (1/n) * sum((yi - ŷi)^2)

n = total number of samples

yi = actual value

ŷi = predicted value
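
Putting these pieces together, here is a minimal sketch (toy data, assumed learning rate and iteration count) that uses mean squared error as the loss and gradient descent to search for the M and C with the lowest error:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.9, 5.1, 7.0, 9.2, 10.8])

def mse(y_true, y_pred):
    # Mean Squared Error = (1/n) * sum((yi - yi_hat)^2)
    return np.mean((y_true - y_pred) ** 2)

m, c = 0.0, 0.0   # start from arbitrary coefficients
lr = 0.01         # learning rate (assumed value)

for _ in range(5000):
    y_pred = m * x + c
    # Gradients of the MSE loss with respect to m and c.
    dm = -2 * np.mean(x * (y - y_pred))
    dc = -2 * np.mean(y - y_pred)
    m -= lr * dm
    c -= lr * dc

print("M =", m, "C =", c, "MSE =", mse(y, m * x + c))
```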

Advantages

  • Linear Regression is simple to implement, and the output coefficients are easy to interpret.
  • High performance on datasets where the relationship between input and output is (approximately) linear.
  • Linear Regression is susceptible to over-fitting, but this can be reduced using dimensionality reduction techniques (see the sketch below).
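
As one way to realise that last point, the sketch below reduces a redundant set of inputs with PCA before fitting the regression; the toy data, the number of components, and the pipeline itself are assumptions for illustration, not a recipe from the post:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

# Toy data: 50 samples with 10 correlated (partly redundant) features.
rng = np.random.default_rng(0)
base = rng.normal(size=(50, 3))
X = np.hstack([base, base @ rng.normal(size=(3, 7))])
y = base[:, 0] * 2.0 - base[:, 1] + rng.normal(scale=0.1, size=50)

# Reduce to a few components before fitting, to limit over-fitting.
model = make_pipeline(PCA(n_components=3), LinearRegression())
model.fit(X, y)
print("R^2 on training data:", model.score(X, y))
```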

 Disadvantages

  • Prone to under-fitting.
  • Sensitive to outliers.
  • Linear Regression assumes that the observations in the data are independent of each other, which may not hold in practice.

