In 1959, Arthur Samuel, an American pioneer in the field of computer gaming and artificial intelligence, coined the term Machine Learning. He defined it as the "field of study that gives computers the ability to learn without being explicitly programmed".

In 1997, Tom Mitchell gave a more formal, mathematical definition: "A computer program is said to learn from experience E with respect to some task T and some performance measure P, if its performance on T, as measured by P, improves with experience E."

In layman’s terms, consider that you are trying to toss a paper ball into a dustbin. In the first…


A complete overview of the NVIDIA Jetson Xavier NX, including description, specifications, price, pros and cons, and where to buy in India.

Despite being one of the smallest AI and deep learning supercomputers in the world, the NVIDIA Jetson Xavier NX is among the best-performing and most energy-efficient single-board computers. The Jetson Xavier NX delivers up to 21 Tera Operations Per Second (TOPS) on a 15W power supply, and around 14 TOPS at just 10W. Its ultra-small 70 x 45 mm footprint takes up little space but packs a 6-core NVIDIA Carmel ARM v8.2 64-bit CPU together with a 384-core NVIDIA Volta GPU and 48 Tensor Cores.

Key Specifications:

  • CPU: 6-core NVIDIA Carmel 64-bit ARMv8.2 @ 1400MHz* (6MB L2 + 4MB L3)

A complete overview of the NVIDIA Jetson Nano, including description, specifications, price, pros and cons, and where to buy in India.

The Jetson Nano is designed for AI enthusiasts, hobbyists and developers who want to build projects with AI. The NVIDIA Jetson Nano delivers enough computing performance to run multiple neural networks alongside other applications, such as object detection, segmentation, speech processing and image classification, and to process data from several high-resolution sensors.

NVIDIA Jetson Nano

Though its CPU cores are older and weaker than those of some other single-board computers (SBCs), the Jetson Nano has a much more capable GPU, with performance designed specifically for AI applications.

Jetson Nano is also the perfect tool to start learning about AI and robotics. It opens the world of embedded IoT applications…


Arch Linux doesn't come with aurman built in, so we have to install it manually. In this article we will first install aurman manually and then install a package using aurman.

AUR (Arch User Repository) is a community-driven repository for Arch Linux users, containing PKGBUILDs (package descriptions) that allow you to compile a package from source with makepkg and then install it via pacman.

How to install any package (for example, aurman) manually?

Step 1: Install these packages

Ensure that the base-devel package group is installed. git is also needed to download packages, and python-pip is required to install setuptools.

sudo pacman -S base-devel git python-pip

Step 2: Install setuptools

Install setuptools using pip.

pip install setuptools

Step 3: Download AUR package

Search for and download PKGBUILDs from the AUR web interface:

git clone https://aur.archlinux.org/aurman.git

Step 4: Install aurman

  • Change directory into aurman (the downloaded package)

cd aurman

This PKGBUILD can be built into installable packages using makepkg, then installed using…
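For reference, the usual makepkg workflow (the excerpt above is cut off before the exact command, so this is the standard invocation rather than a quote from the original) builds the package and installs it with pacman in one step:

makepkg -si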


Regularization is an important technique for avoiding overfitting of the training data, especially when the model's performance on the training and test data differs greatly.

Regularization

Regularization works by adding a “penalty” term to the RSS (residual sum of squares), which reduces the variance of the model on the test data.

Ridge Regression:

In ridge regression, the RSS is modified by adding a penalty proportional to the sum of the squares of the coefficients β.
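Written out (the tuning parameter λ is my notation, since the excerpt cuts off before the formula itself), the ridge objective takes the standard form:

\text{Ridge loss} = \text{RSS} + \lambda \sum_{j=1}^{p} \beta_j^{2} = \sum_{i=1}^{n} \Big( y_i - \beta_0 - \sum_{j=1}^{p} \beta_j x_{ij} \Big)^{2} + \lambda \sum_{j=1}^{p} \beta_j^{2}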

Suppose the following equation is the regression model.


How do I calculate accuracy for my regression model?

This is a common question from beginners working on a regression predictive modeling project. But accuracy is a measure for classification, not regression, so we cannot calculate accuracy for a regression model. The performance of a regression model is instead measured by the error in its predictions.

For example, if you are predicting the value of a house, you don’t need to know whether the model predicts the exact value; you want to know how close the predictions are to the expected values.
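As a quick sketch (the house prices and predictions below are made up purely for illustration), the size of the prediction errors can be summarized with metrics such as the mean absolute error and the root mean squared error:

import numpy as np

# Hypothetical house prices and model predictions (made-up numbers)
y_true = np.array([50.0, 75.0, 120.0, 95.0])
y_pred = np.array([48.0, 80.0, 110.0, 97.0])

mae = np.mean(np.abs(y_true - y_pred))           # mean absolute error
rmse = np.sqrt(np.mean((y_true - y_pred) ** 2))  # root mean squared error

print("MAE:", mae)    # average size of the errors
print("RMSE:", rmse)  # penalizes large errors more heavily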

There are three error metrics that are…


In the previous article, I wrote about Linear Regression, minimizing the error by choosing suitable coefficients, the Gradient Descent method, overdetermined systems of equations, etc. In this article, I am writing about Polynomial Regression and the other topics mentioned in the title.

This is an image of Linear Regression

Linear Regression

What is Polynomial Regression?

Polynomial regression is a form of regression in which the relationship between the independent variable x and the dependent variable y is modeled as an nᵗʰ degree polynomial in x.
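As a small illustration (the data below is invented and roughly follows y = 2 + 3x + 0.5x²), a polynomial model can be fitted with numpy.polyfit:

import numpy as np

# Made-up training data, roughly y = 2 + 3x + 0.5x^2 plus noise
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 5.4, 10.2, 15.9, 23.8, 30.2])

coeffs = np.polyfit(x, y, deg=2)  # coefficients, highest degree first
model = np.poly1d(coeffs)         # convenience wrapper for predictions

print(coeffs)      # close to [0.5, 3, 2]
print(model(2.5))  # prediction for a new x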


Suppose the equation of a machine learning model is:

Multiple Regression Model

where β₀, β₁, … are parameters, 1, x₁, x₂, … are features, and the curve is n-dimensional.
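The equation image itself does not survive in this excerpt, but from the description it presumably has the standard multiple-regression form:

y = \beta_0 \cdot 1 + \beta_1 x_1 + \beta_2 x_2 + \dots + \beta_n x_n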

For example, suppose the following table is the training dataset of a machine learning model, where x0, x1, x2, x3 are features and y is the result.


In this article you will learn about the concepts of maxima and minima, concave and convex functions, finding maxima and minima using the hill-climbing method, and the Gradient Descent algorithm.

Concept of Machine Learning | Optimization using the Gradient Descent Method to find the best values of the coefficients 😉

Gradient Descent Method

Gradient Descent is an optimization algorithm used in machine learning and deep learning to minimize a convex objective function f(x) by iteration. For a convex function, it finds the global minimum of the objective.
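A minimal sketch of the idea (the function f(x) = (x − 3)², the learning rate and the starting point are chosen arbitrarily for illustration): each step moves x a small amount against the gradient.

# Minimize f(x) = (x - 3)^2, a convex function whose global minimum is at x = 3
def f(x):
    return (x - 3) ** 2

def grad_f(x):
    return 2 * (x - 3)  # derivative of f

x = 0.0               # arbitrary starting point
learning_rate = 0.1   # step size
for _ in range(100):
    x = x - learning_rate * grad_f(x)  # step opposite to the gradient

print(x)  # converges close to 3, the minimizer of f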

Finding optima by gradient descent

Before starting, you should know the following mathematical concepts.

Concave and Convex Function


In this article, we will discuss Simple Linear Regression.

Concept of Machine Learning | Introduction to Machine Learning

What is Regression?

Regression is a statistical supervised learning technique for estimating the relationships between dependent and independent variables.

Dependent and Independent Variables:

Example

Here Y is the dependent variable and X₁, X₂, X₃, …, Xₙ are the independent variables. The dependent variable is also called the outcome variable or response variable, and an independent variable is also called a predictor variable or explanatory variable.

Linear Regression:

Linear Regression is a statistical supervised learning technique that predicts the dependent variable by forming a linear relationship with one or more independent variables.

Types of Linear Regression:

  1. Simple Linear Regression
  2. Multiple Linear Regression

1. Simple Linear Regression

Simple Linear Regression finds the linear relationship between two continuous variables, one independent and one dependent…
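As a rough sketch (the data below is invented for illustration), the slope and intercept of a simple linear model can be estimated with ordinary least squares:

import numpy as np

# Made-up data: one independent variable x and one dependent variable y
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.2, 4.1, 5.9, 8.2, 9.9])

# Ordinary least squares estimates of the slope (b1) and intercept (b0)
b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()

print(b0, b1)         # fitted line: y ≈ b0 + b1 * x
print(b0 + b1 * 6.0)  # prediction for a new x value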

Ujjwal

I am Ujjwal Kar. I have worked with various technologies and am interested in writing about IT. I write technical articles on Medium without the Partner Program.
