Machine Learning GG EZ: Perceptron Algorithm

For aspiring data scientists, the first algorithm we usually learn is the Perceptron, simply because of how simple and elegant it is at solving classification problems.

What is a classification problem? As you can probably guess, it's a problem of sorting things into two groups: true or false (this two-group case is called binary classification).
As an example, imagine you have a dataset that gives you the height and weight of each student in a class. A classification problem would be to determine which of them are male (male is true) or not male (male is false).

What? Is it actually possible to determine that? Yeah, for sure, as long as we have a dataset that is separable by a hyperplane (such data is called linearly separable).
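To make "separable by a hyperplane" concrete, here's a tiny sketch in pure Python. The numbers and the particular line are invented by me for illustration; the idea is just that some line puts all the "true" points on one side and all the "false" points on the other:

```python
# Toy sketch (numbers invented): (height cm, weight kg) per student,
# with label +1 for "male" and -1 otherwise.
data = [
    ((178, 75), +1), ((182, 80), +1), ((175, 70), +1),
    ((160, 52), -1), ((165, 55), -1), ((158, 50), -1),
]

def separates(w, b):
    """True if the line w[0]*height + w[1]*weight + b = 0 splits the labels:
    every point's label agrees with the side of the line it falls on."""
    return all(y * (w[0] * x[0] + w[1] * x[1] + b) > 0 for (x, y) in data)

# One hand-picked line that works for this toy set:
print(separates((1, 1), -240))  # → True
```

The whole job of the Perceptron is to find such a `(w, b)` automatically instead of hand-picking it.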

What is a hyperplane then? A hyperplane is a flat surface that splits a space into two halves.
Example time!
– If you have a 3D space: your hyperplane would be a 2D plane.
– If you have a 2D space: your hyperplane would be a 1D line.
Using the same logic, your hyperplane's dimension is basically your space's dimension – 1, right? BAM EZ
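One common way to write a hyperplane down (my notation, not from the post) is as the set of points x with w·x + b = 0, where w is a normal vector and b an offset; everything else lies strictly on one side or the other. A quick sketch:

```python
# A hyperplane in d dimensions: all points x with w·x + b = 0.
# It is (d-1)-dimensional: a line in 2D, a plane in 3D.

def side(w, b, x):
    """Which side of the hyperplane w·x + b = 0 the point x lies on:
    +1 on one side, -1 on the other, 0 exactly on the hyperplane."""
    s = sum(wi * xi for wi, xi in zip(w, x)) + b
    return (s > 0) - (s < 0)

# 2D example: the line x0 + x1 - 1 = 0
print(side((1, 1), -1, (2, 2)))  # → 1  (one side)
print(side((1, 1), -1, (0, 0)))  # → -1 (other side)
print(side((1, 1), -1, (1, 0)))  # → 0  (on the line itself)
```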
(figure omitted; images from openai.org)


With that said, we’ll focus now on the 2D problem, okay? 🙂

So Perceptron Algorithm: WTH is it?

It's basically a very simple and elegant machine learning algorithm: a linear binary classifier that is guaranteed to work whenever the data is linearly separable.

How does it work?

It creates a “decision boundary” from a weight vector. We then adjust that weight vector until the decision boundary really separates our data.

*A side note again: think of the decision boundary as the hyperplane that separates our two classes of data.

As you can see from the figure above, our big long white line (the boundary line) that separates the two data labels (O for true, and X for false) is perpendicular to our weight vector w (the golden line).
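That perpendicularity isn't magic: the boundary is the set of points x with w·x = 0, so any vector lying along the boundary has zero dot product with w. A small numeric check, with an arbitrary w of my choosing:

```python
# The decision boundary is {x : w·x = 0}, so w is normal to it.
w = (3, 4)

# Two points on the boundary (multiples of (4, -3), since 3*4 + 4*(-3) = 0):
p, q = (4, -3), (8, -6)
direction = (q[0] - p[0], q[1] - p[1])  # a vector along the boundary

print(w[0] * direction[0] + w[1] * direction[1])  # → 0, i.e. perpendicular
```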

So the question here changes to: how can we find the correct weight vector w, right?

Don’t worry, there’s an algorithm for that. And it’s quite a simple one as well.
Thy pseudo-code looks like this:
1. Start with whatever weight vector (w) you like. -> it defines your boundary line.
2. While there's some wrongly classified data, do this:
2a. Pick any misclassified data point.
2b. Update the weight vector (w).
The way you update your weight vector is also easy. Take the misclassified point and add it to your weight vector if its true label is positive, or subtract it if its label is negative (in other words, add label × point). 🙂
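The steps above can be sketched in a few lines of pure Python. This is a minimal illustration, assuming labels of +1 ("true") and -1 ("false") and data separable by a line through the origin, so there's no bias term here:

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def perceptron(points, labels, max_epochs=100):
    """Return a weight vector w that classifies every point correctly.
    Update rule: on a mistake, add (label * point) to w."""
    w = [0.0] * len(points[0])          # step 1: start with any w
    for _ in range(max_epochs):
        mistakes = 0
        for x, y in zip(points, labels):
            if y * dot(w, x) <= 0:      # step 2a: a misclassified point
                w = [wi + y * xi for wi, xi in zip(w, x)]  # step 2b
                mistakes += 1
        if mistakes == 0:               # nothing misclassified: done
            break
    return w

# Tiny usage example with made-up separable points:
pts = [(2, 1), (1, 2), (-1, -2), (-2, -1)]
ys = [+1, +1, -1, -1]
w = perceptron(pts, ys)
print(all(y * dot(w, x) > 0 for x, y in zip(pts, ys)))  # → True
```

Note the stopping condition: the loop only terminates early because separable data guarantees the mistakes eventually run out; on non-separable data it would just stop at `max_epochs`.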

And the visualization looks like this: (animation omitted)
