
Showing posts from October, 2021

Perceptron

The Perceptron is inspired by the information processing of a single neural cell called a neuron. A neuron accepts input signals via its dendrites, which pass the electrical signal down to the cell body. In a similar way, the Perceptron receives input signals from examples of training data that we weight and combine in a linear equation called the activation:

$activation = sum(weight_i * x_i) + bias$

The activation is then transformed into an output value or prediction using a transfer function, such as the step transfer function:

$prediction = 1.0$ if $activation >= 0.0$ else $0.0$

In this way, the Perceptron is a classification algorithm for problems with two classes (0 and 1) where a linear equation (i.e. a line or hyperplane) can be used to separate the two classes. It is closely related to linear regression and logistic regression, which make predictions in a similar way (i.e. as a weighted sum of inputs). The weights of the Perceptron algorithm must be estimated from your training data.
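The activation and step transfer function above can be sketched in a few lines of Python. This is a minimal illustration, not the post's own code: the function name `predict`, the convention of storing the bias as the first element of `weights`, and the example weight values are all assumptions made for demonstration.

```python
def predict(row, weights):
    """Perceptron prediction: weighted sum of inputs plus bias,
    passed through a step transfer function."""
    activation = weights[0]  # bias term (stored first by convention here)
    for i in range(len(row)):
        activation += weights[i + 1] * row[i]
    # Step transfer function: output 1.0 if activation >= 0.0, else 0.0
    return 1.0 if activation >= 0.0 else 0.0

# Hypothetical weights [bias, w1, w2] for a toy two-input problem
weights = [-0.1, 0.21, -0.23]
print(predict([2.78, 2.55], weights))  # prints 0.0
print(predict([7.63, 2.76], weights))  # prints 1.0
```

In practice the weights would be estimated from training data, for example with the Perceptron learning rule, rather than set by hand as in this toy example.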