Perceptron
The perceptron is one of the simplest artificial neurons and a foundational concept in neural networks, introduced by Frank Rosenblatt in 1958. It takes a set of input features, multiplies each by a learned weight, adds a bias, and passes the resulting linear combination through an activation function that produces a binary output. Because its decision boundary is a hyperplane, a single perceptron can only solve linearly separable problems; even so, it is the building block for more complex architectures such as multi-layer perceptrons (MLPs), and understanding it is a useful first step toward deep learning.
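The mechanics above can be sketched in a few lines of code. The following is a minimal illustration, not a production implementation: a perceptron with a step activation, trained with the classic perceptron learning rule on the AND function, which is linearly separable. The class and parameter names (`Perceptron`, `lr`, `epochs`) are chosen for this example.

```python
import numpy as np

class Perceptron:
    """Minimal perceptron: weighted sum of inputs plus bias, step activation."""

    def __init__(self, n_inputs, lr=0.1):
        self.w = np.zeros(n_inputs)  # one weight per input feature
        self.b = 0.0                 # bias term
        self.lr = lr                 # learning rate

    def predict(self, x):
        # Linear combination of inputs and weights, then step activation.
        return 1 if np.dot(self.w, x) + self.b > 0 else 0

    def fit(self, X, y, epochs=20):
        # Perceptron learning rule: shift the weights toward each
        # misclassified example; error is 0 when the prediction is correct.
        for _ in range(epochs):
            for xi, target in zip(X, y):
                error = target - self.predict(xi)
                self.w += self.lr * error * xi
                self.b += self.lr * error

# AND is linearly separable, so the perceptron is guaranteed to learn it.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])

p = Perceptron(n_inputs=2)
p.fit(X, y)
print([p.predict(xi) for xi in X])  # [0, 0, 0, 1]
```

Training on XOR instead would never converge, since no single hyperplane separates its classes; handling such problems is precisely what motivates stacking perceptrons into multi-layer networks.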