Perceptrons
Perceptrons are a class of binary classifiers in artificial neural networks, consisting of a single neuron with adjustable weights and a bias. They were introduced by Frank Rosenblatt in 1957 as a simplified model of biological neurons and an early approach to pattern recognition.
In operation, an input vector x ∈ R^n, weights w ∈ R^n, and bias b ∈ R are combined as t = w·x + b, and the output is y = 1 if t > 0 and y = 0 otherwise; that is, a Heaviside step activation is applied to the weighted sum.
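A minimal sketch of this forward computation (the function name perceptron_forward and the AND example are illustrative choices, not from the original text):

```python
import numpy as np

def perceptron_forward(x, w, b):
    """Perceptron output: y = 1 if w.x + b > 0, else 0 (Heaviside step)."""
    t = np.dot(w, x) + b      # weighted sum of inputs plus bias
    return 1 if t > 0 else 0  # threshold (step) activation

# Hand-chosen weights that realize logical AND as a linear boundary:
w, b = np.array([1.0, 1.0]), -1.5
for x in ([0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [1.0, 1.0]):
    print(x, perceptron_forward(np.array(x), w, b))  # prints 0, 0, 0, 1
```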
Training uses the perceptron learning rule. For each labeled example (x, d), where d is the desired output and y the current prediction, update the weights as Δw = η (d − y) x and the bias as Δb = η (d − y), where η > 0 is the learning rate. When the training data are linearly separable, this rule is guaranteed to converge after a finite number of updates.
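A minimal sketch of this training loop (the function name, η value, epoch count, and zero initialization are assumptions, not fixed by the text):

```python
import numpy as np

def train_perceptron(X, D, eta=0.1, epochs=50):
    """Perceptron learning rule: for each (x, d), w += eta*(d - y)*x, b += eta*(d - y)."""
    w = np.zeros(X.shape[1])  # weights start at zero (one common choice)
    b = 0.0                   # bias starts at zero
    for _ in range(epochs):
        errors = 0
        for x, d in zip(X, D):
            y = 1 if np.dot(w, x) + b > 0 else 0  # current prediction
            if y != d:
                w += eta * (d - y) * x  # Δw = η (d − y) x
                b += eta * (d - y)      # Δb = η (d − y)
                errors += 1
        if errors == 0:  # converged: every example classified correctly
            break
    return w, b

# Learning logical OR, which is linearly separable:
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
D = np.array([0, 1, 1, 1])
w, b = train_perceptron(X, D)
```

Note that updates occur only on misclassified examples, since (d − y) = 0 whenever the prediction is already correct.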
Limitations: A single perceptron cannot represent nonlinear decision boundaries; XOR and other problems that are not linearly separable are unsolvable by a single unit, a restriction analyzed in detail by Minsky and Papert (1969), as illustrated in the sketch below.
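The XOR failure can be seen both algebraically and empirically. In this self-contained sketch (reusing the update rule from above; all names are illustrative), the per-epoch error count cycles instead of reaching zero:

```python
import numpy as np

# XOR truth table: no hyperplane w.x + b can separate these labels.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
D = np.array([0, 1, 1, 0])

w, b, eta = np.zeros(2), 0.0, 0.1
for epoch in range(100):  # the perceptron rule cycles forever on XOR
    errors = 0
    for x, d in zip(X, D):
        y = 1 if np.dot(w, x) + b > 0 else 0
        if y != d:
            w += eta * (d - y) * x
            b += eta * (d - y)
            errors += 1
    # errors never reaches 0: XOR would require b <= 0, w1 + b > 0,
    # w2 + b > 0, and w1 + w2 + b <= 0, which are mutually contradictory.
print(errors)  # > 0 after any number of epochs
```

The comment spells out the contradiction: adding the two middle constraints gives w1 + w2 + 2b > 0, which together with b ≤ 0 forces w1 + w2 + b > 0, violating the fourth constraint.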
Legacy and variants: The perceptron is foundational in neural networks, motivating multilayer architectures and backpropagation. Variants include the voted and averaged perceptrons, the kernel perceptron, and multiclass extensions such as the structured perceptron.
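As one illustration, a minimal averaged-perceptron sketch (an assumed form of the variant, not a canonical implementation) keeps a running average of the weights visited during training, which tends to generalize better than the final iterate:

```python
import numpy as np

def averaged_perceptron(X, D, eta=0.1, epochs=10):
    """Averaged perceptron: return the mean of (w, b) over all update steps."""
    w, b = np.zeros(X.shape[1]), 0.0
    w_sum, b_sum, steps = np.zeros_like(w), 0.0, 0
    for _ in range(epochs):
        for x, d in zip(X, D):
            y = 1 if np.dot(w, x) + b > 0 else 0
            w += eta * (d - y) * x   # standard perceptron update
            b += eta * (d - y)
            w_sum += w               # accumulate weights for averaging
            b_sum += b
            steps += 1
    return w_sum / steps, b_sum / steps  # averaged weights and bias
```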
In contemporary practice, perceptrons are primarily of historical and educational value, illustrating supervised learning, linear classification, and iterative error-driven weight updates; the name also survives in "multilayer perceptron" (MLP), the standard term for fully connected feedforward networks.