Feedforward neural network
A feedforward neural network (FNN) is a class of artificial neural networks in which information propagates in one direction from input to output through a sequence of layers, with no cycles or loops. In a typical FNN, data enters at the input layer, passes through one or more hidden layers, and produces an output. Each neuron computes a weighted sum of its inputs, adds a bias, and applies a nonlinear activation function such as ReLU, sigmoid, or tanh. If all layers are densely connected, the network is called a fully connected or dense feedforward network; convolutional layers and other specialized layers may also be used within a feedforward architecture.
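The layer-by-layer computation described above can be sketched in a few lines of NumPy. This is a minimal illustration, not a production implementation; the 2-3-1 architecture and the specific weight values are arbitrary choices made for the example.

```python
import numpy as np

def relu(x):
    # Rectified linear unit: max(0, x) applied elementwise.
    return np.maximum(0.0, x)

def forward(x, layers):
    # Propagate an input vector through a list of (weights, bias) pairs.
    # Each layer computes a weighted sum plus a bias, then applies ReLU;
    # the final layer is left linear in this sketch.
    for i, (W, b) in enumerate(layers):
        x = W @ x + b
        if i < len(layers) - 1:
            x = relu(x)
    return x

# A tiny 2-3-1 network with hand-picked illustrative weights.
layers = [
    (np.array([[1.0, -1.0], [0.5, 0.5], [-1.0, 1.0]]), np.zeros(3)),
    (np.array([[1.0, 1.0, 1.0]]), np.zeros(1)),
]
y = forward(np.array([2.0, 1.0]), layers)
```

Because information only flows forward through the list of layers, the whole network is a fixed composition of functions, which is what makes the gradient computations during training straightforward.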
FNNs are typically trained using supervised learning. During training, a forward pass computes predictions, and a loss function measures the discrepancy between those predictions and the target values. The backpropagation algorithm then applies the chain rule to compute the gradient of the loss with respect to every weight and bias, and an optimizer such as stochastic gradient descent updates the parameters in the direction that reduces the loss.
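The forward pass, backpropagation, and gradient update can be shown end to end on a toy problem. The dataset (learning y = x1 + x2), the single hidden layer of four tanh units, the learning rate, and the step count below are all arbitrary choices for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: learn y = x1 + x2 from four samples.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = X.sum(axis=1, keepdims=True)

# One hidden layer of 4 tanh units, linear output.
W1 = rng.normal(0, 0.5, (2, 4)); b1 = np.zeros(4)
W2 = rng.normal(0, 0.5, (4, 1)); b2 = np.zeros(1)

lr = 0.1
losses = []
for step in range(500):
    # Forward pass.
    h = np.tanh(X @ W1 + b1)
    pred = h @ W2 + b2
    err = pred - y                       # gradient of 0.5 * squared error
    losses.append(float((err ** 2).mean()))
    # Backward pass: chain rule through each layer.
    gW2 = h.T @ err / len(X); gb2 = err.mean(axis=0)
    dh = err @ W2.T * (1 - h ** 2)       # tanh'(z) = 1 - tanh(z)^2
    gW1 = X.T @ dh / len(X); gb1 = dh.mean(axis=0)
    # Gradient descent update.
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2
```

Running the loop drives the mean squared error down over the 500 steps, which is the whole mechanism of supervised FNN training in miniature.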
The expressive power of FNNs is characterized by the universal approximation theorem, which states that a feedforward network with a single hidden layer containing finitely many neurons can approximate any continuous function on a compact subset of R^n to arbitrary accuracy, provided the activation function is suitably nonlinear. The theorem guarantees existence of such a network but says nothing about how many neurons are required or how to find the weights by training.
Limitations include difficulty modeling temporal or sequential data without architectural modifications, and potential overfitting or training instability when the network is large relative to the available data. Techniques such as weight regularization, dropout, early stopping, and careful weight initialization are commonly used to mitigate these problems.
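Of the mitigations mentioned above, dropout is easy to sketch. The snippet below shows the standard "inverted dropout" formulation; the drop probability of 0.5 and the use of a seeded NumPy generator are choices made for the example.

```python
import numpy as np

def dropout(h, p, rng, train=True):
    # Inverted dropout: during training, zero each activation with
    # probability p and rescale the survivors by 1/(1-p) so the
    # expected activation is unchanged; at inference, pass through.
    if not train or p == 0.0:
        return h
    mask = rng.random(h.shape) >= p
    return h * mask / (1.0 - p)

rng = np.random.default_rng(0)
h = np.ones(1000)
train_out = dropout(h, 0.5, rng, train=True)
eval_out = dropout(h, 0.5, rng, train=False)
```

Randomly silencing units during training discourages co-adaptation between neurons, which in practice reduces overfitting on small datasets.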
Common applications include classification, regression, function approximation, and pattern recognition across fields such as image and speech recognition, natural language processing, and scientific data analysis.