Korrelationsformer
Korrelationsformer is a family of neural network architectures inspired by the transformer, designed to explicitly model correlations among features and across time. Unlike standard transformers, which rely primarily on token-similarity attention, korrelationsformer architectures introduce a correlation module that computes pairwise and higher-order correlations from input representations and uses them to guide information flow.
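The core idea can be pictured with a minimal sketch (hypothetical function names; this is an illustration of correlation-guided information flow, not a reference implementation): compute a pairwise correlation matrix over token representations, normalize it into mixing weights, and let those weights determine how representations combine.

```python
import numpy as np

def correlation_guided_mixing(x):
    """Sketch of a correlation module: x has shape (tokens, features).

    1. Compute the pairwise correlation matrix between token rows.
    2. Convert correlations to nonnegative mixing weights via softmax.
    3. Mix token representations according to those weights.
    """
    corr = np.corrcoef(x)                          # shape: (tokens, tokens)
    # Row-wise softmax turns correlations into mixing weights.
    w = np.exp(corr - corr.max(axis=-1, keepdims=True))
    w = w / w.sum(axis=-1, keepdims=True)
    # Correlation-guided flow: each token becomes a correlation-weighted
    # average of all tokens.
    return w @ x

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
out = correlation_guided_mixing(x)
assert out.shape == x.shape
```

In a standard transformer this role is played by query-key dot products; here the mixing weights are derived directly from a correlation statistic of the representations themselves.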
In typical implementations, the architecture includes an input embedding layer, a correlation computation block, a correlation-aware attention or mixing layer, and a task-specific output head.
Training is performed end-to-end using supervised objectives for classification, regression, or forecasting, often with auxiliary losses that regularize the learned correlation structure.
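One plausible form for such an auxiliary loss (a hypothetical sketch; the exact objective varies between implementations) penalizes the distance between the model's learned correlation map and the empirical correlation matrix of the batch:

```python
import numpy as np

def auxiliary_correlation_loss(learned_corr, batch):
    """Hypothetical auxiliary loss: mean squared difference between the
    model's learned correlation map and the batch's empirical pairwise
    correlation matrix (both of shape (tokens, tokens))."""
    empirical = np.corrcoef(batch)
    return np.mean((learned_corr - empirical) ** 2)

rng = np.random.default_rng(1)
batch = rng.normal(size=(4, 8))
# A learned map that exactly matches the empirical one incurs zero loss.
zero_loss = auxiliary_correlation_loss(np.corrcoef(batch), batch)
assert abs(zero_loss) < 1e-12
# In training this would be combined with the supervised objective, e.g.
#   total_loss = task_loss + lam * auxiliary_correlation_loss(...)
```

The weighting coefficient (here `lam`) trades off task accuracy against how closely the learned correlations track the data's empirical structure.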
Benefits of korrelationsformer include improved performance when dependencies among features or time steps are strong, enhanced interpretability through inspectable correlation maps, and an explicit inductive bias toward dependency structure.
The concept has appeared in recent academic and industry discussions as a variant of transformer models that makes dependency structure an explicit architectural component rather than an emergent property of attention.