BERTbaserte
BERTbaserte is a Norwegian term used to describe language models that are based on the BERT framework. BERT stands for Bidirectional Encoder Representations from Transformers and denotes an approach in which a transformer encoder is pretrained on large amounts of unlabeled text to learn deep contextualized representations, attending to both the left and the right context of each token. BERTbaserte models reuse this architecture and pretraining setup, typically including masked language modeling and next sentence prediction, though variants may adjust the objectives or training procedure.
During pretraining, the model learns to predict masked tokens and to determine whether one sentence follows another in the original text.
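As a concrete illustration of the masked language modeling objective, the sketch below queries a BERT-based checkpoint through the Hugging Face transformers fill-mask pipeline. The checkpoint name NbAiLab/nb-bert-base and the example sentence are illustrative assumptions rather than part of the term's definition; any BERT-style checkpoint with a masked language modeling head would behave similarly.

    # A minimal sketch of masked-token prediction with a BERT-based model.
    # The checkpoint NbAiLab/nb-bert-base is used here only for illustration;
    # any BERT-style checkpoint with a masked language modeling head works.
    from transformers import pipeline

    unmasker = pipeline("fill-mask", model="NbAiLab/nb-bert-base")

    # The model scores vocabulary items for the [MASK] position, using both
    # the left and the right context of the masked token.
    for prediction in unmasker("Oslo er hovedstaden i [MASK]."):
        print(prediction["token_str"], round(prediction["score"], 3))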
The term covers a range of models, including the original BERT and its many derivatives and domain-adapted variants.
In practice, the label "BERTbasert" is commonly used in Norwegian NLP literature and documentation to indicate that a model or system builds on the BERT architecture or a closely related encoder.
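To show what this reuse typically looks like, the sketch below loads a pretrained BERT-based encoder and attaches a freshly initialized classification head, which is a common way such models are adapted to downstream tasks. The checkpoint name and the number of labels are placeholder assumptions, not requirements of the term.

    # A sketch of reusing a pretrained BERT-based encoder for a downstream task.
    # The checkpoint name and num_labels are placeholders, not fixed choices.
    from transformers import AutoTokenizer, AutoModelForSequenceClassification

    model_name = "NbAiLab/nb-bert-base"
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

    # Encode a sentence and run it through the encoder plus the new task head;
    # the head is randomly initialized and would normally be fine-tuned.
    inputs = tokenizer("Dette er en testsetning.", return_tensors="pt")
    outputs = model(**inputs)
    print(outputs.logits.shape)  # (1, num_labels)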