Selvsuperviserede
Selvsuperviserede (Danish for "self-supervised"), usually rendered in English as self-supervised learning, is a type of machine learning in which algorithms learn from data that has not been explicitly labeled. Instead of relying on human-provided labels, self-supervised methods generate their own supervisory signals from the inherent structure of the input data. This is typically achieved through "pretext tasks": a portion of the data is masked or altered, and the model is trained to predict the missing or original information.
For example, in natural language processing, a self-supervised model might be trained to predict a masked word from its surrounding context; this masked-language-modeling objective is the pretraining task behind models such as BERT.
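The masked-word pretext task described above can be sketched in miniature. The following is a toy illustration, not a real language model: it uses simple neighbor-word counts over a tiny invented corpus to show how the "labels" (the masked words themselves) come from the data rather than from human annotation. All names (`context_counts`, `predict_masked`) and the corpus are hypothetical.

```python
from collections import Counter, defaultdict

# Toy unlabeled corpus; a real self-supervised model would train on
# massive amounts of raw text instead.
corpus = [
    "the cat sat on the mat",
    "the dog sat on the rug",
    "the cat slept on the mat",
]

# Pretext task: every word serves as its own training label,
# derived from the data's structure, not from human annotation.
context_counts = defaultdict(Counter)
for sentence in corpus:
    words = sentence.split()
    for i, w in enumerate(words):
        left = words[i - 1] if i > 0 else "<s>"
        right = words[i + 1] if i < len(words) - 1 else "</s>"
        context_counts[(left, right)][w] += 1

def predict_masked(words, mask_index):
    """Predict the masked word from its immediate left/right neighbors."""
    left = words[mask_index - 1] if mask_index > 0 else "<s>"
    right = words[mask_index + 1] if mask_index < len(words) - 1 else "</s>"
    candidates = context_counts[(left, right)]
    return candidates.most_common(1)[0][0] if candidates else None

print(predict_masked("the cat sat on the [MASK]".split(), 5))  # → mat
```

Real masked language models replace these counts with a neural network trained by gradient descent, but the supervisory signal is constructed the same way: hide part of the input, then ask the model to reconstruct it.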