Prestraining
Prestraining is a training paradigm in machine learning in which a model undergoes a preparatory phase of training before the main training stage. The aim is to shape representations or initialize parameters in a way that facilitates subsequent learning, often by exposing the model to broad, diverse, or related tasks on unlabeled or weakly labeled data. The term is not universally standardized and is sometimes used interchangeably with pretraining or described as a specific form of curriculum-inspired preparation.
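The paradigm can be made concrete as two consecutive optimization phases: a preparatory phase that fits a proxy objective on unlabeled data, followed by a main stage that trains the actual task starting from the resulting parameters. The following minimal sketch uses PyTorch with a reconstruction proxy; the model shapes, learning rates, and synthetic random data are illustrative assumptions rather than a prescribed recipe.

    import torch
    import torch.nn as nn

    # Preparatory components: a generic encoder plus a decoder used only
    # during prestraining; the head is used only in the main stage.
    encoder = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 64))
    decoder = nn.Linear(64, 32)
    head = nn.Linear(64, 10)

    # Phase 1: prestrain on unlabeled data with a reconstruction proxy.
    opt = torch.optim.Adam(list(encoder.parameters()) + list(decoder.parameters()), lr=1e-3)
    for _ in range(100):
        x = torch.randn(16, 32)  # stand-in for unlabeled inputs
        loss = nn.functional.mse_loss(decoder(encoder(x)), x)
        opt.zero_grad()
        loss.backward()
        opt.step()

    # Phase 2: the main stage reuses the prestrained encoder as its
    # initialization and optimizes the actual task objective.
    opt = torch.optim.Adam(list(encoder.parameters()) + list(head.parameters()), lr=1e-4)
    for _ in range(100):
        x = torch.randn(16, 32)  # stand-in for labeled task inputs
        y = torch.randint(0, 10, (16,))  # stand-in for task labels
        loss = nn.functional.cross_entropy(head(encoder(x)), y)
        opt.zero_grad()
        loss.backward()
        opt.step()

The sketch only shows the control flow of the two phases; in practice the proxy objective, model, and duration of prestraining are chosen per domain.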
Typical methods in prestraining include self-supervised objectives such as masked reconstruction or contrastive learning, auxiliary proxy tasks, and supervised training on related or broader datasets.
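As one concrete instance, a contrastive objective in the InfoNCE family can be sketched as follows; the encoder, the additive-noise stand-in for data augmentation, and the temperature of 0.1 are assumptions made for illustration, not fixed choices.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    encoder = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 64))

    x = torch.randn(16, 32)  # a batch of unlabeled inputs
    view1 = x + 0.1 * torch.randn_like(x)  # two stochastic "views" of each input
    view2 = x + 0.1 * torch.randn_like(x)  # (a crude stand-in for augmentation)

    z1 = F.normalize(encoder(view1), dim=1)  # unit-norm embeddings
    z2 = F.normalize(encoder(view2), dim=1)

    # Similarities between all pairs, scaled by a temperature; matching
    # views sit on the diagonal and act as the positive pairs.
    logits = z1 @ z2.T / 0.1
    labels = torch.arange(len(x))
    loss = F.cross_entropy(logits, labels)
    loss.backward()

The objective pulls embeddings of the two views of the same input together while pushing apart embeddings of different inputs, without using any labels.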
Benefits of prestraining include improved sample efficiency, faster convergence, and enhanced generalization, particularly when downstream data are scarce or costly to label.
Prestraining is used across domains such as natural language processing, computer vision, and audio, often in combination with subsequent fine-tuning on task-specific labeled data.
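A hypothetical sketch of that combination: the prestrained encoder serves as initialization, part of it is optionally frozen, and a new task head is fine-tuned at a small learning rate. The commented-out checkpoint path and all shapes are placeholders, not part of any specific system.

    import torch
    import torch.nn as nn

    encoder = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 64))
    # encoder.load_state_dict(torch.load("prestrained_encoder.pt"))  # hypothetical checkpoint

    for p in encoder[0].parameters():  # optionally freeze the earliest layer
        p.requires_grad = False

    head = nn.Linear(64, 10)  # new task-specific head
    opt = torch.optim.Adam(
        [p for p in encoder.parameters() if p.requires_grad] + list(head.parameters()),
        lr=1e-4,  # small learning rate to preserve prestrained features
    )

    x = torch.randn(16, 32)  # stand-in for a labeled downstream batch
    y = torch.randint(0, 10, (16,))
    loss = nn.functional.cross_entropy(head(encoder(x)), y)
    opt.zero_grad()
    loss.backward()
    opt.step()

Freezing versus full fine-tuning is a trade-off: freezing preserves the prestrained representations and reduces compute, while updating all parameters can adapt them more closely to the downstream task.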