Retrieval loss
Retrieval loss refers to a family of objective functions used to train models for information retrieval tasks. It measures how well a model can retrieve relevant items in response to a query by encouraging higher similarity (or lower distance) between a query representation and its correct matches, while pushing non-relevant items apart. In neural information retrieval and retrieval-augmented systems, retrieval loss is commonly applied to learned representations, such as dense embeddings, and optimized with gradient-based methods.
Common forms of retrieval loss include contrastive losses (such as InfoNCE), which maximize the similarity between a query and its relevant (positive) item relative to a set of non-relevant (negative) items, typically via a softmax over candidate scores.
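To make the InfoNCE formulation concrete, the following is a minimal NumPy sketch. The function name `info_nce_loss` and the default temperature value are illustrative choices, not taken from any particular library; here each document in the batch serves as the positive for the query at the same index, with the rest acting as in-batch negatives.

```python
import numpy as np

def info_nce_loss(query_emb, doc_emb, temperature=0.07):
    """Illustrative InfoNCE loss: doc_emb[i] is the positive for
    query_emb[i]; all other documents in the batch act as negatives."""
    # L2-normalize so dot products become cosine similarities
    q = query_emb / np.linalg.norm(query_emb, axis=1, keepdims=True)
    d = doc_emb / np.linalg.norm(doc_emb, axis=1, keepdims=True)
    logits = q @ d.T / temperature  # (batch, batch) similarity matrix
    # Row-wise log-softmax; the diagonal entries are the positive pairs
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))
```

With perfectly aligned positives and orthogonal negatives the loss approaches zero; a uniform scoring model would sit at log(batch_size).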
Training setups often rely on in-batch negatives, negative mining strategies (hard or semi-hard negatives), and efficient similarity search over large candidate collections.
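Hard-negative mining can be sketched as selecting, for each query, the non-relevant documents the model currently scores highest. The helper below is an illustrative assumption (the name `mine_hard_negatives` and the square similarity-matrix layout, where document i is the positive for query i, are not from any specific library):

```python
import numpy as np

def mine_hard_negatives(sim, k=2):
    """Given a (num_queries, num_docs) similarity matrix where doc i is
    the positive for query i, return the indices of the k hardest
    negatives per query: the non-positives scored highest by the model."""
    masked = sim.copy().astype(float)
    np.fill_diagonal(masked, -np.inf)   # exclude the positive itself
    # Sort each row by descending score; hardest negatives come first
    return np.argsort(-masked, axis=1)[:, :k]
```

In practice the similarity matrix is usually computed over a large corpus with an approximate nearest-neighbor index rather than held in memory, but the selection logic is the same.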
Variants and challenges include selecting informative negatives, avoiding representation collapse, and the computational demands of large batch sizes and candidate corpora.