70B
70B commonly refers to the size of certain large language models, where "70B" stands for roughly 70 billion parameters. The label is applied to models in the 50–100 billion parameter range, indicating higher capacity than mid-sized models while remaining smaller than the largest model families.
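As a rough illustration of what the parameter count implies, the sketch below estimates the memory needed just to store a 70B model's weights at common numeric precisions. This is an approximation that ignores activations, key-value caches, and runtime overhead, which add to the total in practice:

```python
# Rough memory footprint of a 70B-parameter model's weights alone.
# Actual memory use is higher (activations, KV cache, framework overhead).
PARAMS = 70e9  # approximate parameter count

BYTES_PER_PARAM = {
    "float32": 4.0,  # full precision
    "float16": 2.0,  # half precision, common for inference
    "int8":    1.0,  # 8-bit quantization
    "int4":    0.5,  # 4-bit quantization
}

for precision, nbytes in BYTES_PER_PARAM.items():
    gib = PARAMS * nbytes / 2**30
    print(f"{precision}: ~{gib:.0f} GiB")
```

At 16-bit precision the weights alone come to roughly 130 GiB, which is why models of this class are typically served across multiple GPUs or quantized to lower precision.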
Models of this scale are typically based on the transformer architecture and are trained on large, diverse text corpora.
Notable examples include Meta's Llama 2 70B, among others in the same parameter range. These models can be fine-tuned for specialized tasks and, once quantized, can be served on a comparatively small number of high-end GPUs.
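As a minimal sketch, a model of this class can be loaded through the Hugging Face transformers library. This assumes access has been granted to the gated meta-llama/Llama-2-70b-hf checkpoint and that sufficient GPU memory is available; it is illustrative rather than a recommended deployment setup:

```python
# Minimal sketch: loading a 70B-class model with Hugging Face transformers.
# Assumes the gated "meta-llama/Llama-2-70b-hf" checkpoint is accessible and
# enough GPU memory is present; device_map="auto" shards the weights across
# all visible devices (requires the accelerate package).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Llama-2-70b-hf"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision: ~130 GiB of weights
    device_map="auto",
)

inputs = tokenizer("Large language models are", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```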
In AI model catalogs, 70B is one tier among several parameter-size classifications, alongside smaller tiers such as 7B and 13B and larger tiers exceeding 100 billion parameters.