LLaMA
LLaMA, short for Large Language Model Meta AI, is a family of transformer-based large language models developed by Meta AI. First released in February 2023, the LLaMA line was positioned as an efficient and accessible set of foundation models, initially licensed for research use, with later releases also permitting commercial use. Like other autoregressive language models, LLaMA generates text by repeatedly predicting the next token from the tokens that precede it in a given prompt, and it can be adapted for tasks such as text completion, summarization, translation, and question answering.
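As an illustration of this autoregressive usage, the following is a minimal sketch of loading a LLaMA-family checkpoint and generating a completion with the Hugging Face transformers library. The model identifier shown is illustrative rather than prescriptive, and the official meta-llama repositories are gated behind acceptance of Meta's license.

```python
# Minimal sketch: autoregressive text generation with a LLaMA-family
# checkpoint via the Hugging Face transformers library.
from transformers import AutoModelForCausalLM, AutoTokenizer

# Illustrative checkpoint name; official meta-llama repositories require
# accepting Meta's license before the weights can be downloaded.
model_id = "meta-llama/Llama-2-7b-hf"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "The llama is a domesticated South American camelid"
inputs = tokenizer(prompt, return_tensors="pt")

# Generate up to 50 new tokens, each predicted from the tokens before it.
output_ids = model.generate(**inputs, max_new_tokens=50, do_sample=False)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```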
The LLaMA-1 generation offered models of approximately 7B, 13B, 30B, and 65B parameters, trained on a mixture of publicly available text sources, including web crawl data, code, Wikipedia, books, and scientific articles, amounting to roughly 1 to 1.4 trillion tokens depending on model size.
Access to LLaMA weights initially followed a restricted, case-by-case approval model, limiting usage to researchers and organizations granted access by Meta. The weights were nonetheless leaked online shortly after release, and later models in the family were distributed under more permissive community licenses.
Since its release, LLaMA has been used in academia and industry for a range of natural language processing applications, and its weights have served as the starting point for many fine-tuned derivative models.