OPT
by Meta AI
Open Pretrained Transformer decoder-only language models from Meta AI
About
OPT (Open Pre-trained Transformers) is a suite of open-source decoder-only transformer language models released by Meta AI, ranging from 125 million to 175 billion parameters. Released in 2022 as a fully open alternative to GPT-3, OPT was notable for releasing not just the model weights but also the full training code, training logs, and logbooks — providing unprecedented transparency into how large language models are trained.
OPT-175B was the first openly released model at GPT-3's 175-billion-parameter scale, with weights made available to researchers on request, bringing research that was previously prohibitively expensive within reach of academic groups and smaller organizations. The full suite of models at different sizes enables ablation studies of how model capabilities scale with parameter count.
OPT has been superseded by Llama and other more capable open models, but remains historically significant as a milestone in open AI research and is still used in academic work studying language model behavior, scaling laws, and training dynamics.
Product Features
- Model suite from 125M to 175B parameters
- Full training code and methodology published
- Training logs and loss curves publicly available
- Decoder-only architecture similar to GPT-3
- Available on Hugging Face for research use
- Non-commercial research license
- Supports standard generation tasks: QA, summarization, completion
- Benchmark results published across academic evaluations
- Basis for research on scaling laws and model behavior
- Compatible with Hugging Face Transformers library
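Because the checkpoints are published on the Hugging Face Hub and supported by the Transformers library, loading an OPT model takes only a few lines. A minimal sketch using the smallest checkpoint, `facebook/opt-125m` (assumes `transformers` and `torch` are installed):

```python
# Sketch: text generation with an OPT checkpoint via Hugging Face Transformers.
# "facebook/opt-125m" is the smallest published model in the suite; the same
# code works for larger checkpoints given enough memory.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "facebook/opt-125m"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

prompt = "The capital of France is"
inputs = tokenizer(prompt, return_tensors="pt")

# Greedy decoding; sampling options (do_sample, top_p, temperature)
# work the same as for other decoder-only models such as GPT-2.
output_ids = model.generate(**inputs, max_new_tokens=10)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

Swapping `model_name` for a larger checkpoint (e.g. a multi-billion-parameter variant) changes nothing else in the code, which is what makes the suite convenient for scaling comparisons.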
About the Publisher
Meta AI (formerly FAIR — Facebook AI Research) is one of the world's premier AI research organizations, founded in 2013. Under the scientific leadership of Yann LeCun, Meta AI has contributed foundational work in deep learning, computer vision, NLP, and now generative AI. The OPT release reflected Meta AI's commitment to open research and reproducibility, providing the research community with access to large-scale language models that were previously only available through closed APIs.