OpenAI just dropped two powerful open-weight models, gpt-oss-120b and gpt-oss-20b, that anyone can download, run, and use commercially for free.
- 120B-parameter model that runs on a single 80GB GPU
- 20B-parameter model that fits in 16GB of VRAM, so it can run locally (see the quick-start sketch after this list)
- The 120B model's performance rivals o4-mini on math, MMLU, coding, and health benchmarks
- Mixture-of-Experts architecture keeps only ~5.1B params active per token on the 120B model (~3.6B on the 20B)
- Supports low, medium, and high reasoning-effort levels (see the second sketch after this list)
- Exposes its full chain-of-thought for transparency
- 128K token context window (great for long docs and retrieval tasks)
- Apache 2.0 license lets you use, modify, and monetize freely
- Safety-tested against bio, cyber, and misuse risks
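Want to try the 20B model locally? Here's a minimal quick-start sketch using the Hugging Face `transformers` chat pipeline. The repo id `openai/gpt-oss-20b`, the prompt text, and the generation settings are my assumptions, not part of the announcement, so adjust them for your setup.

```python
# Minimal sketch: run the 20B open-weight model locally with transformers.
# Assumes the weights are published as "openai/gpt-oss-20b" on Hugging Face
# and that you have a GPU with roughly 16GB of VRAM available.
from transformers import pipeline

pipe = pipeline(
    "text-generation",
    model="openai/gpt-oss-20b",   # assumed Hugging Face repo id
    torch_dtype="auto",           # let transformers pick the checkpoint dtype
    device_map="auto",            # place layers on the available GPU(s)
)

messages = [
    {"role": "user", "content": "Summarize mixture-of-experts in two sentences."},
]

result = pipe(messages, max_new_tokens=256)
# The pipeline returns the full chat, with the model's reply appended last.
print(result[0]["generated_text"][-1]["content"])
```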
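And a second hedged sketch for switching between reasoning-effort levels, reusing the `pipe` object from above. Putting "Reasoning: high" in the system message follows the convention described in the model card, but treat the exact wording as an assumption.

```python
# Hedged sketch: select reasoning effort via the system message.
# The "Reasoning: high" convention is assumed from the model card.
messages = [
    {"role": "system", "content": "Reasoning: high"},  # low | medium | high
    {"role": "user", "content": "Prove that the sum of two odd integers is even."},
]

result = pipe(messages, max_new_tokens=512)
print(result[0]["generated_text"][-1]["content"])
```

Higher effort trades latency for more thorough reasoning, so keep it on low or medium for routine tasks.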
This levels the playing field: real reasoning power is now open to everyone.