Mar '24 (edited) • 💬 General
Introducing DBRX: A New State-of-the-Art Open LLM
Introducing DBRX, an open, general-purpose LLM created by Databricks. Across a range of standard benchmarks, DBRX sets a new state-of-the-art for established open LLMs.
  • mixture-of-experts (MoE) architecture with 132B total parameters, of which 36B parameters are active on any input (a routing sketch follows this list)
  • trained on 12 trillion tokens (Llama 2 was trained on 2T)
  • maximum context length of 32k tokens
  • Llama-like license: use is restricted for products with more than 700 million users, and outputs cannot be used to train other models.
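For intuition on the total-vs-active parameter split in the MoE bullet above, here is a minimal, hypothetical sketch of top-k expert routing in PyTorch. The layer sizes are made up and this is not DBRX's code; only the 16-experts-choose-4 routing pattern reflects DBRX's reported fine-grained MoE configuration. Per token, only the selected experts run, so the parameters exercised for any single input are a fraction of the layer's total.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MoELayer(nn.Module):
    """Toy top-k mixture-of-experts layer; dimensions are illustrative, not DBRX's."""

    def __init__(self, d_model=512, d_ff=2048, n_experts=16, top_k=4):
        super().__init__()
        self.top_k = top_k
        self.router = nn.Linear(d_model, n_experts, bias=False)
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))
            for _ in range(n_experts)
        ])

    def forward(self, x):                        # x: (n_tokens, d_model)
        scores = self.router(x)                  # one score per expert, per token
        weights, picked = scores.topk(self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)     # normalize over the chosen experts only
        out = torch.zeros_like(x)
        for slot in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = picked[:, slot] == e      # tokens whose slot-th choice is expert e
                if mask.any():
                    w = weights[mask, slot].unsqueeze(-1)
                    out[mask] += w * expert(x[mask])
        return out

tokens = torch.randn(8, 512)                     # 8 tokens of width 512
print(MoELayer()(tokens).shape)                  # torch.Size([8, 512])
```

This routing is why the active parameter count (36B) is much smaller than the total (132B): the unchosen experts' weights exist in the model but are not used for that particular token.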