allenai/Bolmo-1B
Data used to train Bolmo, the first family of competitive fully open byte-level language models (LMs).
See our technical report for details: https://allenai.org/papers/bolmo.
| Name | Tokens | License |
|---|---|---|
| Common Crawl | 121.0B | ODC-BY |
| olmOCR Science PDFs | 19.9B | ODC-BY |
| StackEdu | 26.3B | ODC-BY |
| FineMath 3+ | 4.1B | ODC-BY |
| arXiv | 1.3B | ODC-BY |
| Wikipedia & Wikibooks | 64.6M | ODC-BY |
| Character Understanding | 75.5M | ODC-BY |
| Total | 172.7B | |
Bolmo models are trained for less than one epoch (~39.3B tokens) on this mix.
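As a quick sanity check, the per-source token counts in the table sum to the stated total, and the ~39.3B training tokens correspond to roughly a quarter of one epoch over the mix. A minimal sketch (counts taken from the table above):

```python
# Per-source token counts from the Bolmo Mix table, in billions of tokens.
sources = {
    "Common Crawl": 121.0,
    "olmOCR Science PDFs": 19.9,
    "StackEdu": 26.3,
    "FineMath 3+": 4.1,
    "arXiv": 1.3,
    "Wikipedia & Wikibooks": 0.0646,    # 64.6M tokens
    "Character Understanding": 0.0755,  # 75.5M tokens
}

total = sum(sources.values())
print(f"Total: {total:.1f}B tokens")  # matches the 172.7B Total row

# Fraction of one epoch covered by ~39.3B training tokens.
epoch_fraction = 39.3 / total
print(f"Epoch fraction: {epoch_fraction:.1%}")
```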
Bolmo Mix is licensed under the Open Data Commons Attribution License v1.0 (ODC-By). It is intended for research and educational use. For more information, please see our Responsible Use Guidelines.
Forthcoming!