Published on Nov 28, 2023
Abstract

AI-generated summary: Falcon-180B, with 180 billion parameters, achieves near-top-tier language model performance at reduced cost using custom distributed training code.
We introduce the Falcon series: 7B, 40B, and 180B parameter causal decoder-only models trained on diverse, high-quality corpora predominantly assembled from web data. The largest model, Falcon-180B, has been trained on over 3.5 trillion tokens of text, the largest openly documented pretraining run. Falcon-180B significantly outperforms models such as PaLM or Chinchilla, and improves upon concurrently developed models such as LLaMA 2 or Inflection-1. It nears the performance of PaLM-2-Large at reduced pretraining and inference cost, making it, to our knowledge, one of the three best language models in the world, along with GPT-4 and PaLM-2-Large. We report detailed evaluations, as well as a deep dive into the methods and custom tooling employed to pretrain Falcon. Notably, we report on our custom distributed training codebase, which allows us to efficiently pretrain these models on up to 4,096 A100s on AWS cloud infrastructure with limited interconnect. We release a 600B-token extract of our web dataset, as well as the Falcon-7/40/180B models, under a permissive license to foster open science and accelerate the development of an open ecosystem of large language models.
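To get a rough sense of the pretraining scale the abstract describes, one can apply the widely used ≈6·N·D approximation (about 6 FLOPs per parameter per training token). The sketch below is back-of-the-envelope arithmetic; the A100 peak throughput and utilization figures are assumptions for illustration, not values reported by the paper.

```python
# Rough compute estimate for Falcon-180B pretraining, using the common
# ~6 * N * D FLOPs approximation (N = parameters, D = training tokens).

N = 180e9   # parameters (Falcon-180B)
D = 3.5e12  # training tokens (per the abstract)
flops_total = 6 * N * D  # ~3.8e24 FLOPs

a100_bf16_peak = 312e12  # A100 peak dense BF16 FLOP/s (spec sheet figure)
mfu = 0.4                # assumed model FLOPs utilization, not a reported value
gpus = 4096              # per the abstract

seconds = flops_total / (gpus * a100_bf16_peak * mfu)
print(f"total FLOPs: {flops_total:.2e}")
print(f"estimated wall-clock: {seconds / 86400:.0f} days on {gpus} A100s at {mfu:.0%} MFU")
```

Under these assumed figures the run works out to roughly 3.8e24 FLOPs and on the order of three months of wall-clock time, which illustrates why efficient distributed training tooling matters at this scale.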
Models citing this paper (7)
- tiiuae/falcon-11B: Text Generation • 11B • Updated Dec 17, 2024 • 47.9k • 212
- LoneStriker/falcon-11B-GGUF: 11B • Updated May 13, 2024 • 84 • 3
- QuantFactory/falcon-11B-GGUF: 11B • Updated Sep 5, 2024 • 46 • 3
- vsevolodl/falcon-11B-GGUF: 11B • Updated May 13, 2024 • 112 • 1

Browse 7 models citing this paper
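The checkpoints listed above are standard Hugging Face models, so they can be loaded with the usual transformers API. A minimal generation sketch, assuming transformers and accelerate are installed and a GPU with enough memory for the 11B checkpoint is available:

```python
# Minimal text-generation sketch for one of the checkpoints listed above.
# The model ID comes from the list; dtype/device settings are assumptions.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "tiiuae/falcon-11B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # halves memory vs. fp32; assumes a recent GPU
    device_map="auto",           # requires the `accelerate` package
)

inputs = tokenizer("The Falcon series of language models", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```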
Datasets citing this paper (0)
No dataset links this paper. Cite arxiv.org/abs/2311.16867 in a dataset README.md to link it from this page.
Spaces citing this paper (9)

Collections including this paper (6)
Browse 6 collections that include this paper