The Meta Llama 3.1 collection includes pre-trained and instruction-tuned generative models in 8B, 70B, and 405B sizes. The instruction-tuned, text-only variants are optimized for multilingual dialogue use cases and outperform many open-source and closed chat models on common industry benchmarks.
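
As a minimal sketch of using one of the instruction-tuned variants, the snippet below loads the 8B model through the Hugging Face transformers text-generation pipeline and runs a single chat turn. It assumes you have access to the gated meta-llama/Llama-3.1-8B-Instruct checkpoint, a recent transformers release, and a GPU with enough memory; adjust the model ID to use the 70B or 405B sizes.

```python
# Sketch: one chat turn with the 8B instruction-tuned Llama 3.1 model.
# Assumes access to the gated meta-llama/Llama-3.1-8B-Instruct checkpoint.
import torch
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="meta-llama/Llama-3.1-8B-Instruct",
    torch_dtype=torch.bfloat16,  # halves memory vs. fp32
    device_map="auto",           # place layers on available GPUs/CPU
)

# Chat-style input; the pipeline applies the model's chat template.
messages = [
    {"role": "system", "content": "You are a helpful multilingual assistant."},
    {"role": "user", "content": "Summarize the benefits of instruction tuning in two sentences."},
]

output = generator(messages, max_new_tokens=128)
# The returned conversation includes the new assistant message last.
print(output[0]["generated_text"][-1]["content"])
```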