meta-llama/Llama-3.1-70B-Instruct
Text Generation · Concurrency Cost: 4 · Model Size: 70B · Quant: FP8 · Ctx Length: 32k · Published: Jul 16, 2024 · License: llama3.1 · Architecture: Transformer · Gated · Warm
The Meta Llama 3.1 70B Instruct model is a 70-billion-parameter, instruction-tuned, auto-regressive language model developed by Meta, built on an optimized transformer architecture. It is designed for multilingual dialogue, supports 8 languages, and natively offers a 128k-token context window. It excels at assistant-style chat, is intended for commercial and research use, and outperforms many open-source and closed chat models on common benchmarks.
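For dialogue use, Llama 3.1 Instruct expects messages wrapped in its published chat template. The sketch below hand-renders that template (`<|begin_of_text|>`, `<|start_header_id|>`, `<|eot_id|>` special tokens); in practice, `tokenizer.apply_chat_template` from the `transformers` library builds this string for you, and the helper name here is hypothetical.

```python
# Minimal sketch of the Llama 3.1 chat prompt format, based on Meta's
# published template. build_llama31_prompt is an illustrative helper,
# not part of any library.
def build_llama31_prompt(messages):
    """Render a list of {role, content} dicts into a Llama 3.1 prompt string."""
    prompt = "<|begin_of_text|>"
    for m in messages:
        prompt += (
            f"<|start_header_id|>{m['role']}<|end_header_id|>\n\n"
            f"{m['content']}<|eot_id|>"
        )
    # Open an assistant header so the model generates the reply next.
    prompt += "<|start_header_id|>assistant<|end_header_id|>\n\n"
    return prompt

messages = [
    {"role": "system", "content": "You are a helpful multilingual assistant."},
    {"role": "user", "content": "¿Cuál es la capital de Francia?"},
]
print(build_llama31_prompt(messages))
```

The same message list can be passed unmodified to an OpenAI-compatible chat endpoint, which applies the template server-side.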
Popular Sampler Settings
Top 3 parameter combinations used by Featherless users for this model. Tunable parameters: temperature, top_p, top_k, frequency_penalty, presence_penalty, repetition_penalty, min_p (specific values not captured here).
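As a rough sketch of where these sampler settings go, the snippet below builds a chat-completion request body for an OpenAI-compatible endpoint (which Featherless provides). The specific values are illustrative placeholders, not the unlisted user configs above; `top_k`, `repetition_penalty`, and `min_p` are non-standard OpenAI fields that only some compatible servers accept.

```python
import json

# Illustrative sampler settings for meta-llama/Llama-3.1-70B-Instruct.
# Values are placeholders, not measured Featherless user configs.
payload = {
    "model": "meta-llama/Llama-3.1-70B-Instruct",
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Summarize the Llama 3.1 release."},
    ],
    # Standard OpenAI-style sampling parameters.
    "temperature": 0.7,
    "top_p": 0.9,
    "frequency_penalty": 0.0,
    "presence_penalty": 0.0,
    # Extra fields; support varies by OpenAI-compatible server.
    "top_k": 40,
    "repetition_penalty": 1.1,
    "min_p": 0.05,
}

# This JSON body would be POSTed to the server's /v1/chat/completions route.
print(json.dumps(payload, indent=2))
```

Lower `temperature` and `top_p` make output more deterministic; the penalty parameters discourage repetition.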