openchat/openchat-3.6-8b-20240522
TEXT GENERATION
Concurrency Cost: 1
Model Size: 8B
Quant: FP8
Ctx Length: 8k
Published: May 7, 2024
License: llama3
Architecture: Transformer
0.2K Warm
openchat/openchat-3.6-8b-20240522 is an 8-billion-parameter instruction-tuned causal language model developed by OpenChat, built on the Llama 3 architecture with an 8192-token context window. The model is trained on mixed-quality data and is presented as a top-performing open-source 8B model, outperforming Llama-3-8B-Instruct on several benchmarks. It is designed for general chat, coding, and diverse language tasks, offering strong performance in a compact size.
Popular Sampler Settings
The top 3 parameter combinations used by Featherless users for this model.
temperature
top_p
top_k: –
frequency_penalty: –
presence_penalty: –
repetition_penalty: –
min_p: –
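The sampler parameters above map directly onto the fields of an OpenAI-style chat completion request, which is how such settings are typically supplied when calling a hosted model. The sketch below shows how a request body might be assembled; the parameter values are illustrative defaults chosen for this example, not the actual user configurations, which are not recorded in this page's text.

```python
# Sketch: assembling an OpenAI-style chat-completion payload with explicit
# sampler settings for openchat-3.6-8b-20240522. The numeric values are
# hypothetical defaults for illustration only.

def build_payload(prompt: str,
                  temperature: float = 0.7,
                  top_p: float = 0.9,
                  repetition_penalty: float = 1.1,
                  max_tokens: int = 512) -> dict:
    """Assemble a request body; caller POSTs it to a chat-completions endpoint."""
    return {
        "model": "openchat/openchat-3.6-8b-20240522",
        "messages": [{"role": "user", "content": prompt}],
        "temperature": temperature,                # randomness of sampling
        "top_p": top_p,                            # nucleus-sampling cutoff
        "repetition_penalty": repetition_penalty,  # discourages repeated text
        "max_tokens": max_tokens,                  # keep output within the 8k context
    }

payload = build_payload("Write a haiku about open models.")
```

Any of the listed sampler fields (top_k, frequency_penalty, presence_penalty, min_p) can be added to the dictionary in the same way when the serving endpoint supports them.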