Sao10K/L3.1-70B-Hanami-x1
Type: Text Generation (Open Weights)
Concurrency Cost: 4
Model Size: 70B
Quant: FP8
Context Length: 32k
Published: Sep 6, 2024
License: cc-by-nc-4.0
Architecture: Transformer
Sao10K/L3.1-70B-Hanami-x1 is a 70-billion-parameter language model based on the Llama 3.1 architecture, developed by Sao10K. It is an experimental refinement of the Euryale v2.2 series, with a distinct performance profile that its developer prefers over that line. The model is designed for general language tasks, pairing its large parameter count with a 32,768-token context window for robust text generation and understanding.
Popular Sampler Settings
The three most popular sampler-parameter combinations used by Featherless users for this model. Each configuration sets the following parameters (a usage sketch follows the list):
- temperature: scales the output distribution; higher values increase randomness
- top_p: nucleus sampling; restricts choices to the smallest token set whose cumulative probability exceeds p
- top_k: restricts choices to the k most likely tokens
- frequency_penalty: penalizes tokens in proportion to how often they have already appeared
- presence_penalty: applies a flat penalty to any token that has appeared at all
- repetition_penalty: multiplicative penalty on the logits of previously generated tokens
- min_p: discards tokens whose probability falls below min_p times the probability of the most likely token
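
As a concrete illustration, the sketch below passes a sampler configuration of this shape to the model through an OpenAI-compatible chat completions call. The base URL, API key placeholder, prompt, and all sampler values are illustrative assumptions, not one of the actual top-3 user configs; parameters outside the OpenAI schema (top_k, repetition_penalty, min_p) are sent via extra_body, and support for them depends on the serving backend.

```python
# Minimal sketch: querying Sao10K/L3.1-70B-Hanami-x1 with explicit sampler
# settings via an OpenAI-compatible client. The endpoint URL and every
# sampler value below are assumptions for illustration only.
from openai import OpenAI

client = OpenAI(
    base_url="https://api.featherless.ai/v1",  # assumed OpenAI-compatible endpoint
    api_key="YOUR_API_KEY",                    # placeholder credential
)

response = client.chat.completions.create(
    model="Sao10K/L3.1-70B-Hanami-x1",
    messages=[
        {"role": "user", "content": "Write a short scene set in a rainy city."}
    ],
    # Samplers in the standard OpenAI schema:
    temperature=1.0,
    top_p=0.9,
    frequency_penalty=0.0,
    presence_penalty=0.0,
    # Samplers outside the schema ride along in extra_body; whether the
    # backend honors them is implementation-dependent:
    extra_body={
        "top_k": 40,
        "repetition_penalty": 1.05,
        "min_p": 0.05,
    },
    max_tokens=512,
)

print(response.choices[0].message.content)
```

Note that temperature, top_p, frequency_penalty, and presence_penalty are part of the standard chat completions request, while the remaining three are engine-specific extensions; if the backend rejects unknown fields, drop the extra_body entries first.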