allura-org/Mistral-Small-24b-Sertraline-0304
Text Generation · Concurrency Cost: 2 · Model Size: 24B · Quant: FP8 · Ctx Length: 32k · License: apache-2.0 · Architecture: Transformer · Open Weights · Warm
allura-org/Mistral-Small-24b-Sertraline-0304 is a 24-billion-parameter instruction-tuned model based on the Mistral Small 3 architecture. It is fine-tuned for instruction following using the v7-Tekken instruct template, aims to deliver solid, "decent" performance on general AI-assistant tasks, and supports a context length of 32,768 tokens.
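A minimal sketch of querying this model through an OpenAI-compatible chat-completions endpoint. The base URL, the environment variable name, and the sampler values shown are assumptions for illustration, not settings taken from this page.

```python
# Sketch: calling the model via an OpenAI-compatible endpoint.
# base_url and FEATHERLESS_API_KEY are assumptions for illustration.
import os

from openai import OpenAI

client = OpenAI(
    base_url="https://api.featherless.ai/v1",   # assumed OpenAI-compatible endpoint
    api_key=os.environ["FEATHERLESS_API_KEY"],  # hypothetical env var name
)

response = client.chat.completions.create(
    model="allura-org/Mistral-Small-24b-Sertraline-0304",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Summarize this model card in one sentence."},
    ],
    max_tokens=512,   # stays well inside the 32,768-token context window
    temperature=0.7,  # illustrative value, not taken from this page
)

print(response.choices[0].message.content)
```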
Popular Sampler Settings
The three most popular parameter combinations used by Featherless users for this model.
temperature:
top_p:
top_k: –
frequency_penalty: –
presence_penalty: –
repetition_penalty:
min_p: –
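A sketch of how the sampler fields above could be sent in a raw chat-completions request. The endpoint, header scheme, and every numeric value are assumptions for illustration; the page does not list concrete values, and repetition_penalty and min_p are not part of the standard OpenAI schema, though many OpenAI-compatible servers accept them as extra top-level keys.

```python
# Sketch: passing the sampler parameters listed above in a raw request.
# Endpoint, auth header, and all numeric values are illustrative assumptions.
import os

import requests

payload = {
    "model": "allura-org/Mistral-Small-24b-Sertraline-0304",
    "messages": [{"role": "user", "content": "Hello!"}],
    "temperature": 0.8,          # illustrative value
    "top_p": 0.95,               # illustrative value
    # Non-standard fields, accepted by many OpenAI-compatible servers:
    "repetition_penalty": 1.05,  # illustrative value
    "min_p": 0.05,               # illustrative value
}

resp = requests.post(
    "https://api.featherless.ai/v1/chat/completions",  # assumed endpoint
    headers={"Authorization": f"Bearer {os.environ['FEATHERLESS_API_KEY']}"},
    json=payload,
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```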