Locutusque/OpenCerebrum-1.0-7b-SFT
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 8K · Published: Mar 26, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights
Locutusque/OpenCerebrum-1.0-7b-SFT is a 7-billion-parameter language model fine-tuned from alpindale/Mistral-7B-v0.2-hf. It was trained on 1.2 million examples across 14 diverse datasets to replicate the capabilities of AetherResearch's proprietary Cerebrum model, targeting coding, math, science, reasoning, and general instruction-following tasks.
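Since the weights are openly published, the model can also be run locally. Below is a minimal sketch using Hugging Face transformers; the prompt is plain text because this page does not document the chat template used during SFT, so check the repository's tokenizer config before relying on a specific format. The dtype and device settings are assumptions about available hardware.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Locutusque/OpenCerebrum-1.0-7b-SFT"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # assumption: an fp16-capable GPU is available
    device_map="auto",
)

# Plain-text prompt; the fine-tune's actual chat template is not shown on
# this page, so verify it in the model repo before production use.
prompt = "Explain the quadratic formula step by step."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```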
Popular Sampler Settings
Top 3 parameter combinations used by Featherless users for this model, covering the following samplers: temperature, top_p, top_k, frequency_penalty, presence_penalty, repetition_penalty, and min_p.
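As a sketch of how these samplers are passed in practice, the request below goes through an OpenAI-compatible chat completions endpoint. The base URL, API key placeholder, and all parameter values are illustrative assumptions, not one of the page's "top 3" configs; non-standard samplers (top_k, repetition_penalty, min_p) are forwarded via the SDK's extra_body pass-through.

```python
from openai import OpenAI

client = OpenAI(
    base_url="https://api.featherless.ai/v1",  # assumption: OpenAI-compatible endpoint
    api_key="YOUR_API_KEY",                    # placeholder
)

response = client.chat.completions.create(
    model="Locutusque/OpenCerebrum-1.0-7b-SFT",
    messages=[{"role": "user", "content": "Write a haiku about entropy."}],
    # Standard OpenAI sampler fields; values are illustrative, not a
    # published Featherless config.
    temperature=0.7,
    top_p=0.9,
    frequency_penalty=0.0,
    presence_penalty=0.0,
    # Samplers outside the OpenAI schema are passed through extra_body.
    extra_body={
        "top_k": 40,
        "repetition_penalty": 1.1,
        "min_p": 0.05,
    },
)
print(response.choices[0].message.content)
```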