Locutusque/OpenCerebrum-2.0-7B
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 8k · Published: Apr 13, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights · Cold
OpenCerebrum-2.0-7B is a 7 billion parameter open-source language model developed by Locutusque, fine-tuned from alpindale/Mistral-7B-v0.2-hf. It was trained with supervised fine-tuning (SFT) and direct preference optimization (DPO) on approximately 7,000 examples drawn from 15 diverse data sources, with the goal of replicating the capabilities of Aether Research's proprietary Cerebrum model. The model excels at coding, math, science, multi-turn conversation, RAG, reasoning, and general instruction following, and is intended as a powerful, broad-knowledge model for question answering and text generation.
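A minimal usage sketch with the Hugging Face transformers library follows. The prompt, sampling values, and hardware assumptions are illustrative only, and the sketch assumes the model's tokenizer ships a chat template; nothing here reflects Featherless-specific settings.

```python
# Minimal sketch: generating text from OpenCerebrum-2.0-7B with transformers.
# Assumes a GPU with enough memory for a 7B model and that the tokenizer
# provides a chat template; sampling values below are illustrative.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Locutusque/OpenCerebrum-2.0-7B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

# Build the prompt with the chat template from the tokenizer config.
messages = [{"role": "user", "content": "Explain gradient descent in two sentences."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(
    inputs,
    max_new_tokens=256,
    do_sample=True,
    temperature=0.7,
    top_p=0.9,
)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```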
Popular Sampler Settings
The top 3 parameter combinations used by Featherless users for this model tune the following sampler parameters (a request sketch showing how they are passed follows the list):
temperature
top_p
top_k
frequency_penalty
presence_penalty
repetition_penalty
min_p
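As a sketch of how these parameters map onto an inference request, the snippet below sends an OpenAI-compatible chat completion. The endpoint URL, API key variable, and every parameter value are assumptions for illustration rather than the actual Featherless presets, and non-standard fields such as top_k, repetition_penalty, and min_p are only honored if the serving backend supports them.

```python
# Illustrative sketch: passing the sampler parameters listed above in an
# OpenAI-compatible chat completion request. Endpoint, API key, and all
# values are assumptions; the real per-tab presets are not reproduced here.
import os
import requests

resp = requests.post(
    "https://api.featherless.ai/v1/chat/completions",  # assumed OpenAI-compatible endpoint
    headers={"Authorization": f"Bearer {os.environ['FEATHERLESS_API_KEY']}"},
    json={
        "model": "Locutusque/OpenCerebrum-2.0-7B",
        "messages": [{"role": "user", "content": "Summarize RAG in one paragraph."}],
        "temperature": 0.7,          # example value
        "top_p": 0.9,                # example value
        "frequency_penalty": 0.0,    # standard OpenAI field
        "presence_penalty": 0.0,     # standard OpenAI field
        "top_k": 40,                 # non-standard; depends on server support
        "repetition_penalty": 1.1,   # non-standard; depends on server support
        "min_p": 0.05,               # non-standard; depends on server support
        "max_tokens": 256,
    },
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```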