saucam/Nereus-7B
Task: Text Generation
Concurrency Cost: 1
Model Size: 7B
Quant: FP8
Context Length: 4k
Published: Apr 4, 2024
License: apache-2.0
Architecture: Transformer
Open Weights

Nereus-7B is a 7-billion-parameter language model developed by saucam, created by merging cognitivecomputations/dolphin-2.8-mistral-7b-v02 and NousResearch/Hermes-2-Pro-Mistral-7B. The model targets conversational tasks, code generation, and structured JSON output. With a 4096-token context length, it is suited to applications that need precise, formatted responses and robust dialogue capabilities.
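A minimal prompt-formatting sketch for using the model: since one of its parents, NousResearch/Hermes-2-Pro-Mistral-7B, uses the ChatML template, a reasonable assumption is that Nereus-7B accepts the same format (verify against the model's tokenizer configuration before relying on this). The helper below builds a ChatML-style prompt string; it does not require the model weights to run.

```python
def build_chatml_prompt(system: str, user: str) -> str:
    """Build a ChatML-style prompt (assumed format, inherited from the
    Hermes-2-Pro parent; confirm with the model's chat template)."""
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )

prompt = build_chatml_prompt(
    "You are a helpful assistant that replies in valid JSON.",
    "List three primary colors.",
)
print(prompt)
```

In practice the same string can be produced with `tokenizer.apply_chat_template(...)` from the `transformers` library, which reads the template shipped with the model and avoids hard-coding the format.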
