Radiantloom/radintloom-mistral-7b-fusion
Text generation · Concurrency cost: 1 · Model size: 7B · Quantization: FP8 · Context length: 4K · Published: Feb 19, 2024 · License: apache-2.0 · Architecture: Transformer

Radiantloom/radintloom-mistral-7b-fusion is a 7-billion-parameter large language model from Radiantloom AI, fine-tuned from a merge of Mistral-based models and supporting a 4,096-token context length. The model is geared toward creative writing, multi-turn conversation, retrieval-augmented generation (RAG), and coding, and tends to produce detailed, longer-form responses. It is reported to perform competitively against other open- and closed-source LLMs such as OpenHermes-2.5-Mistral-7B and Mistral Instruct v2.0, and its Apache 2.0 license permits commercial use.
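Since the model is a Mistral fine-tune intended for multi-turn conversation, prompts are typically built with a Mistral-style `[INST] ... [/INST]` chat template. The exact template used by radintloom-mistral-7b-fusion is an assumption here (verify against the model card or the tokenizer's `apply_chat_template`); the sketch below only shows the general prompt-formatting pattern:

```python
def build_mistral_prompt(turns):
    """Format a multi-turn conversation in the Mistral [INST] style.

    `turns` is a list of (user, assistant) pairs; the assistant entry
    of the final pair may be None when awaiting a new completion.
    NOTE: this template is an assumption, not confirmed for this model.
    """
    prompt = "<s>"
    for user, assistant in turns:
        prompt += f"[INST] {user} [/INST]"
        if assistant is not None:
            # Close each completed assistant turn with the EOS token.
            prompt += f" {assistant}</s>"
    return prompt

# Example: two user turns, the second awaiting a reply.
prompt = build_mistral_prompt([("Hi", "Hello!"), ("Write a haiku.", None)])
```

In practice, prefer the tokenizer's built-in chat template (if the repository ships one) over hand-rolled formatting, since template mismatches degrade instruction-following quality.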
