Arc53/docsgpt-7b-mistral
Task: Text generation
Model size: 7B
Quantization: FP8
Context length: 4k
Published: Dec 12, 2023
License: apache-2.0
Architecture: Transformer
Concurrency cost: 1

Arc53/docsgpt-7b-mistral is a 7 billion parameter language model, fine-tuned from Zephyr-7B-beta using LoRA and optimized specifically for documentation-based question answering. It excels at providing context-driven responses, making it well suited to developers and technical support teams. On the internal BACON test, the model shows strong hallucination reduction and attention-span performance, outperforming several larger models in this domain.
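To illustrate documentation-grounded question answering with this model, here is a minimal sketch that builds a context-plus-question prompt. The chat-marker format (`<|system|>`, `<|user|>`, `<|assistant|>`) is Zephyr-7B-beta's template, assumed here because the model was fine-tuned from that base; the system message wording is illustrative, so verify both against the model card before relying on them.

```python
# Sketch: documentation-grounded QA prompt for Arc53/docsgpt-7b-mistral.
# Assumption: the model follows Zephyr-7B-beta's chat template
# (<|system|>/<|user|>/<|assistant|> markers with </s> separators),
# since it was fine-tuned from that base.

def build_docsgpt_prompt(context: str, question: str) -> str:
    """Embed retrieved documentation in the prompt so answers stay grounded."""
    system = (
        "You are DocsGPT. Answer using only the provided documentation; "
        "if the answer is not in the context, say so."
    )
    return (
        f"<|system|>\n{system}</s>\n"
        f"<|user|>\nContext:\n{context}\n\nQuestion: {question}</s>\n"
        f"<|assistant|>\n"
    )

prompt = build_docsgpt_prompt(
    context="requests.get(url, timeout=5) raises Timeout after 5 seconds.",
    question="How do I set a timeout on a GET request?",
)

# Generation step (requires downloading the model weights):
# from transformers import pipeline
# pipe = pipeline("text-generation", model="Arc53/docsgpt-7b-mistral")
# print(pipe(prompt, max_new_tokens=256)[0]["generated_text"])
```

Keeping the retrieved documentation inside the user turn, rather than interleaved with the question, matches the context-driven usage the model is tuned for.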
