prince-canuma/Damysus-2.7B-Chat
Text generation · Concurrency cost: 1 · Model size: 3B · Quant: BF16 · Context length: 2k · Published: Feb 11, 2024 · License: MIT · Architecture: Transformer · Open weights

Damysus-2.7B-Chat is an instruction-tuned Transformer model with 2.7 billion parameters, developed by Prince Canuma and fine-tuned from Microsoft's Phi-2. This model is enhanced to better follow specific user instructions, excelling in tasks like question answering, data extraction, structured outputs (e.g., JSON), and providing explanations. It is primarily designed for building local or cloud RAG applications, serving as an answer synthesizer, summarizer, or query rewriter.
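As a quick sketch of how a prompt for this model might be assembled, the helper below formats a chat history into a ChatML-style string. The `<|im_start|>`/`<|im_end|>` delimiters are an assumption based on common Phi-2 chat fine-tunes, and `build_chatml_prompt` is a hypothetical helper; for real use, prefer the tokenizer's own `apply_chat_template` from the Hugging Face `transformers` library, which reads the authoritative template from the model repo.

```python
def build_chatml_prompt(messages):
    """Format a list of {"role", "content"} dicts into a ChatML-style
    prompt string. The <|im_start|>/<|im_end|> tokens are an assumption;
    verify against the model's actual chat template."""
    parts = []
    for m in messages:
        parts.append(f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>")
    # Open an assistant turn to cue the model to generate its answer.
    parts.append("<|im_start|>assistant\n")
    return "\n".join(parts)

messages = [
    {"role": "system", "content": "You are a helpful RAG answer synthesizer."},
    {"role": "user", "content": "Summarize the retrieved passage in one sentence."},
]
prompt = build_chatml_prompt(messages)
```

In a RAG pipeline, the retrieved passages would typically be injected into the system or user message before the prompt is built.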
