Danielbrdz/Barcenas-Orca-2-7b
Task: Text generation · Model size: 7B · Quantization: FP8 · Context length: 4k · Concurrency cost: 1 · Published: Feb 2, 2024 · License: microsoft-research-license · Architecture: Transformer
Danielbrdz/Barcenas-Orca-2-7b is a 7-billion-parameter language model published by Danielbrdz, built on Microsoft's Orca 2. It was fine-tuned on the HuggingFaceH4/no_robots dataset to strengthen its natural-conversation abilities. The model aims for improved conversational fluency and responsiveness, making it suitable for dialogue-oriented applications.
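Since the card emphasizes dialogue use, a minimal sketch of prompting follows. Microsoft's Orca 2 models use a ChatML-style template with `<|im_start|>`/`<|im_end|>` markers; this assumes the fine-tune keeps the base model's template, and the `build_prompt` helper is illustrative rather than part of the model card.

```python
# Sketch: formatting a single-turn conversation in the ChatML-style
# template used by Orca 2 base models. Assumption: this fine-tune
# keeps the same <|im_start|>/<|im_end|> markers.

def build_prompt(system: str, user: str) -> str:
    """Return a prompt string ending at the assistant turn, ready for generation."""
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )

prompt = build_prompt(
    "You are a helpful assistant.",
    "Summarize what makes Orca 2 different in one sentence.",
)
print(prompt)
```

In practice, the string returned by `build_prompt` would be tokenized and passed to the model; newer versions of Hugging Face Transformers can also apply the model's own chat template via the tokenizer instead of hand-building the string.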