Danielbrdz/Barcenas-10.7b
Text generation
Concurrency cost: 1
Model size: 10.7B
Quant: FP8
Context length: 4k
Published: Jan 16, 2024
License: apache-2.0
Architecture: Transformer
Open weights · Cold
Danielbrdz/Barcenas-10.7b is a 10.7-billion-parameter language model fine-tuned from NousResearch/Nous-Hermes-2-SOLAR-10.7B. It was trained on the HuggingFaceH4/no_robots dataset, which consists of 10,000 human-annotated instructions and demonstrations. The model is suited to instruction following and generating human-like responses for tasks such as conversational text generation, summarization, and creative writing.
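As a sketch of how an instruction-tuned model in this family is typically prompted: Nous-Hermes-2-SOLAR-10.7B uses the ChatML conversation format, and it is assumed here (not confirmed by this card) that Barcenas-10.7b inherits it. The helper below only assembles the prompt string; generation itself would go through your inference client of choice.

```python
def build_chatml_prompt(system: str, user: str) -> str:
    """Assemble a ChatML-style prompt (assumed format for this model family)."""
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )

prompt = build_chatml_prompt(
    "You are a helpful assistant.",
    "Summarize the plot of Don Quixote in two sentences.",
)
print(prompt)
```

Ending the prompt right after the assistant header cues the model to produce the assistant turn; the serving layer should stop generation at the `<|im_end|>` token.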