HenryJJ/Instruct_Mistral-7B-v0.1_Dolly15K
Text generation · Model size: 7B · Quantization: FP8 · Context length: 4K · Concurrency cost: 1 · Published: Jan 2, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights

Instruct_Mistral-7B-v0.1_Dolly15K is a 7-billion-parameter instruction-tuned causal language model developed by HenryJJ. Fine-tuned from the Mistral-7B-v0.1 base model on the Dolly15K dataset, it is designed for general English instruction-following tasks. The model was trained for 2.0 epochs with a 1024-token training context window, making it suitable for a range of conversational and generative applications.
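
For reference, a minimal usage sketch with Hugging Face transformers. It assumes the weights are published on the Hub under the repo id shown above, and it uses a common Dolly-style instruction/response prompt template; the exact template used during fine-tuning is not documented here.

```python
# Minimal sketch: load the model and generate a response to an instruction.
# Assumptions: weights hosted under this repo id; Dolly-style prompt format.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "HenryJJ/Instruct_Mistral-7B-v0.1_Dolly15K"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # fp16 fits a 7B model on a single 24 GB GPU
    device_map="auto",          # requires the `accelerate` package
)

prompt = (
    "### Instruction:\n"
    "Explain what instruction tuning does in one paragraph.\n\n"
    "### Response:\n"
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=256, do_sample=True, temperature=0.7)
# Decode only the newly generated tokens, skipping the echoed prompt.
print(tokenizer.decode(output[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```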
