EleutherAI/Mistral-7B-v0.1-addition-first-ft
Text generation · Concurrency cost: 1 · Model size: 7B · Quant: FP8 · Context length: 4k · Published: Mar 15, 2024 · Architecture: Transformer

EleutherAI/Mistral-7B-v0.1-addition-first-ft is a 7-billion-parameter language model from EleutherAI, fine-tuned from the Mistral-7B-v0.1 base model. It has a context window of 4096 tokens. The available model card does not specify the model's primary differentiators, training procedure, or intended use cases.
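Because the context window is 4096 tokens, callers generally need to budget prompt length against expected generation length. The sketch below is illustrative only: the `reserve_for_output` value is an assumption, not something stated in the model card, and the function operates on an already-tokenized prompt from whatever tokenizer the model uses.

```python
def truncate_to_context(token_ids, max_len=4096, reserve_for_output=256):
    """Keep the most recent tokens so that the prompt plus the
    generated continuation fit in the model's context window.

    max_len matches the 4096-token context length from the model card;
    reserve_for_output is an illustrative assumption for how many
    tokens to leave free for generation.
    """
    budget = max_len - reserve_for_output
    # Keep the tail of the prompt: recent tokens usually matter most
    # for causal language models.
    return token_ids[-budget:]


# Example: a 5000-token prompt is trimmed to the 3840-token budget;
# a short prompt passes through unchanged.
long_prompt = list(range(5000))
short_prompt = [1, 2, 3]
trimmed = truncate_to_context(long_prompt)
```

A real client would pair this with the model's own tokenizer so that the token count matches what the server sees; character- or word-based estimates can overrun the 4k window.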
