EleutherAI/Mistral-7B-v0.1-population-first-ft
Text generation · Concurrency cost: 1 · Model size: 7B · Quantization: FP8 · Context length: 4k · Published: Mar 15, 2024 · Architecture: Transformer
EleutherAI/Mistral-7B-v0.1-population-first-ft is a 7-billion-parameter language model based on the Mistral architecture. It is a fine-tuned variant, though the available documentation does not specify its training data, fine-tuning objective, or how it differs from the base model. It is intended for general text generation; its specific strengths and optimal use cases require further evaluation.
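Because the model exposes a 4k-token context window, a prompt whose length plus the requested completion budget exceeds that window will be truncated or rejected. A minimal client-side budget check is sketched below; the ~4-characters-per-token ratio is a crude heuristic (an assumption, not this model's tokenizer), so use the actual tokenizer when available:

```python
# Rough client-side check that a prompt plus requested completion fits the
# model's 4k-token context window. The 4-chars-per-token ratio is a crude
# heuristic, not the model's real tokenizer.
CONTEXT_LENGTH = 4096   # "Context length: 4k" from the model card
CHARS_PER_TOKEN = 4     # heuristic assumption, not a measured value

def estimate_tokens(text: str) -> int:
    """Very rough token-count estimate from character length."""
    return max(1, len(text) // CHARS_PER_TOKEN)

def fits_context(prompt: str, max_new_tokens: int = 256) -> bool:
    """True if the estimated prompt tokens plus the completion budget
    stay inside the context window."""
    return estimate_tokens(prompt) + max_new_tokens <= CONTEXT_LENGTH

print(fits_context("Hello, world!"))                   # short prompt fits
print(fits_context("x" * 20000, max_new_tokens=512))   # ~5000 tokens, too long
```

A production client would replace `estimate_tokens` with a real tokenizer count, since character-based estimates can be badly off for code or non-English text.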