zorobin/mistral-class-shishya-all-hal-7b-ep3
Text generation · Concurrency cost: 1 · Model size: 7B · Quant: FP8 · Context length: 4k · Published: Jan 28, 2024 · License: llama2 · Architecture: Transformer · Open weights

zorobin/mistral-class-shishya-all-hal-7b-ep3 is a 7-billion-parameter language model, likely based on the Mistral architecture, with a context length of 4096 tokens. The name suggests a fine-tuned checkpoint, with the 'ep3' suffix most plausibly denoting the third training epoch of an iterative or experimental release. Its specific differentiators and primary use cases are not documented: the accompanying model card reads 'More Information Needed' for most sections.
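Since the card documents no usage instructions, the snippet below is a minimal sketch of how such a Hub-hosted causal LM is typically loaded with Hugging Face `transformers`. The Mistral-style `[INST] … [/INST]` prompt wrapper is an assumption (the card does not document an expected prompt format), and the repo id is taken from the title above.

```python
MODEL_ID = "zorobin/mistral-class-shishya-all-hal-7b-ep3"
MAX_CTX = 4096  # context length stated in the model metadata


def build_prompt(instruction: str) -> str:
    # Assumed Mistral-instruct prompt format; the model card does not
    # document the template this fine-tune was actually trained with.
    return f"[INST] {instruction} [/INST]"


def generate(instruction: str, max_new_tokens: int = 256) -> str:
    # Imported lazily so the prompt helper above stays usable even
    # without transformers/torch installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")

    inputs = tokenizer(build_prompt(instruction), return_tensors="pt").to(model.device)
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)

    # Decode only the newly generated tokens, skipping the echoed prompt.
    new_tokens = output_ids[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)
```

Note that a 7B model at FP8 still needs roughly 7–8 GB of accelerator memory; `device_map="auto"` lets `transformers` place the weights across available devices.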
