zorobin/mistral-class-shishya-7b-ep3
Text Generation | Concurrency Cost: 1 | Model Size: 7B | Quant: FP8 | Ctx Length: 4k | Published: Jan 28, 2024 | License: llama2 | Architecture: Transformer | Open Weights | Cold
zorobin/mistral-class-shishya-7b-ep3 is a 7-billion-parameter language model with a 4096-token context length, likely based on the Mistral architecture, as the "mistral-class" name suggests. The "ep3" suffix most likely marks a checkpoint from the third epoch of a fine-tuning run. The listing does not describe the model's differentiators or intended use cases, which suggests a general-purpose fine-tune rather than a task-specialized model.
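As a rough illustration of what the listed specs imply, the weight-storage footprint of a 7B model depends directly on the quantization format: FP8 stores one byte per parameter, so the weights alone occupy about 7 GB, versus roughly 14 GB at FP16. The sketch below is a back-of-envelope estimate only (it ignores activations, KV cache, and runtime overhead):

```python
# Back-of-envelope weight-memory estimate for a model of a given size.
# Bytes per parameter by storage format (FP8 = 1 byte, FP16 = 2, FP32 = 4).
BYTES_PER_PARAM = {"fp8": 1, "fp16": 2, "fp32": 4}

def weight_memory_gb(n_params: float, dtype: str) -> float:
    """Approximate weight storage in gigabytes (1 GB = 1e9 bytes)."""
    return n_params * BYTES_PER_PARAM[dtype] / 1e9

# For this model: 7B parameters quantized to FP8, as listed above.
print(weight_memory_gb(7e9, "fp8"))   # 7.0 GB
print(weight_memory_gb(7e9, "fp16"))  # 14.0 GB if kept at half precision
```

Actual serving memory will be higher than this, since the KV cache grows with the 4k context length and batch size.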