namirocks/mistral-class-tutor-7b-ep3
Task: Text generation
Concurrency cost: 1
Model size: 7B
Quantization: FP8
Context length: 4k
Published: Jan 28, 2024
License: llama2
Architecture: Transformer (open weights)
namirocks/mistral-class-tutor-7b-ep3 is a 7-billion-parameter language model published by namirocks, likely based on the Mistral architecture; the name suggests a tutoring-oriented fine-tune saved at epoch 3, though the documentation does not confirm this or describe its intended use cases. At 7B parameters, it can run on moderate hardware while still offering solid language understanding and generation quality.
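Assuming the checkpoint is hosted on the Hugging Face Hub under the same repository id (an assumption; the listing above does not state where the weights live), a minimal sketch of loading and prompting it with the `transformers` library might look like:

```python
# Sketch: loading namirocks/mistral-class-tutor-7b-ep3 via Hugging Face
# transformers. The Hub repo id is assumed from the model name above; verify
# it exists before running. A 7B model typically needs a GPU with roughly
# 8-16 GB of memory depending on precision.

MODEL_ID = "namirocks/mistral-class-tutor-7b-ep3"  # assumed Hub repo id
MAX_CONTEXT = 4096  # 4k context length, per the listing above


def generate(prompt: str, max_new_tokens: int = 256) -> str:
    """Download the model (on first call) and generate a completion."""
    # Heavy dependencies are imported lazily so the module loads without them.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    inputs = tokenizer(
        prompt, return_tensors="pt", truncation=True, max_length=MAX_CONTEXT
    ).to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)


if __name__ == "__main__":
    print(generate("Explain photosynthesis to a high-school student."))
```

This is a generic causal-LM loading pattern, not documentation of how the author serves the model; the FP8 quantization shown in the listing would normally be handled by the serving stack rather than by this plain `from_pretrained` call.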