ik28/MedMistral-instruct
Type: Text Generation
Concurrency Cost: 1
Model Size: 7B
Quantization: FP8
Context Length: 4k
Published: May 24, 2024
License: apache-2.0
Architecture: Transformer
Tags: Open Weights, Cold
ik28/MedMistral-instruct is a 7-billion-parameter instruction-tuned causal language model based on the Mistral architecture, with a context length of 4096 tokens. It is suited to general language generation tasks that require instruction following; the name suggests medical-domain fine-tuning, though no specific medical optimization is documented here.
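Since the model is instruction-tuned on the Mistral architecture, prompts are typically wrapped in Mistral's `[INST] ... [/INST]` delimiters before generation. A minimal sketch of that formatting step (whether this particular fine-tune keeps the standard Mistral-instruct chat template is an assumption; check the model's tokenizer configuration to confirm):

```python
def build_mistral_prompt(instruction: str) -> str:
    """Wrap a user instruction in Mistral's [INST] tags.

    Assumes the model follows the standard Mistral-instruct chat
    template. The BOS token (<s>) is omitted here because the
    tokenizer normally prepends it during encoding.
    """
    return f"[INST] {instruction.strip()} [/INST]"

prompt = build_mistral_prompt("List three common NSAIDs.")
# The resulting string is what you would pass to the tokenizer
# before calling the model's generate method.
```

Because the context window is 4096 tokens, the combined length of the formatted prompt and the requested completion must stay under that limit.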