Medilora/Medilora-Mistral-7B
Text generation
- Concurrency cost: 1
- Model size: 7B
- Quantization: FP8
- Context length: 4k
- Published: Dec 4, 2023
- License: MIT
- Architecture: Transformer
- Open weights
Medilora/Medilora-Mistral-7B is a 7-billion-parameter language model based on the Mistral architecture. Details about its training data, what differentiates it from the base model, and its intended use cases have not yet been published; this card will be updated as that information becomes available.
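As a minimal usage sketch, the snippet below shows how a Mistral-based checkpoint like this one would typically be loaded for text generation. It assumes the repository is compatible with the Hugging Face `transformers` library, which is standard for Mistral derivatives but not confirmed by this card; the prompt is purely illustrative.

```python
# Hypothetical usage sketch -- assumes Hugging Face `transformers`
# compatibility, which this card does not explicitly confirm.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "Medilora/Medilora-Mistral-7B"


def generate(prompt: str, max_new_tokens: int = 128) -> str:
    """Load the model and tokenizer, then generate a completion."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)


if __name__ == "__main__":
    # Example prompt only; the model's actual intended domain is unpublished.
    print(generate("Explain what a 4k context length means for a language model."))
```

Loading is deferred to the `generate` call so importing the module does not trigger a multi-gigabyte download; `device_map="auto"` places the weights on a GPU when one is available.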