HuggingFaceH4/mistral-7b-grok
Text Generation · Open Weights
- Model Size: 7B
- Quantization: FP8
- Context Length: 4k
- Concurrency Cost: 1
- Published: Jan 29, 2024
- License: apache-2.0
- Architecture: Transformer

HuggingFaceH4/mistral-7b-grok is a 7-billion-parameter language model fine-tuned from mistralai/Mistral-7B-v0.1. It was aligned using Constitutional AI to emulate the distinctive style of xAI's Grok assistant, and is primarily intended for applications that want responses with a conversational tone and personality similar to Grok's.
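A minimal usage sketch with the Hugging Face `transformers` library is shown below. The `[INST] ... [/INST]` prompt wrapper is an assumption based on the common Mistral instruct convention; this fine-tune's actual chat template may differ, so check the model repository before relying on it.

```python
# Sketch: prompting HuggingFaceH4/mistral-7b-grok for text generation.
# The [INST] wrapper below is an assumed Mistral-style format, not confirmed
# for this particular fine-tune.
def build_prompt(user_message: str) -> str:
    """Wrap a user message in a Mistral-style instruction prompt (assumed format)."""
    return f"<s>[INST] {user_message} [/INST]"

prompt = build_prompt("Explain FP8 quantization in one sentence.")
print(prompt)

# Generation itself requires downloading the 7B weights, so it is left
# commented out here:
# from transformers import pipeline
# generator = pipeline("text-generation", model="HuggingFaceH4/mistral-7b-grok")
# print(generator(prompt, max_new_tokens=128)[0]["generated_text"])
```

Because the model is published as open weights under apache-2.0, it can also be served locally or through any inference provider that hosts it.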
