manan05/mistral-7b-friends
Text generation | Open weights | Cold
Concurrency cost: 1 | Model size: 7B | Quant: FP8 | Context length: 4k | License: apache-2.0 | Architecture: Transformer

The manan05/mistral-7b-friends model is a 7-billion-parameter language model based on the Mistral architecture, fine-tuned for conversational, friendly interactions. Its 4096-token context window makes it suitable for extended dialogues and generating human-like text. The model is aimed at applications that need natural language understanding and generation with approachable, helpful responses.
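A minimal sketch of running the model locally, assuming the weights are published under the `manan05/mistral-7b-friends` repo id on the Hugging Face Hub and that the fine-tune follows the standard Mistral `[INST]` instruction template; the exact chat format used during fine-tuning is not stated on this card, so treat the prompt builder as a placeholder.

```python
MODEL_ID = "manan05/mistral-7b-friends"  # assumed Hugging Face repo id


def build_prompt(user_message: str) -> str:
    # Mistral-style instruction format; the fine-tune may expect a
    # different template, so adjust if outputs look malformed.
    return f"<s>[INST] {user_message} [/INST]"


def generate(user_message: str, max_new_tokens: int = 256) -> str:
    # Lazy import so the helpers above can be used without transformers
    # installed; calling this function downloads the ~7B weights.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    inputs = tokenizer(build_prompt(user_message), return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt.
    return tokenizer.decode(
        output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )


# Example usage (requires downloading the weights):
# print(generate("Hey, how's your day going?"))
```

With FP8 quantization and a 4k context, a single modern GPU should fit the model; `device_map="auto"` lets transformers place the weights on available devices.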
