Lexic0n/Mistral_7B-Open_Hermes-NSFWV1
Text Generation · Open Weights
Concurrency Cost: 1 | Model Size: 7B | Quant: FP8 | Ctx Length: 4k | Published: Apr 6, 2024 | License: apache-2.0 | Architecture: Transformer

Lexic0n/Mistral_7B-Open_Hermes-NSFWV1 is a 7-billion-parameter Mistral-based language model fine-tuned from teknium/OpenHermes-2p5-Mistral-7B. Developed by Lexic0n, the model targets English-language tasks and supports a context length of 4096 tokens. Its primary differentiator is this fine-tune, which suggests specialized applications beyond the base model's general-purpose capabilities.
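A minimal usage sketch with the Hugging Face `transformers` library is shown below. The model id and the 4096-token context length come from this card; the prompt, `max_new_tokens` value, and the `fits_context` helper are illustrative assumptions, not values published by the author.

```python
MODEL_ID = "Lexic0n/Mistral_7B-Open_Hermes-NSFWV1"
CTX_LENGTH = 4096  # context window stated on this card


def fits_context(prompt_tokens: int, max_new_tokens: int,
                 ctx: int = CTX_LENGTH) -> bool:
    """Return True if the prompt plus the generation budget fits the window."""
    return prompt_tokens + max_new_tokens <= ctx


if __name__ == "__main__":
    # Hypothetical loading/generation flow via transformers (assumption:
    # the repo ships standard Mistral weights loadable with AutoModel).
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")

    prompt = "Write a short story about a lighthouse keeper."
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

    # Keep prompt + generation inside the 4k context window.
    assert fits_context(inputs["input_ids"].shape[1], max_new_tokens=256)

    out = model.generate(**inputs, max_new_tokens=256)
    print(tokenizer.decode(out[0], skip_special_tokens=True))
```

The guard around the loading code keeps the snippet importable without downloading the ~7B weights; the context-budget check simply enforces the 4k limit before generation.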
