Locutusque/Hyperion-2.0-Mistral-7B
Text generation
Concurrency cost: 1
Model size: 7B
Quantization: FP8
Context length: 8k
Published: Mar 10, 2024
License: apache-2.0
Architecture: Transformer
Locutusque/Hyperion-2.0-Mistral-7B is a 7-billion-parameter language model based on Mistral-7B-v0.1, fine-tuned by Locutusque on the Hyperion-v2.0 dataset. The model specializes in advanced reasoning across scientific domains, including complex question answering, conversational AI, code generation, medical text comprehension, mathematical reasoning, and logical reasoning. It was trained on a diverse set of 750,000 examples to handle complex inquiries and instructions. With an 8,192-token context length, it is designed for researchers and practitioners who need strong domain-specific problem-solving capabilities.
Popular Sampler Settings
The three parameter combinations most used by Featherless users for this model adjust the following sampling settings:
temperature
top_p
top_k
frequency_penalty
presence_penalty
repetition_penalty
min_p
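The sampler settings above can be passed directly when querying the model. Below is a minimal sketch of assembling a chat-completions request payload with these parameters; the specific values shown are illustrative assumptions, not the actual configurations recorded from Featherless users.

```python
# Sketch: build a chat-completions payload with explicit sampler settings.
# All numeric values below are assumed defaults for illustration only.
import json

def build_request(prompt: str) -> dict:
    """Assemble a request payload exercising each sampler parameter."""
    return {
        "model": "Locutusque/Hyperion-2.0-Mistral-7B",
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,         # sharpens/softens the token distribution
        "top_p": 0.9,               # nucleus sampling: keep top 90% probability mass
        "top_k": 40,                # restrict sampling to the 40 most likely tokens
        "frequency_penalty": 0.0,   # penalize tokens proportional to their frequency
        "presence_penalty": 0.0,    # penalize tokens that have appeared at all
        "repetition_penalty": 1.1,  # multiplicative penalty on repeated tokens
        "min_p": 0.05,              # drop tokens below 5% of the top token's probability
    }

payload = build_request("Explain the Krebs cycle in two sentences.")
print(json.dumps(payload, indent=2))
```

This payload shape follows the common OpenAI-compatible chat-completions convention; check your serving endpoint's documentation for which of these sampler fields it actually accepts.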