mahiatlinux/MasherAI-v6.1-7B-checkpoint2
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 8k · Published: Mar 30, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights · Status: Cold

mahiatlinux/MasherAI-v6.1-7B-checkpoint2 is a 7-billion-parameter Mistral-based language model developed by mahiatlinux and fine-tuned from mahiatlinux/MasherAI-v6.2-7B-checkpoint1. It was trained with Unsloth and Hugging Face's TRL library, which the author reports gave 2x faster training. The model is intended for general-purpose text generation.
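As a rough sketch, the checkpoint should be loadable through the standard Hugging Face transformers API; the dtype and device settings below are assumptions, not requirements stated by the author.

```python
# Minimal sketch: loading the checkpoint with Hugging Face transformers.
# The repo id comes from the model name above; dtype and device_map are assumptions.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mahiatlinux/MasherAI-v6.1-7B-checkpoint2"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # assumption; pick a dtype your hardware supports
    device_map="auto",
)

prompt = "Explain what a Mistral-based 7B model is in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```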


Popular Sampler Settings

The three most popular parameter combinations used by Featherless users with this model cover the following sampler settings:

temperature
top_p
top_k
frequency_penalty
presence_penalty
repetition_penalty
min_p
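A minimal sketch of how such sampler settings could be passed when querying the model through an OpenAI-compatible chat API. The base URL and every numeric value below are placeholder assumptions, not one of the actual top-3 configurations referenced above.

```python
# Minimal sketch: applying sampler settings via an OpenAI-compatible endpoint.
# Base URL, API key, and all parameter values are placeholders/assumptions.
from openai import OpenAI

client = OpenAI(
    base_url="https://api.featherless.ai/v1",  # assumed OpenAI-compatible endpoint
    api_key="YOUR_API_KEY",
)

response = client.chat.completions.create(
    model="mahiatlinux/MasherAI-v6.1-7B-checkpoint2",
    messages=[{"role": "user", "content": "Write a short greeting."}],
    temperature=0.7,           # placeholder values; substitute a real config
    top_p=0.9,
    frequency_penalty=0.0,
    presence_penalty=0.0,
    extra_body={               # samplers outside the core OpenAI schema
        "top_k": 40,
        "repetition_penalty": 1.1,
        "min_p": 0.05,
    },
)
print(response.choices[0].message.content)
```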