mahiatlinux/MasherAI-v6.1-7B-checkpoint3-code3
Task: Text Generation
Concurrency Cost: 1
Model Size: 7B
Quantization: FP8
Context Length: 8k
Published: Apr 5, 2024
License: apache-2.0
Architecture: Transformer
Status: Open Weights, Cold

mahiatlinux/MasherAI-v6.1-7B-checkpoint3-code3 is a 7-billion-parameter Mistral-based language model, fine-tuned from mahiatlinux/MasherAI-v6.1-7B-checkpoint3-code2. It was trained with Unsloth and Hugging Face's TRL library, reportedly achieving 2x faster training. With an 8192-token context length, it is suited to tasks that benefit from efficient fine-tuning on a Mistral architecture.
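Because the checkpoint ships with open weights, it can be loaded through the standard Hugging Face transformers API. Below is a minimal inference sketch; the model ID and 8192-token context figure come from this page, while the prompt and generation settings are illustrative assumptions.

```python
# Minimal inference sketch with Hugging Face transformers.
# The model ID is from this page; prompt and settings are placeholders.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mahiatlinux/MasherAI-v6.1-7B-checkpoint3-code3"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# The model accepts up to 8192 tokens of context (see metadata above).
prompt = "Write a Python function that reverses a string."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```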


Popular Sampler Settings

The three parameter combinations most used by Featherless users for this model draw on the sampler parameters listed below (the specific values appear in the page's interactive tabs; a usage sketch follows the list).

temperature, top_p, top_k, frequency_penalty, presence_penalty, repetition_penalty, min_p
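A hedged sketch of how such settings could be passed through an OpenAI-compatible client. The endpoint URL, the use of extra_body for the non-standard samplers (top_k, min_p, repetition_penalty), and all values are assumptions for illustration, not the actual user configurations from the tabs above.

```python
# Sketch: passing sampler settings via an OpenAI-compatible client.
# Endpoint URL, extra_body support, and all values are assumptions.
from openai import OpenAI

client = OpenAI(
    base_url="https://api.featherless.ai/v1",  # assumed endpoint
    api_key="YOUR_API_KEY",
)

response = client.chat.completions.create(
    model="mahiatlinux/MasherAI-v6.1-7B-checkpoint3-code3",
    messages=[{"role": "user", "content": "Hello!"}],
    temperature=0.7,            # placeholder values throughout
    top_p=0.9,
    frequency_penalty=0.0,
    presence_penalty=0.0,
    extra_body={                # non-standard samplers, if the server accepts them
        "top_k": 40,
        "min_p": 0.05,
        "repetition_penalty": 1.1,
    },
)
print(response.choices[0].message.content)
```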