Occamscha1nsaw/askesis-mistral-v1
- Task: Text generation
- Concurrency cost: 1
- Model size: 7B
- Quantization: FP8
- Context length: 4k
- Published: Mar 7, 2026
- License: apache-2.0
- Architecture: Transformer (open weights)
Occamscha1nsaw/askesis-mistral-v1 is a 7 billion parameter, Mistral-based, instruction-tuned causal language model developed by Occamscha1nsaw. It was fine-tuned using Unsloth and Hugging Face's TRL library, enabling faster training, and is designed for general language generation tasks, leveraging the Mistral architecture's efficiency and performance.
Occamscha1nsaw/askesis-mistral-v1 Overview
This model, developed by Occamscha1nsaw, is a 7 billion parameter instruction-tuned variant of the Mistral architecture. It was fine-tuned from unsloth/mistral-7b-instruct-v0.2-bnb-4bit with the Unsloth library, which enabled roughly 2x faster training, together with Hugging Face's TRL library.
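The model card does not publish the actual training recipe, but the Unsloth + TRL combination it names typically follows a pipeline like the sketch below. The LoRA hyperparameters, sequence length, and dataset used here are illustrative assumptions, not the published configuration.

```python
# A minimal sketch of an Unsloth + TRL fine-tuning run, assuming a LoRA
# setup; hyperparameters and dataset are placeholders, not the real recipe.
from unsloth import FastLanguageModel
from trl import SFTTrainer
from transformers import TrainingArguments
from datasets import load_dataset

# Load the 4-bit quantized base checkpoint named in the model card.
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="unsloth/mistral-7b-instruct-v0.2-bnb-4bit",
    max_seq_length=4096,
    load_in_4bit=True,
)

# Attach LoRA adapters (illustrative rank and target modules).
model = FastLanguageModel.get_peft_model(
    model,
    r=16,
    lora_alpha=16,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj",
                    "gate_proj", "up_proj", "down_proj"],
)

# Placeholder instruction dataset; the actual training data is not documented.
dataset = load_dataset("yahma/alpaca-cleaned", split="train")

def to_text(example):
    # Render each record with the Mistral instruct format (assumption).
    prompt = f"{example['instruction']}\n{example['input']}".strip()
    return {"text": f"<s>[INST] {prompt} [/INST] {example['output']}</s>"}

dataset = dataset.map(to_text)

trainer = SFTTrainer(
    model=model,
    tokenizer=tokenizer,
    train_dataset=dataset,
    dataset_text_field="text",
    max_seq_length=4096,
    args=TrainingArguments(
        per_device_train_batch_size=2,
        gradient_accumulation_steps=4,
        learning_rate=2e-4,
        max_steps=100,
        output_dir="outputs",
    ),
)
trainer.train()
```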
Key Characteristics
- Base Model: Mistral-7B-Instruct-v0.2 (fine-tuned from the unsloth/mistral-7b-instruct-v0.2-bnb-4bit checkpoint).
- Parameter Count: 7 billion parameters.
- Training Efficiency: Leverages Unsloth for accelerated fine-tuning.
- License: Released under the Apache-2.0 license.
Potential Use Cases
- General instruction-following tasks (see the usage sketch after this list).
- Text generation and completion.
- Applications requiring an efficient 7B Mistral-based model.
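For reference, a minimal generation sketch with Hugging Face Transformers, assuming the weights are published on the Hub under the Occamscha1nsaw/askesis-mistral-v1 repo id and that the tokenizer ships the standard Mistral instruct chat template:

```python
# Hedged usage example; repo availability and chat template are assumptions.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Occamscha1nsaw/askesis-mistral-v1"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.float16, device_map="auto"
)

messages = [
    {"role": "user",
     "content": "Summarize the Mistral architecture in two sentences."}
]

# Render the conversation with the tokenizer's chat template and generate.
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(
    inputs, max_new_tokens=200, do_sample=True, temperature=0.7
)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```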