Occamscha1nsaw/askesis-mistral-v1
Text Generation · Open Weights

- Model Size: 7B
- Architecture: Transformer
- Quant: FP8
- Context Length: 4k
- Concurrency Cost: 1
- License: apache-2.0
- Published: Mar 7, 2026

Occamscha1nsaw/askesis-mistral-v1 is a 7-billion-parameter, Mistral-based, instruction-tuned causal language model developed by Occamscha1nsaw. It was fine-tuned with Unsloth and Hugging Face's TRL library, which speeds up training. The model targets general language generation tasks, leveraging the Mistral architecture's efficiency and performance.
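As a minimal sketch, the model can be loaded with the Hugging Face `transformers` library like any Mistral-family causal LM. The model id comes from this card; the `[INST] ... [/INST]` prompt markers below are the standard Mistral instruction format and are an assumption here — check the repo's tokenizer chat template for the exact format this fine-tune expects.

```python
def format_prompt(instruction: str) -> str:
    """Wrap a user instruction in Mistral-style [INST] ... [/INST] markers.

    Assumption: this fine-tune follows the base Mistral instruct template;
    verify against the tokenizer's chat template in the model repo.
    """
    return f"<s>[INST] {instruction} [/INST]"


def generate(instruction: str, max_new_tokens: int = 256) -> str:
    """Load the model and generate a completion for one instruction."""
    # Imported lazily so the prompt helper works without the heavy dependency.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "Occamscha1nsaw/askesis-mistral-v1"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

    inputs = tokenizer(format_prompt(instruction), return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt.
    new_tokens = output[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)
```

Note the 4k context limit above: prompt plus `max_new_tokens` should stay within it.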
