vrutkovs/Lusterka-7B-v0.3
Task: Text generation
Concurrency cost: 1
Model size: 7B
Quantization: FP8
Context length: 4k
Published: Apr 12, 2026
License: apache-2.0
Architecture: Transformer (open weights)

Lusterka-7B-v0.3 is a 7-billion-parameter language model developed by vrutkovs, fine-tuned from unsloth/mistral-7b-v0.3. It was trained with the Unsloth framework for accelerated fine-tuning and supports a 4096-token context window. It is designed as a general-purpose language model built on the Mistral architecture.
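Since the model follows the standard Mistral architecture, it should load with the usual Hugging Face Transformers API. The sketch below is illustrative, not an official usage snippet from the model card: the generation settings and prompt are assumptions, and only the repository ID and the 4096-token context length come from the card.

```python
# Minimal sketch of running Lusterka-7B-v0.3 with Hugging Face Transformers.
# MODEL_ID and MAX_CONTEXT come from the model card; everything else
# (prompt, max_new_tokens, device_map) is an illustrative assumption.
MODEL_ID = "vrutkovs/Lusterka-7B-v0.3"
MAX_CONTEXT = 4096  # context length stated on the model card


def generate(prompt: str, max_new_tokens: int = 128) -> str:
    # Import lazily so the module-level constants stay usable
    # even where transformers is not installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    # Truncate the prompt so prompt + generated tokens fit in the context window.
    inputs = tokenizer(
        prompt,
        return_tensors="pt",
        truncation=True,
        max_length=MAX_CONTEXT - max_new_tokens,
    ).to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)


if __name__ == "__main__":
    print(generate("Summarize the Mistral architecture in one sentence."))
```

Note that loading the full 7B FP8 checkpoint requires a GPU with sufficient memory; `device_map="auto"` lets Transformers place weights across available devices.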
