Frenzyknight/Clarity-llama-70b
Text generation · Concurrency cost: 4 · Model size: 70B · Quantization: FP8 · Context length: 32k · Published: Jan 16, 2026 · License: apache-2.0 · Architecture: Transformer · Open weights

Clarity-llama-70b is a 70-billion-parameter instruction-tuned causal language model developed by Frenzyknight. It was fine-tuned from unsloth/llama-3.3-70b-instruct-bnb-4bit using Unsloth together with Hugging Face's TRL library, which enabled roughly 2x faster training. The model targets general-purpose language tasks, combining the capacity of its 70B parameter count with an efficient fine-tuning pipeline.
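Because the model was fine-tuned from Llama 3.3 70B Instruct, it presumably inherits the stock Llama 3 chat template. A minimal sketch of that single-turn prompt format is below; this assumes fine-tuning did not change the template, and in practice you would call `tokenizer.apply_chat_template()` from the Hugging Face tokenizer rather than assembling the string by hand.

```python
def build_llama3_prompt(system: str, user: str) -> str:
    """Assemble a single-turn Llama 3 chat prompt.

    Assumption: Clarity-llama-70b keeps the unmodified Llama 3.3
    chat template from its base model. With transformers, prefer
    tokenizer.apply_chat_template() over manual formatting.
    """
    return (
        "<|begin_of_text|>"                              # start of sequence
        "<|start_header_id|>system<|end_header_id|>\n\n" # system turn header
        f"{system}<|eot_id|>"                            # system message, end of turn
        "<|start_header_id|>user<|end_header_id|>\n\n"   # user turn header
        f"{user}<|eot_id|>"                              # user message, end of turn
        "<|start_header_id|>assistant<|end_header_id|>\n\n"  # model generates from here
    )


prompt = build_llama3_prompt(
    "You are a helpful assistant.",
    "Summarize FP8 quantization in one sentence.",
)
print(prompt)
```

The 32k context length above bounds the total token count of this prompt plus the generated completion.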
