5CH5/Qwen2.5-7B-abliterated
Text Generation · Concurrency Cost: 1 · Model Size: 7.6B · Quant: FP8 · Ctx Length: 32k · Published: Mar 29, 2026 · Architecture: Transformer

5CH5/Qwen2.5-7B-abliterated is a 7.6-billion-parameter causal language model based on the Qwen2.5 architecture, published by 5CH5. With a 32,768-token context window, it is intended for general-purpose language understanding and generation. Its parameter count and long context make it suitable for applications that require processing extended inputs, such as long documents or multi-turn conversations.
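A minimal usage sketch with the Hugging Face `transformers` library, assuming the model is published on the Hub under the ID shown in this listing (the prompt text and helper function name are illustrative, not part of the listing):

```python
# Hypothetical usage sketch: load 5CH5/Qwen2.5-7B-abliterated with transformers
# and generate a chat completion. Assumes the model is hosted on the HF Hub.
MODEL_ID = "5CH5/Qwen2.5-7B-abliterated"
MAX_CONTEXT = 32_768  # context length stated in the listing above


def generate(prompt: str, max_new_tokens: int = 256) -> str:
    """Generate a completion for a single user prompt."""
    # Heavy dependencies imported lazily so this module loads without them.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype="auto", device_map="auto"
    )
    # Qwen2.5-based models use a chat template for instruction-style prompts.
    messages = [{"role": "user", "content": prompt}]
    input_ids = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)
    output = model.generate(input_ids, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt.
    return tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True)


if __name__ == "__main__":
    print(generate("Explain the Transformer architecture in two sentences."))
```

Note that at FP8 quantization a 7.6B model needs roughly 8 GB of accelerator memory for weights alone, plus KV-cache that grows with context length, so the full 32k window requires correspondingly more headroom.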
