syedahmedsoftware/broken-model-fixed
Text Generation
Concurrency Cost: 1
Model Size: 8B
Quant: FP8
Ctx Length: 32k
Published: Feb 2, 2026
License: apache-2.0
Architecture: Transformer
Open Weights

syedahmedsoftware/broken-model-fixed is an 8-billion-parameter, Qwen2-based causal language model from syedahmedsoftware with a 32,768-token context length. It applies essential metadata fixes to the original 'broken-model', enabling stable, deterministic chat inference and production-safe batching. The model is designed for compatibility with OpenAI-style /chat/completions API servers, making it deployable in real inference environments.
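Because the model targets OpenAI-style /chat/completions servers, a request against such a server can be sketched as below. This is a minimal sketch using only the Python standard library; the base URL and port are assumptions for a local deployment (e.g. a vLLM or similar OpenAI-compatible server), not something specified by this card.

```python
import json
import urllib.request

# Hypothetical endpoint of a local OpenAI-compatible server;
# adjust host/port to match your deployment.
BASE_URL = "http://localhost:8000/v1/chat/completions"

def build_request(prompt: str, max_tokens: int = 256) -> urllib.request.Request:
    """Build an OpenAI-style chat completion request for this model."""
    payload = {
        "model": "syedahmedsoftware/broken-model-fixed",
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
        # Greedy decoding, in line with the card's deterministic-inference claim.
        "temperature": 0.0,
    }
    return urllib.request.Request(
        BASE_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_request("Summarize the Apache-2.0 license in one sentence.")
# Sending the request requires a running server:
#   resp = urllib.request.urlopen(req)
#   print(json.load(resp)["choices"][0]["message"]["content"])
```

The request body follows the standard /chat/completions schema, so any OpenAI-compatible client library can be substituted for the raw HTTP call shown here.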
