osirisbrain/OsirisPtah-Coder-v5
Text Generation · Concurrency Cost: 1 · Model Size: 7.6B · Quant: FP8 · Ctx Length: 32k · Published: Mar 3, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights
OsirisPtah-Coder-v5 is a 7.6 billion parameter instruction-tuned causal language model developed by osirisbrain, based on the Qwen2.5-Coder-7B-Instruct architecture. The model was further refined using mlabonne datasets and a mean-diff method, and is optimized specifically for coding tasks. Its 32768-token context length makes it suitable for complex code generation and analysis.
OsirisPtah-Coder-v5: Enhanced Code Generation
OsirisPtah-Coder-v5 is an instruction-tuned language model developed by osirisbrain, building upon the Qwen2.5-Coder-7B-Instruct architecture. This 7.6 billion parameter model is specifically designed for coding applications, featuring a substantial 32768-token context window.
Key Enhancements
- Abliterated Training: The model underwent a specialized "abliteration" process, using established mlabonne datasets (256+256).
- Mean-Diff Method: Training incorporated a mean-diff method with 1.5x strength over four passes, enhancing its coding capabilities.
- Layer Blacklisting: Layers [0, 1, 26, 27] were blacklisted (excluded from modification) during the process, a common precaution to protect the first and last layers from degradation.
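To make the enhancements above concrete, here is a minimal NumPy sketch of the mean-diff idea behind abliteration. This is an illustration of the general technique, not the exact OsirisPtah-Coder-v5 pipeline: a behavior direction is estimated as the difference between mean hidden states of two contrasting prompt sets, then projected out of a weight matrix with a configurable strength (the card mentions 1.5x strength over four passes). The array shapes and prompt-set labels are assumptions for the toy example.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8  # toy hidden size

# Toy hidden states for two contrasting prompt sets
acts_a = rng.normal(size=(100, d)) + 1.0  # e.g. prompts eliciting the target behavior
acts_b = rng.normal(size=(100, d))        # e.g. neutral prompts

# Mean-diff direction, normalized to unit length
direction = acts_a.mean(axis=0) - acts_b.mean(axis=0)
direction /= np.linalg.norm(direction)

def ablate(W: np.ndarray, v: np.ndarray, strength: float = 1.5) -> np.ndarray:
    """Remove (strength x) the component of each output row along unit vector v."""
    return W - strength * np.outer(W @ v, v)

W = rng.normal(size=(d, d))
W_ablated = ablate(W, direction, strength=1.0)

# With strength=1.0, the ablated matrix maps nothing onto the direction
print(np.allclose(W_ablated @ direction, 0.0))  # True
```

In a real pipeline this projection would be applied to selected transformer weight matrices, skipping any blacklisted layers such as [0, 1, 26, 27].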
Good for
- Code Generation: Excels at generating code across a range of programming languages.
- Code Analysis: Capable of understanding and processing large codebases due to its extended context length.
- Developer Tools: Suitable for integration into IDEs, code assistants, and other developer-centric applications requiring robust code understanding and generation.
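For integration into developer tools, prompts for Qwen2.5-Coder derivatives like this one typically follow the ChatML format. The sketch below shows that format by hand for illustration; in practice you would let `tokenizer.apply_chat_template` from Hugging Face transformers build the prompt, and the system/user strings here are placeholders.

```python
def build_chatml_prompt(system: str, user: str) -> str:
    """Assemble a ChatML-style prompt ending at the assistant turn."""
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )

prompt = build_chatml_prompt(
    "You are a helpful coding assistant.",
    "Write a Python function that reverses a string.",
)
print(prompt)
```

The trailing `<|im_start|>assistant\n` leaves the prompt open at the assistant turn, so the model's generation begins as the reply.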