Anni: A High-Performance Code Assistant
Anni is a 14-billion-parameter code language model from BigJuicyData, built on the Qwen3 14B architecture. It is fine-tuned on the OpenCodeReasoning-2 dataset, making it particularly strong on problems that require deep algorithmic understanding and competitive-programming skill. The model supports a context length of 32,000 tokens, allowing it to work with complex and extensive codebases.
Key Capabilities
- Deep Algorithmic Reasoning: Excels at understanding and generating solutions for intricate algorithmic problems.
- Competitive Programming Logic: Optimized for the problem styles commonly found in competitive programming contests.
- High-Efficiency Data Structures: Capable of implementing and reasoning about complex, performance-critical data structures.
- Code Generation: Designed to assist with generating high-quality, efficient code.
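To give a concrete sense of the tasks these capabilities cover, below is a minimal, hand-written sketch of a Fenwick (binary indexed) tree, a classic competitive-programming structure with O(log n) point updates and prefix sums. This is an illustration of the problem domain, not output from the model:

```python
class FenwickTree:
    """Binary indexed tree: O(log n) point update and prefix sum."""

    def __init__(self, n: int):
        self.n = n
        self.tree = [0] * (n + 1)  # 1-based internal indexing

    def update(self, i: int, delta: int) -> None:
        """Add delta to the element at 0-based index i."""
        i += 1
        while i <= self.n:
            self.tree[i] += delta
            i += i & -i  # move to the next node covering index i

    def prefix_sum(self, i: int) -> int:
        """Sum of elements at 0-based indices 0..i inclusive."""
        i += 1
        s = 0
        while i > 0:
            s += self.tree[i]
            i -= i & -i  # strip the lowest set bit
        return s


ft = FenwickTree(8)
for idx, val in enumerate([3, 1, 4, 1, 5, 9, 2, 6]):
    ft.update(idx, val)
print(ft.prefix_sum(3))  # 3 + 1 + 4 + 1 = 9
```

Prompting Anni with tasks of this shape (implement, optimize, or reason about such structures) is where the OpenCodeReasoning-2 fine-tuning is intended to pay off.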
Good For
- Developers working on algorithmic challenges.
- Competitive programmers seeking assistance with problem-solving.
- Tasks requiring the implementation of complex and efficient data structures.
- Code generation in scenarios demanding deep logical reasoning.
Anni is distributed in BF16 safetensors format and is compatible with inference frameworks such as vLLM. The training dataset, OpenCodeReasoning-2, is licensed under CC-BY-4.0, permitting commercial use with attribution.
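A minimal sketch of prompting the model, assuming the ChatML template used by Qwen-family models and a hypothetical Hugging Face model ID of `BigJuicyData/Anni` (the actual ID may differ):

```python
# Build a ChatML-style prompt (<|im_start|>/<|im_end|> markers), the
# template Qwen-family models are trained with. The model ID below is
# an assumption for illustration.

def build_chatml_prompt(user_message: str,
                        system: str = "You are a helpful coding assistant.") -> str:
    """Wrap a user request in ChatML role markers, ending with an open
    assistant turn so the model continues from there."""
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user_message}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )


prompt = build_chatml_prompt(
    "Implement a segment tree with range-sum queries in Python."
)

# With vLLM installed and the weights available, offline generation
# might look like this (sketch, not run here):
# from vllm import LLM, SamplingParams
# llm = LLM(model="BigJuicyData/Anni", max_model_len=32768, dtype="bfloat16")
# out = llm.generate([prompt], SamplingParams(temperature=0.2, max_tokens=1024))
# print(out[0].outputs[0].text)
```

In practice, frameworks like vLLM can also apply the model's bundled chat template automatically (e.g. via their chat APIs), so hand-building the prompt as above is only needed for raw completion-style calls.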