teolm30/Fox-1.5-Nova

TEXT GENERATION · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Apr 26, 2026 · License: apache-2.0 · Architecture: Transformer

teolm30/Fox-1.5-Nova is a 7 billion parameter code generation model, fine-tuned by teolm30 from DeepSeek-Coder-7B-Instruct. Optimized for competitive programming, systems design, and real-world code patterns across more than 50 languages, it offers efficient local inference with low VRAM requirements.


Fox 1.5 Nova: Code Generation Model

Fox 1.5 Nova is a 7 billion parameter code generation model developed by teolm30, fine-tuned from DeepSeek-Coder-7B-Instruct. It is specifically optimized for competitive programming, systems design, and generating real-world code patterns across more than 50 programming languages.

Key Capabilities & Features

  • Specialized Code Generation: Excels in competitive programming scenarios, systems design, and practical coding tasks.
  • Broad Language Support: Fine-tuned to handle code patterns across 50+ programming languages.
  • Efficient Local Inference: Designed for local deployment, reaching roughly 40 tokens/second at fp16 precision.
  • Low Resource Footprint: Requires only about 6GB VRAM for the 4-bit quantized version, making it accessible for consumer-grade hardware.
  • Cost-Effective: Free to use, eliminating API costs associated with larger, proprietary models.
  • Offline Operation: Does not require an internet connection for inference.
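The ~6GB VRAM figure for the 4-bit version can be sanity-checked with back-of-the-envelope arithmetic: 7 billion parameters at 4 bits each is about 3.5 GB of weights, and runtime overhead (KV cache, activations, framework buffers) plausibly accounts for the rest. A minimal sketch, where the overhead constant is an illustrative assumption rather than a measured value:

```python
def estimate_vram_gb(n_params: float, bits_per_param: int, overhead_gb: float = 2.0) -> float:
    """Rough VRAM estimate: weight bytes plus a fixed runtime overhead.

    overhead_gb is an assumed allowance for the KV cache, activations,
    and framework buffers; actual usage varies with context length.
    """
    weight_bytes = n_params * bits_per_param / 8
    return weight_bytes / 1e9 + overhead_gb

# 7B parameters at 4 bits: 3.5 GB of weights + ~2 GB overhead ≈ 5.5 GB
print(estimate_vram_gb(7e9, 4))
```

The same arithmetic shows why fp16 (16 bits/param, ~14 GB of weights alone) pushes the model beyond most consumer GPUs, while 4-bit quantization fits comfortably in 6GB.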

Performance & Technical Details

Built using QLoRA (4-bit NF4 quantization) with a LoRA rank (r) of 16 and alpha of 64, the model was trained for 220 steps over 10 epochs. It has a maximum output length of 512 tokens. While it does not have built-in tool-use capabilities, it can be integrated with external frameworks like OpenClaw for agentic workflows.
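The training setup above can be sketched as a `peft`/`bitsandbytes` configuration. This is a reconstruction from the stated hyperparameters (r=16, alpha=64, 4-bit NF4), not the author's actual training script; the target modules, dropout, and compute dtype are assumptions.

```python
# Sketch of a QLoRA setup matching the hyperparameters stated on the card.
# NOT the author's training script; target_modules, dropout, and the
# compute dtype are assumed values, shown for illustration only.
from transformers import BitsAndBytesConfig
from peft import LoraConfig

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",          # 4-bit NF4, as stated on the card
    bnb_4bit_compute_dtype="bfloat16",  # assumed; not stated
)

lora_config = LoraConfig(
    r=16,               # LoRA rank, from the card
    lora_alpha=64,      # alpha, from the card
    lora_dropout=0.05,  # assumed; not stated
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],  # assumed
    task_type="CAUSAL_LM",
)
```

With these configs, the base model would be loaded 4-bit via `bnb_config` and wrapped with `peft.get_peft_model(model, lora_config)` before training.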

Ideal Use Cases

  • Local Code Generation: Developers needing a powerful code assistant that runs entirely on their machine.
  • Competitive Programming: Generating solutions or boilerplate for programming contests.
  • Systems Development: Assisting with code for system-level applications and design.
  • Educational Purposes: Learning and experimenting with code generation without incurring API costs.
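For the local code-generation use case, loading the 4-bit quantized model with Hugging Face `transformers` might look like the following sketch. The repo id comes from this card; the quantization settings, device mapping, and prompt are assumptions, and the download requires several GB of disk space.

```python
# Sketch of local 4-bit inference; settings other than the repo id are assumed.
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "teolm30/Fox-1.5-Nova"  # repo id from this card

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=BitsAndBytesConfig(
        load_in_4bit=True,
        bnb_4bit_quant_type="nf4",
    ),
    device_map="auto",  # place layers on available GPU(s)/CPU
)

prompt = "Write a Python function that checks whether a string is a palindrome."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
# The card lists a 512-token maximum output length.
outputs = model.generate(**inputs, max_new_tokens=512)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Since everything runs locally, this works offline after the initial download and incurs no API costs.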

This model provides a robust, resource-efficient solution for a wide range of code-centric applications, particularly where local execution and specific code pattern generation are critical.