electrocampbell/nebula-8lang-7b

TEXT GENERATION

  • Concurrency Cost: 1
  • Model Size: 7.6B
  • Quant: FP8
  • Ctx Length: 32k
  • Published: Apr 12, 2026
  • License: apache-2.0
  • Architecture: Transformer
  • Open Weights

electrocampbell/nebula-8lang-7b is a 7.6 billion parameter language model fine-tuned from Qwen/Qwen2.5-7B. It specializes in translating Nebula, a universal code intermediate language, into 8 target programming languages: Python, JavaScript, TypeScript, Go, Swift, Kotlin, Rust, and C. This model is optimized for code translation tasks, achieving 88.4% on HumanEval (Nebula to Python) with error correction.


Model Overview

electrocampbell/nebula-8lang-7b is a 7.6 billion parameter model, fine-tuned from Qwen/Qwen2.5-7B, specifically designed for code translation. Its primary function is to translate code written in Nebula, a universal code intermediate language, into eight distinct target programming languages. Nebula is noted for its token efficiency: its canonical form is on average 16% smaller than the original source code while maintaining round-trip fidelity.
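Assuming the weights are published in a standard Hugging Face Transformers layout (consistent with the Qwen2.5-7B base), a minimal inference sketch might look like the following. The prompt template in `build_prompt` is an illustrative assumption, not the model's documented format — check the model card's actual chat template before relying on it.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "electrocampbell/nebula-8lang-7b"


def build_prompt(nebula_code: str, target_lang: str) -> str:
    """Build a translation prompt. This template is illustrative only;
    the model's real prompt format may differ."""
    return (
        f"Translate the following Nebula code to {target_lang}.\n\n"
        f"```nebula\n{nebula_code}\n```\n"
    )


def translate(nebula_code: str, target_lang: str, max_new_tokens: int = 512) -> str:
    """Load the model and translate one Nebula snippet into target_lang."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype="auto", device_map="auto"
    )
    inputs = tokenizer(
        build_prompt(nebula_code, target_lang), return_tensors="pt"
    ).to(model.device)
    out = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt.
    return tokenizer.decode(
        out[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )


if __name__ == "__main__":
    print(translate("fn add(a, b) -> a + b", "Python"))
```

Since the model is served with FP8 quantization and a 32k context, leave headroom in `max_new_tokens` for longer translation units.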

Key Capabilities

  • Multi-language Code Translation: Translates Nebula into Python, JavaScript, TypeScript, Go, Swift, Kotlin, Rust, and C.
  • High Translation Accuracy: Achieves a Pass@1 score of 67.7% on HumanEval for Nebula to Python translation, improving to 88.4% with error correction. It also scores 55.4% on MBPP for Nebula to Python.
  • Optimized for Intermediate Language: Leverages Nebula's token-efficient canonical form for robust and accurate code generation across multiple paradigms.

When to Use This Model

This model is ideal for developers and systems requiring automated, high-quality code translation from a standardized intermediate representation. It is particularly useful for:

  • Cross-language Development: Facilitating code migration or interoperability between the supported languages.
  • Code Generation Pipelines: Integrating into systems that use Nebula as an intermediate step for generating idiomatic code in various target languages.
  • Research in Code Translation: As a strong baseline or component for projects focused on universal code representation and translation.
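For the pipeline use case above, the translation step can be sketched as a fan-out of one Nebula intermediate representation into all eight supported targets. The `fan_out_prompts` helper and its prompt wording are assumptions for illustration; the actual model calls would plug in where each prompt is consumed.

```python
# Hypothetical fan-out step for a code-generation pipeline: a single
# Nebula intermediate representation yields one prompt per target language.

TARGET_LANGUAGES = [
    "Python", "JavaScript", "TypeScript", "Go",
    "Swift", "Kotlin", "Rust", "C",
]


def fan_out_prompts(nebula_code: str) -> dict[str, str]:
    """Return one translation prompt per supported target language.
    The prompt wording is illustrative, not the model's documented format."""
    return {
        lang: f"Translate the following Nebula code to {lang}.\n\n{nebula_code}\n"
        for lang in TARGET_LANGUAGES
    }


if __name__ == "__main__":
    prompts = fan_out_prompts("fn add(a, b) -> a + b")
    for lang, prompt in prompts.items():
        # Each prompt would be sent to nebula-8lang-7b for generation.
        print(f"--- {lang} ---\n{prompt}")
```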