MarisUK/coder

TEXT GENERATION · Concurrency Cost: 1 · Model Size: 0.5B · Quant: BF16 · Ctx Length: 32k · Published: Apr 6, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights · Cold

MarisUK/coder is a 0.5 billion parameter instruction-tuned language model, fine-tuned from Qwen/Qwen2.5-0.5B-Instruct. It is adapted for code generation tasks, and its small base architecture keeps inference inexpensive. A 32,768-token context window makes it suitable for processing substantial code snippets along with their accompanying instructions.


Model Overview

MarisUK/coder is a 0.5 billion parameter language model, fine-tuned from Qwen/Qwen2.5-0.5B-Instruct. It was fine-tuned on a generator dataset, which, together with the model name, suggests it is optimized for producing code or other structured text.

Key Characteristics

  • Base Model: Fine-tuned from Qwen/Qwen2.5-0.5B-Instruct.
  • Parameter Count: 0.5 billion parameters, balancing output quality against computational cost.
  • Context Length: Supports a context window of 32,768 tokens, enabling it to handle longer inputs and maintain coherence over extended outputs.
  • Training: Fine-tuned for 1 epoch with a learning rate of 2e-05, using the AdamW optimizer and a cosine learning rate scheduler.
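The cosine scheduler mentioned above decays the learning rate smoothly from its peak toward zero over the course of training. A minimal sketch of that decay curve (assuming no warmup, which the model card does not mention):

```python
import math

def cosine_lr(step, total_steps, base_lr=2e-05):
    """Cosine-decayed learning rate: base_lr at step 0, ~0 at total_steps."""
    progress = step / total_steps
    return base_lr * 0.5 * (1.0 + math.cos(math.pi * progress))

# Learning rate starts at 2e-05, halves at the midpoint, and ends near zero.
print(cosine_lr(0, 1000))    # peak
print(cosine_lr(500, 1000))  # midpoint
print(cosine_lr(1000, 1000)) # end of the single epoch
```

In practice a training framework (e.g. the Hugging Face Trainer's `lr_scheduler_type="cosine"`) computes this per optimizer step; the function above only illustrates the shape of the schedule.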

Potential Use Cases

Given its fine-tuning on a "generator dataset" and the model name "coder", this model is likely intended for:

  • Code Generation: Assisting developers by generating code snippets or completing programming tasks.
  • Instruction Following: Executing specific instructions to produce desired text or code outputs.
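For either use case, the model can be driven through the standard Hugging Face `transformers` chat interface. A minimal inference sketch (the prompt and generation parameters are illustrative, not taken from the model card, and the weights are downloaded on first use):

```python
from transformers import pipeline

# Load MarisUK/coder in bfloat16, matching the BF16 quantization listed above.
generator = pipeline(
    "text-generation",
    model="MarisUK/coder",
    torch_dtype="bfloat16",
)

messages = [
    {"role": "user", "content": "Write a Python function that reverses a string."},
]

# Generate a completion for the chat-formatted prompt.
result = generator(messages, max_new_tokens=256)
print(result[0]["generated_text"][-1]["content"])
```

Since the model is instruction-tuned, chat-formatted messages like the above are likely the intended input format; raw-completion prompting may work less reliably.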

Limitations

The model card indicates that more information is needed regarding its specific intended uses, limitations, and the exact nature of its training and evaluation data. Users should exercise caution and conduct thorough testing for critical applications.