Sheelu1246/Vedika_3.5_flash

TEXT GENERATION · Concurrency Cost: 1 · Model Size: 1.5B · Quant: BF16 · Ctx Length: 32k · Published: Apr 30, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights

Sheelu1246/Vedika_3.5_flash is an instruction-tuned, 1.54-billion-parameter causal language model from the Qwen2.5 series, developed by the Qwen Team. It features a 32,768-token context length and shows significant improvements in coding, mathematics, instruction following, and long-text generation. The model excels at understanding structured data and producing structured outputs such as JSON, and offers robust multilingual support across 29 languages.


Qwen2.5-1.5B-Instruct Overview

Sheelu1246/Vedika_3.5_flash is an instruction-tuned model from the Qwen2.5 series, building on the Qwen2 architecture. This 1.54-billion-parameter model (1.31B non-embedding) has a 32,768-token context length and can generate up to 8,192 tokens. Architecturally, it uses RoPE, SwiGLU, RMSNorm, attention QKV bias, and tied word embeddings.
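
As an instruction-tuned chat model, it is typically driven through a chat template rather than raw text completion. The snippet below is a sketch of the standard Qwen2.5 workflow with Hugging Face `transformers` (the repo id is taken from this page; generation settings are illustrative, and it assumes a recent `transformers` with the weights downloadable):

```python
from typing import Dict, List

MODEL_ID = "Sheelu1246/Vedika_3.5_flash"


def build_messages(user_prompt: str,
                   system_prompt: str = "You are a helpful assistant.") -> List[Dict[str, str]]:
    """Chat-format messages consumed by the model's chat template."""
    return [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": user_prompt},
    ]


def generate(user_prompt: str, max_new_tokens: int = 512) -> str:
    # Heavy imports kept local so the message helper stays dependency-free.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype="auto", device_map="auto"
    )
    # Render the messages into the model's chat format, ending with the
    # assistant turn so generation continues from there.
    text = tokenizer.apply_chat_template(
        build_messages(user_prompt), tokenize=False, add_generation_prompt=True
    )
    inputs = tokenizer([text], return_tensors="pt").to(model.device)
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt.
    new_tokens = output_ids[0][inputs.input_ids.shape[-1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)


# Example call (downloads the weights on first use):
# print(generate("Give me a short introduction to large language models."))
```

Note that `max_new_tokens` can go up to the model's 8,192-token generation limit, subject to the 32k context budget shared with the prompt.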

Key Capabilities

  • Enhanced Knowledge & Reasoning: Significantly improved in coding and mathematics, thanks to specialized expert models used during training.
  • Instruction Following: Demonstrates substantial improvements in adhering to instructions and generating long texts (over 8K tokens).
  • Structured Data Handling: Excels at understanding structured data, such as tables, and generating structured outputs, particularly JSON.
  • Robustness: More resilient to diverse system prompts, enhancing role-play and condition-setting for chatbots.
  • Multilingual Support: Supports over 29 languages, including Chinese, English, French, Spanish, Portuguese, German, Italian, Russian, Japanese, and Korean.
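
Even with a model tuned to emit JSON, pipelines that consume its output benefit from defensive parsing, since chat models sometimes wrap structured output in prose or Markdown fences. A minimal sketch (the `extract_json` helper is an illustrative assumption, not part of any Qwen API):

```python
import json


def extract_json(response: str) -> dict:
    """Pull the first JSON object out of a model response.

    Tolerates surrounding prose and Markdown code fences by slicing
    from the first '{' to the last '}' before parsing.
    """
    start = response.find("{")
    end = response.rfind("}")
    if start == -1 or end == -1 or end < start:
        raise ValueError("no JSON object found in response")
    return json.loads(response[start:end + 1])
```

A caller can retry the request when `extract_json` raises, a common pattern for structured-output pipelines.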

Good For

  • Applications requiring strong coding and mathematical reasoning at a smaller scale.
  • Chatbots and agents needing robust instruction following and structured output generation.
  • Tasks involving long text generation and understanding complex structured data.
  • Multilingual applications across a broad range of languages.