AngelRaychev/qwen3-0.6b-sciq-v2

Text Generation · Concurrency Cost: 1 · Model Size: 0.8B · Quant: BF16 · Ctx Length: 32k · Published: Apr 24, 2026 · Architecture: Transformer

AngelRaychev/qwen3-0.6b-sciq-v2 is a compact language model based on the Qwen3 architecture, developed by AngelRaychev. The listed size of 0.8 billion parameters likely reflects the total count including embeddings; the "0.6b" in the name follows Qwen's labeling, which excludes embedding parameters. The model supports a context length of 32768 tokens, so it can process and generate long sequences of text. Specific fine-tuning details are not provided, but the name suggests a focus on scientific question answering, plausibly the SciQ dataset of science exam questions. It is suited to applications that need compact yet capable language understanding and generation in a scientific domain.


Model Overview

AngelRaychev/qwen3-0.6b-sciq-v2 is built on the Qwen3 architecture with roughly 0.8 billion parameters. Its 32768-token context length lets it handle extensive inputs such as full papers or long reports. The "sciq-v2" suffix implies a specialization in scientific question answering, suggesting the base model has been fine-tuned or otherwise optimized for that domain.
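
Below is a minimal loading sketch using the standard Hugging Face transformers API. It assumes the checkpoint is published on the Hub under this ID and that your installed transformers release includes Qwen3 support; nothing model-specific beyond the ID is used.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "AngelRaychev/qwen3-0.6b-sciq-v2"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the BF16 precision listed above
    device_map="auto",           # needs `accelerate`; otherwise use .to("cuda")
)
```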

Key Characteristics

  • Architecture: Qwen3 base model.
  • Parameter Count: 0.8 billion parameters, offering a balance between performance and computational efficiency.
  • Context Length: Supports a substantial 32768 tokens, enabling processing of long documents and complex queries (see the tokenization sketch after this list).
  • Potential Specialization: The "sciq-v2" in its name indicates a likely focus or fine-tuning for scientific question answering datasets, aiming for improved accuracy and relevance in scientific contexts.
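
As a rough illustration of working within the 32768-token window, the sketch below tokenizes a long document and truncates it to leave headroom for generation. The input file name is a placeholder, not part of this model release.

```python
from transformers import AutoTokenizer

MAX_CONTEXT = 32768  # context length listed for this model

tokenizer = AutoTokenizer.from_pretrained("AngelRaychev/qwen3-0.6b-sciq-v2")

# "paper.txt" is a hypothetical stand-in for any long scientific document.
long_document = open("paper.txt", encoding="utf-8").read()

inputs = tokenizer(
    long_document,
    truncation=True,
    max_length=MAX_CONTEXT - 512,  # reserve room for the generated answer
    return_tensors="pt",
)
print(f"Prompt occupies {inputs['input_ids'].shape[1]} of {MAX_CONTEXT} tokens")
```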

Use Cases

Given its architecture and potential specialization, this model is likely well-suited for:

  • Scientific Question Answering: Answering factual questions based on scientific texts or databases (a prompt sketch follows this list).
  • Information Extraction: Identifying and extracting key information from scientific papers, articles, or reports.
  • Text Summarization: Generating concise summaries of scientific literature.
  • Domain-Specific Applications: Integration into tools or platforms that require understanding and generating content within scientific or technical fields.
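
The sketch below shows one way to pose a scientific question, assuming the tokenizer ships a chat template, as is standard for Qwen3-family checkpoints. The question itself is an illustrative example.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "AngelRaychev/qwen3-0.6b-sciq-v2"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

messages = [{
    "role": "user",
    "content": "What process do plants use to convert sunlight into chemical energy?",
}]
prompt = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

with torch.no_grad():
    output = model.generate(**inputs, max_new_tokens=128)

# Decode only the newly generated tokens, skipping the prompt.
answer = tokenizer.decode(
    output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
)
print(answer)
```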