SCL2025/KG-R1-CWQ-7B

  • Task: Text generation
  • Model size: 7.6B parameters
  • Quantization: FP8
  • Context length: 32k tokens
  • Concurrency cost: 1
  • License: apache-2.0
  • Architecture: Transformer (open weights)
  • Published: Apr 19, 2026

SCL2025/KG-R1-CWQ-7B is a 7.6-billion-parameter language model developed by SCL2025 for general language understanding and generation. Its 32,768-token context length lets it process long inputs and produce extended outputs, and it is aimed at handling complex queries with coherent, contextually relevant responses across a wide range of topics.


Model Overview

The model targets robust performance across common natural language processing tasks. Its 32,768-token context window lets it maintain coherence over long inputs and generate detailed, context-aware outputs.
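As a quick way to try the model, the sketch below loads it with Hugging Face transformers and generates a completion. It assumes the weights are published on the Hub under the id SCL2025/KG-R1-CWQ-7B and that transformers (plus accelerate, for device_map) is installed; the prompt and sampling settings are illustrative only.

```python
# Minimal inference sketch with Hugging Face transformers, assuming the
# weights are available on the Hub under "SCL2025/KG-R1-CWQ-7B".
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "SCL2025/KG-R1-CWQ-7B"  # assumed Hub id; adjust to the actual location

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",  # use the checkpoint's native precision
    device_map="auto",   # spread layers across available devices
)

prompt = "Summarize the main trade-offs of FP8 quantization."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Generate up to 256 new tokens; sampling settings are illustrative defaults.
outputs = model.generate(**inputs, max_new_tokens=256, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```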

Key Capabilities

  • General Language Understanding: interprets diverse textual inputs, from short prompts to long documents.
  • Text Generation: produces coherent, contextually relevant text.
  • Extended Context Handling: processes and generates content within the full 32,768-token window (see the serving sketch below).
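The extended-context capability is the one most sensitive to serving configuration, since engines often default to a shorter maximum length. Below is a hedged sketch of reserving the full 32k window with vLLM's Python API; it assumes the checkpoint is vLLM-compatible and that its FP8 quantization is recognized, and report.txt is a hypothetical input file.

```python
# Serving sketch with vLLM, assuming the checkpoint loads under this engine.
from vllm import LLM, SamplingParams

llm = LLM(
    model="SCL2025/KG-R1-CWQ-7B",  # assumed Hub id
    quantization="fp8",            # matches the FP8 weights listed above
    max_model_len=32768,           # reserve the full advertised context window
)

params = SamplingParams(temperature=0.7, max_tokens=512)

# A long document followed by an instruction exercises the extended context.
long_document = open("report.txt").read()  # hypothetical input file
outputs = llm.generate([f"{long_document}\n\nSummarize the key findings."], params)
print(outputs[0].outputs[0].text)
```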

Good For

  • Applications requiring deep contextual understanding.
  • Tasks involving long-form content generation or summarization.
  • General-purpose conversational AI and question-answering systems.
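For the conversational and question-answering use cases above, the following is a minimal chat sketch. It assumes the tokenizer ships with a chat template, which is common for instruction-tuned releases but worth verifying for this model.

```python
# Conversational QA sketch, assuming the tokenizer provides a chat template.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "SCL2025/KG-R1-CWQ-7B"  # assumed Hub id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto", device_map="auto")

messages = [
    {"role": "system", "content": "You are a concise question-answering assistant."},
    {"role": "user", "content": "What does a 32k-token context window let a model do?"},
]

# Render the conversation with the model's own chat template, then generate.
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
outputs = model.generate(input_ids, max_new_tokens=200)
print(tokenizer.decode(outputs[0][input_ids.shape[1]:], skip_special_tokens=True))
```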