yoriis/Gemma-Random-CPT-IT-0.3

Text Generation · Model Size: 9B · Quantization: FP8 · Context Length: 16k · Published: Jan 8, 2026 · Architecture: Transformer · Concurrency Cost: 1

yoriis/Gemma-Random-CPT-IT-0.3 is a 9-billion-parameter instruction-tuned language model based on the Gemma architecture, developed by yoriis. It supports a 16384-token context length, making it suitable for processing and generating long texts, and its instruction tuning suggests it is optimized for following complex directives across a variety of language tasks.


Model Overview

Built on the Gemma architecture with 9 billion parameters, yoriis/Gemma-Random-CPT-IT-0.3 is instruction-tuned to understand and execute a wide range of directives, making it versatile across natural language processing tasks.
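The listing does not include usage instructions, but if the model is published on the Hugging Face Hub under the identifier above, loading it would follow the standard `transformers` pattern. A minimal sketch, with the Hub identifier and precision settings as assumptions:

```python
# Minimal loading sketch, assuming the model is hosted on the Hugging Face Hub
# under its listed identifier. Adjust dtype and device placement to your hardware.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "yoriis/Gemma-Random-CPT-IT-0.3"  # assumed Hub identifier

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # the listing advertises FP8; bf16 is a safe fallback
    device_map="auto",           # requires the accelerate package
)
```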

Key Characteristics

  • Architecture: Based on the Gemma model family.
  • Parameter Count: 9 billion parameters, offering a balance between performance and computational efficiency.
  • Context Length: Supports an extended context window of 16384 tokens, enabling it to handle longer inputs and generate more coherent, contextually relevant outputs.
  • Instruction-Tuned: Optimized for following user instructions and performing specific tasks as directed (see the chat-template sketch after this list).
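Instruction-tuned Gemma variants are normally prompted through a chat template rather than raw text. Continuing from the loading sketch above, here is a minimal prompting example; it assumes this model ships a chat template in its tokenizer config, which the listing does not confirm:

```python
# Instruction-style prompting via the tokenizer's chat template (assumed present,
# as is typical for Gemma instruction-tuned models).
messages = [
    {"role": "user", "content": "Summarize the trade-offs of FP8 quantization."},
]
input_ids = tokenizer.apply_chat_template(
    messages,
    add_generation_prompt=True,  # append the assistant-turn marker
    return_tensors="pt",
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=256)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```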

Potential Use Cases

Given its instruction-tuned nature and substantial context length, this model could be beneficial for:

  • Complex Question Answering: Processing detailed queries and providing comprehensive answers.
  • Content Generation: Creating long-form articles, summaries, or creative text based on specific prompts.
  • Code Assistance: Understanding and generating code snippets or explanations, though the listing does not name this as a primary focus.
  • Conversational AI: Engaging in extended dialogues while maintaining context over many turns (a multi-turn sketch follows this list).
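To make the conversational use case concrete, here is a sketch of a multi-turn exchange that replays prior turns through the chat template, continuing from the loading sketch above. The `chat` helper is hypothetical, written for illustration only:

```python
# Multi-turn dialogue sketch: prior turns are replayed through the chat
# template each call, so the model keeps context up to its 16384-token window.
history = []

def chat(user_message, max_new_tokens=256):
    """Hypothetical helper: appends a user turn, generates, stores the reply."""
    history.append({"role": "user", "content": user_message})
    input_ids = tokenizer.apply_chat_template(
        history, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)
    output_ids = model.generate(input_ids, max_new_tokens=max_new_tokens)
    reply = tokenizer.decode(
        output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True
    )
    history.append({"role": "assistant", "content": reply})
    return reply

print(chat("Draft an outline for a long-form article on FP8 inference."))
print(chat("Expand the second section of that outline."))
```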