g4me/QwenRolina3-IRM-LR1e5-b64g8-order-domain-uff
Text Generation · Concurrency Cost: 1 · Model Size: 2B · Quant: BF16 · Ctx Length: 32k · Published: Feb 22, 2026 · Architecture: Transformer · Cold

The g4me/QwenRolina3-IRM-LR1e5-b64g8-order-domain-uff model is a 2 billion parameter language model with a 32768 token context length. Developed by g4me, the model does not yet have a detailed model card: its specific architecture, training data, and primary differentiators are not documented, so its intended use cases and capabilities cannot be fully assessed until more information is published.


Model Overview

The g4me/QwenRolina3-IRM-LR1e5-b64g8-order-domain-uff is a 2 billion parameter language model with a substantial context length of 32768 tokens. While the model card indicates it is a Hugging Face Transformers model, specific details regarding its architecture, training methodology, and unique characteristics are currently marked as "More Information Needed."

Key Characteristics

  • Parameter Count: 2 billion parameters.
  • Context Length: Supports a context window of 32768 tokens.
  • Developer: Attributed to 'g4me' based on the repository namespace in the model name.
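
As a rough illustration of what the figures above imply, the sketch below estimates the memory needed for the model weights alone from the stated parameter count and BF16 quantization (2 bytes per parameter). The exact parameter count is an assumption (the listing only says "2B"), and the estimate deliberately ignores activations, KV cache, and runtime overhead.

```python
# Back-of-envelope weight-memory estimate for a BF16 model.
# Assumes exactly 2e9 parameters; the true count for this model
# may differ from the rounded "2B" shown in the listing.

BYTES_PER_PARAM_BF16 = 2  # bfloat16 stores each weight in 16 bits

def weight_memory_gib(num_params: int,
                      bytes_per_param: int = BYTES_PER_PARAM_BF16) -> float:
    """Return the approximate weight footprint in GiB."""
    return num_params * bytes_per_param / (1024 ** 3)

if __name__ == "__main__":
    est = weight_memory_gib(2_000_000_000)
    print(f"~{est:.2f} GiB for weights alone")  # roughly 3.73 GiB
```

In practice, serving the model needs headroom beyond this figure, especially at the full 32768-token context, where the KV cache can add a significant amount of memory per concurrent request.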

Current Status and Limitations

According to the model card, comprehensive information about the model's development, intended uses, training data, evaluation results, and potential biases or limitations is not yet available. Users should treat its capabilities and appropriate applications as unverified until more details are released. The model card is automatically generated and awaits specific input from its developers before it can give a complete picture of the model's functionality and performance.