ZhichengLiao/grpo_numina_full_global_step_272_HF_format
Task: Text Generation · Concurrency Cost: 1 · Model Size: 2B · Quantization: BF16 · Context Length: 32k · Published: Mar 15, 2026 · Architecture: Transformer

ZhichengLiao/grpo_numina_full_global_step_272_HF_format is a 2-billion-parameter language model with a 32,768-token (32k) context length. It is distributed in the Hugging Face Transformers format, but the available documentation does not describe its architecture details, training data, or intended use cases, so its capabilities and differentiators relative to other models cannot be assessed from this listing alone.
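Since the model is published in the Transformers format, it can presumably be loaded with the standard `AutoModelForCausalLM` API. The sketch below is illustrative, not taken from the model card: it assumes public Hugging Face Hub access, enough memory for a 2B model in BF16, and generic generation settings.

```python
# Minimal loading sketch. Assumptions (not from the model card): the
# checkpoint is public on the Hugging Face Hub and fits in local memory.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "ZhichengLiao/grpo_numina_full_global_step_272_HF_format"


def generate(prompt: str, max_new_tokens: int = 256) -> str:
    """Load the model once and return a completion for `prompt`."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.bfloat16,  # matches the BF16 precision listed above
        device_map="auto",           # place weights on GPU if one is available
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)


if __name__ == "__main__":
    print(generate("What is 12 * 17? Show your reasoning."))
```

The prompt here is a placeholder; nothing in the listing specifies a prompt format or chat template, so check the repository's tokenizer config before relying on one.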
