laion/exp-uns-r2egym-2_1x_glm_4_7_traces_jupiter_cleaned
Text Generation · Concurrency Cost: 1 · Model Size: 8B · Quant: FP8 · Ctx Length: 32k · Published: Feb 27, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights

This 8-billion-parameter model, released by laion, is a fine-tuned version of Qwen/Qwen3-8B with a 32,768-token context length. It was trained on the dataset cached locally at /data/cat/ws/befe330h-befe330h-otagent/huggingface/hub/datasets--DCAgent--exp-uns-r2egym-2_1x_glm_4.7_traces_jupiter_cleaned (the Hugging Face cache layout implies the dataset id DCAgent/exp-uns-r2egym-2_1x_glm_4.7_traces_jupiter_cleaned). Because it was fine-tuned on this single dataset, the model is specialized for the tasks that dataset covers rather than intended for general-purpose use.
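A minimal usage sketch with the Hugging Face transformers library, assuming the model is published under the repo id shown in the title of this card (the actual Hub id, chat template, and recommended generation settings may differ):

```python
# Hedged sketch: MODEL_ID is taken from this card's title and is an
# assumption about the actual Hub repository id.
MODEL_ID = "laion/exp-uns-r2egym-2_1x_glm_4_7_traces_jupiter_cleaned"
CTX_LEN = 32768  # context length stated on this card


def generate(prompt: str, max_new_tokens: int = 256) -> str:
    """Load the model with transformers and return a text completion."""
    # Imports are deferred so the constants above can be inspected
    # without torch/transformers installed.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype="auto", device_map="auto"
    )
    # Truncate the prompt to the model's stated context window.
    inputs = tokenizer(
        prompt, return_tensors="pt", truncation=True, max_length=CTX_LEN
    ).to(model.device)
    with torch.no_grad():
        out = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt.
    return tokenizer.decode(
        out[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )
```

Loading an 8B model in FP8 or bf16 requires a GPU with roughly 10-20 GB of memory depending on precision; `device_map="auto"` lets transformers place layers across available devices.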
