laion/exp-swd-r2egym-wo-docker_glm_4_7_traces
Task: Text generation
Concurrency cost: 1
Model size: 8B
Quantization: FP8
Context length: 32k
Published: Jan 21, 2026
License: apache-2.0
Architecture: Transformer
Open weights

laion/exp-swd-r2egym-wo-docker_glm_4_7_traces is an 8-billion-parameter language model fine-tuned from Qwen/Qwen3-8B on the DCAgent/exp-swd-r2egym-wo-docker_glm_4.7_traces dataset. The dataset name suggests a specialization in software-engineering agent traces (R2E-Gym-style, generated without Docker), although the card does not state this explicitly. The model supports a context length of 32768 tokens.
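As a small illustration of working within the card's 32768-token context window, here is a hedged Python sketch. The helper name `truncate_to_context` and the `reserve` parameter are our own illustrative choices, not part of the model's tooling; the token-id sequence would normally come from the model's tokenizer.

```python
MODEL_ID = "laion/exp-swd-r2egym-wo-docker_glm_4_7_traces"  # from the card above
MAX_CTX = 32768  # context length stated on the card


def truncate_to_context(input_ids, max_ctx=MAX_CTX, reserve=512):
    """Keep the most recent tokens, leaving `reserve` slots for generation.

    `input_ids` is any sequence of token ids (e.g. from a tokenizer);
    the function itself is framework-agnostic.
    """
    budget = max_ctx - reserve
    if budget <= 0:
        raise ValueError("reserve must be smaller than max_ctx")
    ids = list(input_ids)
    # Drop the oldest tokens when the input exceeds the budget.
    return ids[-budget:]


# Example: a 40k-token input is trimmed to the newest 32256 tokens.
trimmed = truncate_to_context(range(40_000))
print(len(trimmed))  # 32768 - 512 = 32256
```

In a real pipeline, the trimmed ids would be passed to the model's `generate` call; the `reserve` value simply caps how many new tokens can be produced without overflowing the window.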
