laion/Qwen3-8B_exp-swd-swesmith-wo-docker_glm_4.7_traces_locetash_save-strategy_steps
Text Generation · Open Weights

- Concurrency Cost: 1
- Model Size: 8B
- Quantization: FP8
- Context Length: 32k
- Published: Jan 9, 2026
- License: apache-2.0
- Architecture: Transformer
- Status: Cold
This is an 8-billion-parameter Qwen3-based language model fine-tuned by laion. It was trained on the DCAgent/exp-swd-swesmith-wo-docker_glm_4.7_traces_locetash dataset, which suggests a specialization in software-development tasks, potentially involving Docker-less environments, GLM traces, or specific save strategies. Its 32,768-token context length supports processing extensive inputs for these specialized applications.