laion/swesmith-1000-opt1k__Qwen3-8B
Text generation | Concurrency cost: 1 | Model size: 8B | Quantization: FP8 | Context length: 32k | Published: Mar 27, 2026 | License: other | Architecture: Transformer | Cold
laion/swesmith-1000-opt1k__Qwen3-8B is an 8-billion-parameter language model fine-tuned from Qwen/Qwen3-8B. It was trained on a thinking-preprocessed snapshot of the laion/swesmith-unified-1000 dataset (snapshot 031ef1b66d8d55421f68d0afcbf7872ef3644c1e). The model supports a 32,768-token context length and, following its Qwen3 base architecture, is intended for general language generation tasks.
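A minimal usage sketch with the Hugging Face transformers library is shown below. The model id and 32k context length come from this card; the `generate` helper, its parameters, and the generation settings are illustrative assumptions, not an official example.

```python
# Sketch: running laion/swesmith-1000-opt1k__Qwen3-8B with transformers.
# Only MODEL_ID and MAX_CONTEXT come from the model card; the rest is
# an illustrative assumption.

MODEL_ID = "laion/swesmith-1000-opt1k__Qwen3-8B"
MAX_CONTEXT = 32768  # 32k-token context window


def generate(prompt: str, max_new_tokens: int = 256) -> str:
    """Load the model and return a completion for `prompt`."""
    # Imports are local so the constants above are usable even when
    # transformers/torch are not installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")

    # Truncate the prompt to the model's context window.
    inputs = tokenizer(
        prompt, return_tensors="pt", truncation=True, max_length=MAX_CONTEXT
    ).to(model.device)
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)

    # Decode only the newly generated tokens, not the echoed prompt.
    new_tokens = output_ids[0][inputs["input_ids"].shape[-1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)
```

Calling `generate("Write a haiku about code review.")` would download the weights on first use; an FP8-quantized 8B model still requires a GPU with sufficient memory.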