laion/r2egym-nl2bash-stack-bugsseq-fixthink
Task: Text Generation
Concurrency Cost: 1
Model Size: 8B
Quantization: FP8
Context Length: 32k
Published: Feb 17, 2026
License: apache-2.0
Architecture: Transformer
Open Weights | Cold

The laion/r2egym-nl2bash-stack-bugsseq-fixthink model is an 8-billion-parameter language model fine-tuned from Qwen/Qwen3-8B. It was trained on specialized datasets including r2egym, nl2bash, StackExchange, and inferred bugs, indicating a focus on code-related tasks, shell-command generation, and problem-solving. With a 32,768-token context length, the model is designed for complex reasoning and for generating solutions in technical domains.
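As a minimal sketch of how such a model might be queried for a command-generation task, the snippet below assembles a chat-completions request payload in the OpenAI-compatible style many hosted endpoints accept. The endpoint, system prompt, and sampling parameters are illustrative assumptions, not part of the model card; only the model ID and context length come from this page.

```python
# Sketch: building a chat-completions request for this model.
# The system prompt and sampling settings are assumptions for illustration;
# no endpoint URL or API key is included, only the request payload.
import json

MODEL_ID = "laion/r2egym-nl2bash-stack-bugsseq-fixthink"
MAX_CONTEXT = 32768  # 32k-token context length stated on the model card

def build_request(task_description: str, max_new_tokens: int = 512) -> dict:
    """Assemble an OpenAI-style payload for a shell/code-generation task."""
    if not 0 < max_new_tokens < MAX_CONTEXT:
        raise ValueError("max_new_tokens must leave room for the prompt")
    return {
        "model": MODEL_ID,
        "messages": [
            {"role": "system",
             "content": "You generate correct bash commands and code fixes."},
            {"role": "user", "content": task_description},
        ],
        "max_tokens": max_new_tokens,
        "temperature": 0.2,  # low temperature for reproducible command output
    }

payload = build_request("List the 5 largest files under /var/log.")
print(json.dumps(payload, indent=2))
```

Sending this payload (e.g. with an HTTP client against a compatible endpoint) would return the model's generated command; the structure above is the only part the model card itself implies.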
