laion/r2egym-nl2bash-stack-bugsseq-fixthink-again
Text generation · Concurrency cost: 1 · Model size: 8B · Quantization: FP8 · Context length: 32k · Published: Feb 27, 2026 · License: apache-2.0 · Architecture: Transformer · Open weights

laion/r2egym-nl2bash-stack-bugsseq-fixthink-again is an 8-billion-parameter language model fine-tuned from Qwen/Qwen3-8B. It is trained on a combination of datasets including r2egym, nl2bash, stackexchange-overflow-sandboxes, and inferredbugs, and is optimized for code generation, bug fixing, and natural-language-to-bash translation. Its 32,768-token context length supports complex, multi-file problem solving and detailed code analysis.