laion/gpt-oss-120B-stack-overflow-32ep-131k-summtrc-fixthink1
Text generation · Open weights · Warm

- Model size: 8B
- Quantization: FP8
- Context length: 32k
- License: apache-2.0
- Architecture: Transformer
- Concurrency cost: 1

The laion/gpt-oss-120B-stack-overflow-32ep-131k-summtrc-fixthink1 model is an 8-billion-parameter language model fine-tuned from Qwen/Qwen3-8B. It was trained on the penfever/gpt-oss-120B-stack-overflow-32ep-131k-summtrc-fixthink1 dataset, which indicates a specialization in Stack Overflow-style content. With a context length of 32,768 tokens, the model is likely suited to processing and generating technical material, such as code-related tasks or question answering over programming knowledge.
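
As a sketch of how such a model might be used for programming Q&A, the snippet below loads it with Hugging Face transformers and generates an answer to a Stack Overflow-style question. The model ID is taken from this card; the availability of a chat template, the dtype, and the generation settings are assumptions that may need adjusting for a specific deployment.

```python
# Minimal usage sketch. Assumes the model is published on the Hugging Face
# Hub under the ID shown on this card and ships a chat template (both are
# assumptions, not confirmed by the card).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "laion/gpt-oss-120B-stack-overflow-32ep-131k-summtrc-fixthink1"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # assumed dtype; the card lists FP8 for the hosted quant
    device_map="auto",
)

# A Stack Overflow-style programming question.
messages = [
    {
        "role": "user",
        "content": "In Python, how do I remove duplicates from a list "
                   "while preserving order?",
    }
]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(input_ids, max_new_tokens=512, do_sample=False)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][input_ids.shape[-1]:], skip_special_tokens=True))
```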
