DCAgent/a1-stack_pytest_gpt5mini
Task: Text generation
Model size: 8B parameters
Quantization: FP8
Context length: 32k
Concurrency cost: 1
Published: Mar 26, 2026
License: other
Architecture: Transformer

DCAgent/a1-stack_pytest_gpt5mini is an 8 billion parameter language model fine-tuned from Qwen/Qwen3-8B. It is optimized for tasks involving pytest and GPT-5 mini traces, and was trained on a dataset specialized for these domains. The model is intended for applications that analyze or generate content in these technical contexts, and it supports a context length of 32768 tokens.
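One practical consequence of the 32768-token context window is that callers must budget generation length against prompt length. The sketch below is a minimal, hypothetical helper illustrating that arithmetic; the `build_generation_kwargs` function and its defaults are assumptions for illustration, not part of the model's published API, and actually serving an FP8 checkpoint typically requires a compatible backend such as vLLM.

```python
# Hypothetical usage sketch for DCAgent/a1-stack_pytest_gpt5mini.
# The helper below is illustrative only; it is not part of any published API.

MODEL_ID = "DCAgent/a1-stack_pytest_gpt5mini"
MAX_CONTEXT = 32768  # context length stated on the model card


def build_generation_kwargs(prompt_tokens: int, max_new_tokens: int = 512) -> dict:
    """Clamp max_new_tokens so prompt + completion fits the 32k context window."""
    budget = MAX_CONTEXT - prompt_tokens
    return {
        "max_new_tokens": max(0, min(max_new_tokens, budget)),
        "do_sample": False,  # deterministic decoding for trace-analysis use cases
    }


# A 32000-token prompt leaves 768 tokens of headroom, so the default 512 fits.
print(build_generation_kwargs(32000))
# A 32600-token prompt leaves only 168 tokens, so the request is clamped.
print(build_generation_kwargs(32600))
```

Passing these kwargs to a generation call (e.g. `model.generate(**kwargs)` in a transformers-style API) avoids requesting completions that would overflow the context window.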
