DCAgent2/stack-bugs-undr7030
TEXT GENERATION · Concurrency Cost: 1 · Model Size: 8B · Quant: FP8 · Ctx Length: 32k · Published: Nov 30, 2025 · Architecture: Transformer · Cold

DCAgent2/stack-bugs-undr7030 is an 8-billion-parameter language model developed by DCAgent2. The model was trained from scratch using a cosine learning-rate schedule and the AdamW_TORCH_FUSED optimizer. The available documentation does not specify its primary differentiators, intended uses, or training dataset.
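The card's training setup can be sketched in PyTorch, where AdamW_TORCH_FUSED corresponds to `torch.optim.AdamW(..., fused=True)` paired with `CosineAnnealingLR`. This is a minimal illustration only: the model, learning rate, and schedule length below are placeholders, not values from the actual training run.

```python
import torch

# Stand-in model; the real 8B transformer is not shown in the card.
model = torch.nn.Linear(8, 8)

# Fused AdamW (the fused kernel requires CUDA, so fall back on CPU)
# plus a cosine learning-rate schedule, as stated on the model card.
optimizer = torch.optim.AdamW(
    model.parameters(), lr=3e-4, fused=torch.cuda.is_available()
)
scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=1000)

for step in range(3):  # placeholder for the real training loop
    loss = model(torch.randn(4, 8)).pow(2).mean()
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
    scheduler.step()  # LR follows a cosine curve toward zero at T_max
```

Under this schedule the learning rate decays smoothly from its initial value, which is a common choice for from-scratch pretraining runs.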
