DCAgent2/stack-bugsseq
Text generation · Concurrency cost: 1 · Model size: 8B · Quant: FP8 · Ctx length: 32k · Published: Nov 30, 2025 · Architecture: Transformer · Cold

DCAgent2/stack-bugsseq is an 8-billion-parameter language model trained from scratch for general language understanding tasks. Specific differentiators are not documented, but training from scratch (rather than fine-tuning an existing base model) indicates a foundational-model approach. With a 32,768-token context length, it is suited to processing moderately long sequences of text.
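As a rough sizing sketch, the listed parameter count and FP8 quantization imply a weight footprint of about 7.5 GiB (assuming one byte per parameter for FP8 weights, and ignoring KV cache, activations, and runtime overhead):

```python
# Back-of-envelope weight memory for an 8B model in FP8.
# Assumption: a nominal 8e9 parameter count and 1 byte per FP8 weight;
# real checkpoints add small overheads (embeddings, norms, metadata).
params = 8_000_000_000
bytes_per_param = 1  # FP8 stores one byte per weight
weight_gib = params * bytes_per_param / 1024**3
print(f"{weight_gib:.1f} GiB")  # roughly 7.5 GiB of weights
```

By the same arithmetic, an FP16 copy of the weights would need about twice that, which is the main practical benefit of the FP8 quantization noted above.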
