DCAgent2/nl2bash-stack-bugsseq
Text Generation
Concurrency Cost: 1
Model Size: 8B
Quantization: FP8
Context Length: 32k
Published: Nov 30, 2025
Architecture: Transformer
Status: Cold

The DCAgent2/nl2bash-stack-bugsseq model is an 8-billion-parameter language model with a 32,768-token context length. It was trained from scratch using a cosine learning-rate scheduler and the AdamW_Torch_Fused optimizer. Beyond these training details, the available documentation does not specify the model's architecture, primary use case, or key characteristics.
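For reference, the cosine learning-rate schedule mentioned above decays the learning rate from a peak value toward a floor following half a cosine wave over training. A minimal sketch in plain Python; the peak and step values below are illustrative assumptions, not figures from the model card:

```python
import math

def cosine_lr(step, total_steps, peak_lr, min_lr=0.0):
    """Cosine learning-rate schedule: decays from peak_lr to min_lr
    over total_steps, following half a cosine wave."""
    progress = min(step / total_steps, 1.0)
    return min_lr + 0.5 * (peak_lr - min_lr) * (1.0 + math.cos(math.pi * progress))

# Hypothetical values for illustration:
lr_start = cosine_lr(0, 1000, 3e-4)    # peak LR at step 0
lr_mid = cosine_lr(500, 1000, 3e-4)    # half the peak at the midpoint
lr_end = cosine_lr(1000, 1000, 3e-4)   # fully decayed to min_lr
```

In practice this schedule is usually provided by the training framework (e.g. PyTorch's `CosineAnnealingLR`) rather than implemented by hand; the function above only shows the shape of the decay.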
