dgambettaphd/M_qw34_run0_gen0_WXS_doc1000_synt64_lr1e-04_acm_MPP
Task: Text generation
Concurrency cost: 1
Model size: 4B
Quantization: BF16
Context length: 32k
Published: Mar 4, 2026
Architecture: Transformer
dgambettaphd/M_qw34_run0_gen0_WXS_doc1000_synt64_lr1e-04_acm_MPP is a 4-billion-parameter language model with a 32,768-token context length. This model card is automatically generated; the model's specific architecture, training data, and primary differentiators are not documented here. Its intended use cases and distinguishing capabilities remain unspecified, so further information is needed for a proper evaluation.
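Since the card lists a Hugging Face-style repository id, a BF16 quantization, and a 32k context, one plausible way to try the checkpoint is through the `transformers` library. This is a hedged sketch, not a documented loading recipe: the card does not confirm which framework or model class the checkpoint targets, and the `load_model` helper below is hypothetical.

```python
# Hypothetical usage sketch. Assumptions not stated on the card:
# that the checkpoint loads via Hugging Face transformers, and that
# AutoModelForCausalLM is the right model class for it.

MODEL_ID = "dgambettaphd/M_qw34_run0_gen0_WXS_doc1000_synt64_lr1e-04_acm_MPP"
MAX_CONTEXT = 32_768  # context length stated on the card


def load_model():
    """Load tokenizer and model in bfloat16 (the quantization listed above)."""
    # Imports are kept inside the helper so the module can be inspected
    # without torch/transformers installed.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype=torch.bfloat16
    )
    return tokenizer, model


if __name__ == "__main__":
    tokenizer, model = load_model()
    inputs = tokenizer("Hello, world.", return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=32)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The `torch_dtype=torch.bfloat16` argument matches the BF16 quantization listed on the card; without it, `from_pretrained` may default to FP32 and double the memory footprint of a 4B model.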