dgambettaphd/M_mis73_run0_gen0_WXS_doc1000_synt64_lr1e-04_acm_MPP
Task: Text generation · Model size: 7B · Quantization: FP8 · Context length: 4k · Concurrency cost: 1 · Architecture: Transformer · Published: Mar 21, 2026

The dgambettaphd/M_mis73_run0_gen0_WXS_doc1000_synt64_lr1e-04_acm_MPP model is a 7-billion-parameter language model with a 4096-token context length. Developed by dgambettaphd, the model's specific architecture beyond its Transformer base, its training data, and its primary differentiators are not detailed in the current documentation. Its intended use cases and distinguishing capabilities are likewise unspecified, so users should seek further information before relying on it for a particular application.
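Although the card does not document a usage pattern, a checkpoint published on the Hugging Face Hub as a standard causal language model can typically be loaded with the `transformers` library. The sketch below is an assumption-laden illustration, not an official example: the repo id comes from the card, `MAX_CONTEXT` reflects the stated 4k context length, and the `generate` helper is hypothetical.

```python
# Hypothetical usage sketch for a Hub-hosted causal LM; untested
# against this specific repository.

MODEL_ID = "dgambettaphd/M_mis73_run0_gen0_WXS_doc1000_synt64_lr1e-04_acm_MPP"
MAX_CONTEXT = 4096  # context length stated on the model card


def generate(prompt: str, max_new_tokens: int = 64) -> str:
    """Lazily load the checkpoint and return a text completion."""
    # Import inside the function so the module loads even when
    # `transformers` is not installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID)
    inputs = tokenizer(
        prompt, return_tensors="pt", truncation=True, max_length=MAX_CONTEXT
    )
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)
```

Downloading a 7B checkpoint requires substantial disk and memory; the FP8 quantization noted above may additionally require a runtime that supports it.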
