dgambettaphd/M_mis72_run0_gen0_WXS_doc1000_synt64_lr1e-04_acm_MPP
Task: Text Generation
Concurrency Cost: 1
Model Size: 7B
Quantization: FP8
Context Length: 4k
Published: Mar 20, 2026
Architecture: Transformer
Status: Cold

The dgambettaphd/M_mis72_run0_gen0_WXS_doc1000_synt64_lr1e-04_acm_MPP model is a 7-billion-parameter language model published by dgambettaphd. It is a Hugging Face transformers model that was automatically generated and pushed to the Hub. Because the model card provides little information, its specific architectural details, training data, and primary differentiators are unknown, and its intended use cases and performance characteristics cannot be assessed without further documentation.
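Since the repo is listed as a text-generation transformers model, it can likely be loaded with the standard Auto classes. The following is a minimal sketch, assuming the checkpoint follows the usual causal-LM layout; the sparse model card does not confirm this, and the prompt and generation settings are purely illustrative.

```python
# Minimal loading sketch (assumes a standard transformers causal-LM checkpoint).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "dgambettaphd/M_mis72_run0_gen0_WXS_doc1000_synt64_lr1e-04_acm_MPP"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # use the dtype stored in the checkpoint
    device_map="auto",    # place weights on available GPU(s), else CPU
)

# Illustrative generation call; parameters are not taken from the model card.
prompt = "Hello, world."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

If loading fails, the repo may require a custom configuration or additional files that the auto-generated card does not document.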
