dgambettaphd/M_llm2_run0_gen0_WXS_doc1000_synt64_lr1e-04_acm_MPP01pcLAST
Task: Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quantization: FP8 · Context Length: 4k · Published: Mar 11, 2026 · Architecture: Transformer · Status: Cold

dgambettaphd/M_llm2_run0_gen0_WXS_doc1000_synt64_lr1e-04_acm_MPP01pcLAST is a 7-billion-parameter language model with a 4096-token context length, published by dgambettaphd in the Hugging Face Transformers format. Its model card contains little information: architectural details, training data, and primary differentiators are not stated, so the model's distinguishing capabilities and optimal use cases remain undefined.
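Because the model is distributed in the standard Hugging Face Transformers format, it can presumably be loaded with the usual AutoModel API. The sketch below is illustrative only: the model ID is taken from this page, but the sparse model card does not confirm which files the repository contains or that it supports causal generation.

```python
# Minimal loading sketch, assuming the repository exposes standard
# config/tokenizer files and a causal-LM head (not confirmed by the model card).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "dgambettaphd/M_llm2_run0_gen0_WXS_doc1000_synt64_lr1e-04_acm_MPP01pcLAST"

tokenizer = AutoTokenizer.from_pretrained(model_id)
# device_map="auto" requires the accelerate package to be installed.
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # let Transformers pick the checkpoint's dtype
    device_map="auto",
)

prompt = "Hello, world!"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Keep prompts within the 4096-token context window noted above; longer inputs would need truncation before generation.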
