dgambettaphd/M_qw34_run0_gen0_WXS_doc1000_synt64_lr1e-04_acm_SYNLAST
Text Generation · Concurrency Cost: 1 · Model Size: 4B · Quant: BF16 · Ctx Length: 32k · Published: Mar 2, 2026 · Architecture: Transformer · Status: Warm

The dgambettaphd/M_qw34_run0_gen0_WXS_doc1000_synt64_lr1e-04_acm_SYNLAST model is a 4-billion-parameter language model with a 32,768-token context length. It is a Hugging Face Transformers model that was automatically generated and pushed to the Hub. Because its model card provides limited information, specific architectural details, training data, and primary differentiators beyond its size and context window are not available.
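As a minimal sketch of loading the model for text generation with the standard Hugging Face Transformers API (assuming the checkpoint ships a compatible tokenizer and config, which this card does not confirm):

```python
# Minimal sketch: load the model from the Hub for text generation.
# Assumes the repo includes a usable tokenizer/config; not verified
# against this specific checkpoint.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "dgambettaphd/M_qw34_run0_gen0_WXS_doc1000_synt64_lr1e-04_acm_SYNLAST"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the BF16 quantization listed above
    device_map="auto",           # place the 4B model on available device(s)
)

prompt = "The quick brown fox"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The BF16 dtype keeps memory use at roughly 2 bytes per parameter (about 8 GB for a 4B model), and `device_map="auto"` lets Transformers place the weights on whatever accelerator is available.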
