ZhichengLiao/Code_Math_FFT_lr1e-6_global_step_272
Text Generation · Concurrency Cost: 1 · Model Size: 2B · Quant: BF16 · Ctx Length: 32k · Published: Mar 22, 2026 · Architecture: Transformer
ZhichengLiao/Code_Math_FFT_lr1e-6_global_step_272 is a 2-billion-parameter language model published by ZhichengLiao, with a context length of 32768 tokens. The repository name suggests a full fine-tuning (FFT) checkpoint, saved at global step 272 with a learning rate of 1e-6, likely trained on code and math data. Because the model card provides no further details, its primary differentiators and intended use cases are not explicitly defined, suggesting it may be a foundational or experimental model.