ZhichengLiao/Merged_FFTMath_FFTCode_lr1-e-6_randomPartitioned_qwen317B
Text generation · Concurrency cost: 1 · Model size: 2B · Quantization: BF16 · Context length: 32k · Published: Apr 1, 2026 · Architecture: Transformer

ZhichengLiao/Merged_FFTMath_FFTCode_lr1-e-6_randomPartitioned_qwen317B is a 2-billion-parameter language model developed by ZhichengLiao. It belongs to the Qwen family and supports a 32,768-token context length. The model card does not describe its specific differentiators or primary use cases, so further details on its development and capabilities are unavailable.
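Since the card provides no usage instructions, the following is only a hedged sketch of how one might load the checkpoint, assuming it is published on the Hugging Face Hub under the repository id shown in the listing and is compatible with the Transformers `AutoModelForCausalLM` API (both are assumptions, not stated in the card):

```python
# Hedged sketch (not from the model card): loading this checkpoint with the
# Hugging Face Transformers library. The repository id and BF16 dtype come
# from the listing above; compatibility with AutoModelForCausalLM is assumed.

MODEL_ID = "ZhichengLiao/Merged_FFTMath_FFTCode_lr1-e-6_randomPartitioned_qwen317B"
CTX_LENGTH = 32768  # 32k-token context window stated in the listing


def load_model(model_id: str = MODEL_ID):
    """Load tokenizer and model in BF16, matching the listed quantization."""
    # Imported lazily so this sketch can be inspected without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="bfloat16")
    return tokenizer, model
```

Loading in `bfloat16` mirrors the BF16 quantization listed above; whether the weights require a specific Qwen3 revision of Transformers is not stated and would need to be checked against the repository's config.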