axon1/affine_m20_5DyWSHXrf8mP85MV427AEho7NcogpZdoqRrjEpMSYsxNLRNw
Text Generation · Concurrency Cost: 2 · Model Size: 32B · Quant: FP8 · Ctx Length: 32k · Published: Apr 4, 2026 · License: MIT · Architecture: Transformer · Open Weights · Cold

axon1/affine_m20_5DyWSHXrf8mP85MV427AEho7NcogpZdoqRrjEpMSYsxNLRNw is a 32-billion-parameter language model with a 32,768-token context window, developed by axon1 for general language understanding and generation. The large parameter count and extended context window let it follow long, complex prompts and produce coherent, detailed responses, making it suitable for tasks that require deep contextual comprehension and extensive output generation.
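For sizing hardware, the FP8 quantization listed above matters: a rough weights-only footprint can be estimated from the parameter count and the byte width per parameter. The sketch below assumes the standard widths (FP8 = 1 byte, FP16 = 2 bytes per parameter) and covers weights only; KV cache for the 32k context, activations, and runtime overhead come on top, and the model's exact on-disk or in-memory size may differ.

```python
def weights_gib(params: float, bytes_per_param: float) -> float:
    """Approximate weight-only memory footprint in GiB."""
    return params * bytes_per_param / 2**30

PARAMS = 32e9  # 32 billion parameters

for name, width in [("FP16", 2), ("FP8", 1)]:
    # FP8 halves the weight footprint relative to FP16
    print(f"{name}: ~{weights_gib(PARAMS, width):.1f} GiB")
```

At FP8 the weights alone come to roughly 30 GiB, versus roughly 60 GiB at FP16, which is the usual motivation for serving a 32B model in FP8.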
