Tok331102/affine-5H3rBY2GJoek64NWfHPBEVDzXFafDWAdWPNZTcY1vcC6FPrJ
Text Generation | Concurrency Cost: 2 | Model Size: 32B | Quant: FP8 | Ctx Length: 32k | Published: Mar 16, 2026 | Architecture: Transformer | Cold
Tok331102/affine-5H3rBY2GJoek64NWfHPBEVDzXFafDWAdWPNZTcY1vcC6FPrJ is a 32-billion-parameter language model developed by Tok331102. It is a general-purpose transformer-based model designed for a broad range of natural language processing tasks. Because its model card lacks specific details, its primary differentiators and optimized use cases are not explicitly defined; it serves as a foundational model for further fine-tuning or general text generation.