yufeng1/OpenThinker-7B-type6-e5-max-alpha0_25-2
Text Generation | Concurrency Cost: 1 | Model Size: 7.6B | Quant: FP8 | Ctx Length: 32k | Published: Feb 16, 2026 | Architecture: Transformer | Cold

yufeng1/OpenThinker-7B-type6-e5-max-alpha0_25-2 is a 7.6-billion-parameter language model with a 32,768-token context length. It is part of the OpenThinker series and is intended for general language understanding and generation tasks. Its specific architecture and training details are not provided, but it is positioned for broad use in natural language processing.
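The listed specs (7.6B parameters, FP8 quantization) allow a rough serving-footprint estimate. The sketch below is a back-of-envelope calculation using standard dtype widths; the byte-per-parameter figures are general dtype sizes, not numbers taken from this model card, and real deployments also need memory for the KV cache and activations on top of the weights.

```python
# Back-of-envelope weight-memory estimate from the listed specs:
# 7.6B parameters, FP8 quantization. Dtype widths are standard sizes,
# not figures published for this specific model.

def weight_memory_gb(num_params: float, bytes_per_param: float) -> float:
    """Approximate weight memory in GiB for a given dtype width."""
    return num_params * bytes_per_param / 1024**3

params = 7.6e9  # 7.6B parameters, as listed

fp8_gb = weight_memory_gb(params, 1.0)   # FP8: 1 byte per parameter
fp16_gb = weight_memory_gb(params, 2.0)  # FP16/BF16, for comparison

print(f"FP8 weights:  ~{fp8_gb:.1f} GiB")   # roughly 7.1 GiB
print(f"FP16 weights: ~{fp16_gb:.1f} GiB")  # roughly 14.2 GiB
```

This illustrates why FP8 quantization roughly halves the weight footprint relative to FP16, which matters for fitting a 7.6B model on a single accelerator.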
