yufeng1/OpenThinker-7B-type6-e5-max-alpha0_25-textsummarization-type6-e1-alpha0_28125-2
Text Generation
Concurrency cost: 1
Model size: 7.6B
Quantization: FP8
Context length: 32k
Published: Apr 12, 2026
Architecture: Transformer
Status: Cold

yufeng1/OpenThinker-7B-type6-e5-max-alpha0_25-textsummarization-type6-e1-alpha0_28125-2 is a 7.6-billion-parameter language model published by yufeng1. It is fine-tuned for text summarization, using its 32,768-token context window to ingest and condense long documents into concise, coherent summaries. This makes it suitable for applications that require efficient information extraction and content reduction.
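The model card does not document a prompt format, so as a minimal sketch, here is how a caller might budget a long document against the 32,768-token context window before summarization. The `RESERVED_FOR_OUTPUT` margin and the whitespace-word token proxy are assumptions for illustration; a real pipeline would count tokens with the model's own tokenizer.

```python
# Sketch: fitting long documents into the model's 32,768-token context
# before summarization. Token counts are approximated here by whitespace
# words (an assumption); use the model's tokenizer for exact counts.

CTX_LENGTH = 32_768          # context window stated on the model card
RESERVED_FOR_OUTPUT = 1_024  # assumed margin kept free for the summary


def chunk_document(text: str,
                   max_tokens: int = CTX_LENGTH - RESERVED_FOR_OUTPUT) -> list[str]:
    """Split `text` into chunks that each fit the remaining context budget."""
    words = text.split()
    return [
        " ".join(words[start:start + max_tokens])
        for start in range(0, len(words), max_tokens)
    ]


doc = " ".join(["word"] * 70_000)  # a document too long for one pass
chunks = chunk_document(doc)
print(len(chunks))                 # number of summarization passes needed
```

Each chunk would then be summarized in its own request, with the per-chunk summaries optionally concatenated and summarized once more in a final pass.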
