yufeng1/OpenThinker-7B-type6-e5-max-alpha0_25-textsummarization-type6-e1-alpha0_75-2
Text Generation · Concurrency Cost: 1 · Model Size: 7.6B · Quant: FP8 · Ctx Length: 32k · Published: Apr 12, 2026 · Architecture: Transformer
yufeng1/OpenThinker-7B-type6-e5-max-alpha0_25-textsummarization-type6-e1-alpha0_75-2 is a 7.6-billion-parameter language model fine-tuned for text summarization. As its name indicates, it combines two training configurations (type6-e5-max-alpha0_25 and type6-e1-alpha0_75). With a context length of 32,768 tokens, it can ingest long-form documents in a single pass, and its primary strength is condensing such inputs into concise, relevant summaries.
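A minimal sketch of working within the 32k-token context window. The helper names (`fits_context`, `chunk_words`) and the words-to-tokens ratio are illustrative assumptions, not part of the model release; an exact count requires the model's own tokenizer. The commented-out pipeline call assumes the model is hosted on the Hugging Face Hub under the ID above.

```python
# Hedged sketch: helper names and the token heuristic below are assumptions,
# not part of the model release. Exact counts need the model's tokenizer.

MODEL_ID = ("yufeng1/OpenThinker-7B-type6-e5-max-alpha0_25"
            "-textsummarization-type6-e1-alpha0_75-2")
CTX_TOKENS = 32_768       # model context length
TOKENS_PER_WORD = 1.3     # rough English-text heuristic, not a tokenizer count

def fits_context(text: str, reserve_for_output: int = 1024) -> bool:
    """Approximate check that `text` plus a summary budget fits in the window."""
    est_tokens = int(len(text.split()) * TOKENS_PER_WORD)
    return est_tokens + reserve_for_output <= CTX_TOKENS

def chunk_words(text: str, max_tokens: int = 8_000) -> list[str]:
    """Split an over-long document into word-aligned chunks under a token budget."""
    words = text.split()
    step = int(max_tokens / TOKENS_PER_WORD)
    return [" ".join(words[i:i + step]) for i in range(0, len(words), step)]

if __name__ == "__main__":
    # Assumed Hub usage; requires `pip install transformers torch` and
    # downloads the 7.6B FP8 weights, so it is left commented out here.
    # from transformers import pipeline
    # summarizer = pipeline("summarization", model=MODEL_ID)
    # print(summarizer(document)[0]["summary_text"])
    print(fits_context("a short document"))
```

Documents that fail `fits_context` can be summarized chunk-by-chunk and the partial summaries condensed in a final pass, a common map-reduce pattern for long-input summarization.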