jiosephlee/Intern-S1-mini-lm
Text Generation · Concurrency Cost: 1 · Model Size: 8B · Quant: FP8 · Ctx Length: 32k · Published: Nov 1, 2025 · Architecture: Transformer

jiosephlee/Intern-S1-mini-lm is an 8-billion-parameter language model with a 32,768-token context length, served with FP8 quantization. It is a general-purpose language model; its current documentation does not detail specific differentiators or optimizations. It is suitable for natural language processing tasks where a model of this size and context window is appropriate.
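As a rough guide to what the 8B/FP8 spec implies for deployment, a back-of-envelope weight-memory estimate can be sketched as follows. This is an illustrative calculation, not a figure from the model's documentation: it counts weights only (1 byte per parameter at FP8) and ignores activations, the KV cache, and runtime overhead.

```python
# Back-of-envelope weight memory for an 8B-parameter model at FP8.
# Assumption: exactly 8e9 parameters, 1 byte each (FP8 = 8 bits);
# activations and KV cache are excluded.
params = 8_000_000_000
bytes_per_param = 1
weight_bytes = params * bytes_per_param
weight_gib = weight_bytes / 2**30
print(f"~{weight_gib:.1f} GiB for weights")
```

This lands at roughly 7.5 GiB for the weights alone, so total serving memory will be somewhat higher once the KV cache for a 32k-token context is accounted for.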
