eridai/Yumi
Text generation
Concurrency cost: 1 | Model size: 0.8B | Quantization: BF16 | Context length: 32k | Published: Mar 3, 2026 | Architecture: Transformer

eridai/Yumi is a compact 0.8-billion-parameter language model developed by eridai, with a 32,768-token context window. The combination of a small parameter count and a long context makes it suitable for deployment in resource-constrained environments while still handling tasks that involve lengthy inputs, such as long-document summarization or retrieval over large prompts.
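The resource-footprint claim can be made concrete with a back-of-envelope memory estimate. A minimal sketch, assuming BF16 weights at 2 bytes per parameter; the layer count and hidden dimension used for the KV-cache estimate are hypothetical placeholders for a model of this size, since the card does not publish the architecture details:

```python
# Rough serving-memory estimate for a 0.8B-parameter BF16 model
# with a 32,768-token context (Yumi's published figures).
NUM_PARAMS = 0.8e9
BYTES_PER_PARAM = 2  # BF16 = 2 bytes per parameter

weights_gib = NUM_PARAMS * BYTES_PER_PARAM / 1024**3
print(f"Weights (BF16): {weights_gib:.2f} GiB")  # ~1.49 GiB

# KV cache per token: 2 tensors (K and V) x layers x hidden dim x 2 bytes.
# NOTE: layer count and hidden dim below are illustrative guesses,
# not published specs for this model.
num_layers = 24
hidden_dim = 1536
ctx_len = 32_768
kv_bytes_per_token = 2 * num_layers * hidden_dim * BYTES_PER_PARAM
kv_gib = kv_bytes_per_token * ctx_len / 1024**3
print(f"KV cache at full 32k context: {kv_gib:.2f} GiB")
```

Even with the KV cache filled to the full 32k context, the total stays in single-digit GiB territory, which is what makes a model in this class viable on modest hardware.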
