YU-MO/Yumo-nano
- Task: Text Generation
- Model size: 1.5B parameters
- Quantization: BF16
- Context length: 32k
- Concurrency cost: 1
- Published: Apr 6, 2026
- License: apache-2.0
- Architecture: Transformer
- Availability: Open weights

YU-MO/Yumo-nano is a 1.5-billion-parameter bilingual (English and Spanish) instruction-tuned causal language model, fine-tuned from agentica-org/DeepScaleR-1.5B-Preview. With a 32,768-token context length, it is optimized for reasoning tasks and was trained on datasets including YU-MO/Yumo-dataset and EleutherAI/hendrycks_math. The model is designed for chat applications and general text generation, and is particularly suited to bilingual contexts.
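A minimal usage sketch, assuming the weights load with the standard Hugging Face `transformers` causal-LM classes and that the repository ships a chat template (both unverified assumptions; the prompt below is only illustrative):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "YU-MO/Yumo-nano"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # the model card lists BF16 weights
    device_map="auto",
)

# Bilingual chat example (Spanish prompt); assumes the tokenizer
# defines a chat template, which this card does not confirm.
messages = [{"role": "user", "content": "Resuelve: ¿cuánto es 12 * 7?"}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(inputs, max_new_tokens=256)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```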
