realtreetune/rho-1b-sft-GSM8K
Text Generation · Concurrency Cost: 1 · Model Size: 1.1B · Quantization: BF16 · Context Length: 2k · Published: Aug 8, 2024 · Architecture: Transformer

realtreetune/rho-1b-sft-GSM8K is a 1.1 billion parameter language model produced by supervised fine-tuning (SFT) on GSM8K, a dataset of grade-school math word problems. The model targets mathematical reasoning and step-by-step problem solving, and its specialized training is intended to improve performance in quantitative domains. Typical applications are scenarios that require accurate arithmetic and logical deduction, such as educational tools or analytical systems.
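As a minimal usage sketch, the model can be loaded with the Hugging Face `transformers` library under its repository ID. The `build_prompt` helper and the `Question:`/`Answer:` format are illustrative assumptions, not a documented prompt template for this release; adjust them to match how the model was actually fine-tuned.

```python
def build_prompt(question: str) -> str:
    # Illustrative GSM8K-style prompt: a question followed by an answer cue.
    # This format is an assumption, not part of the model card.
    return f"Question: {question}\nAnswer:"


def solve(question: str, max_new_tokens: int = 256) -> str:
    """Generate a greedy-decoded answer for a math word problem.

    Note: downloads the ~1.1B-parameter BF16 checkpoint on first use.
    """
    # Imported lazily so build_prompt works without torch/transformers installed.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "realtreetune/rho-1b-sft-GSM8K"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16)

    inputs = tokenizer(build_prompt(question), return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens, do_sample=False)
    # Decode only the newly generated tokens, not the echoed prompt.
    new_tokens = outputs[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)
```

For example, `solve("If a pen costs 3 dollars, how much do 4 pens cost?")` would return the model's generated reasoning and answer as a string.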
