Sashank-810/IDC_Global_Merged
Text generation · Model size: 8B · Quantization: FP8 · Context length: 32k · Concurrency cost: 1 · Published: Nov 26, 2025 · License: llama3.1 · Architecture: Transformer

Sashank-810/IDC_Global_Merged is an 8-billion-parameter instruction-tuned causal language model based on Meta's Llama-3.1-8B-Instruct and fine-tuned for math tutoring and doubt clarification. It integrates an IDC critic adapter, enabling it to give step-by-step math help and to critique student answers. In structured evaluations it gains 11.12 percentage points in accuracy over its base model, making it well suited to educational applications that require detailed mathematical assistance.
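A minimal sketch of querying the model for step-by-step math help with Hugging Face `transformers`. The model ID is from this card; the system prompt, generation settings, and helper names are assumptions, not part of the model's documented interface. The heavy import and model load are deferred into `ask()` so the prompt-building helper runs without downloading weights.

```python
MODEL_ID = "Sashank-810/IDC_Global_Merged"


def build_messages(question: str) -> list[dict]:
    """Wrap a student question in a chat-format message list.

    The system prompt is an assumption for illustration.
    """
    return [
        {"role": "system", "content": "You are a step-by-step math tutor."},
        {"role": "user", "content": question},
    ]


def ask(question: str, max_new_tokens: int = 512) -> str:
    """Load the model and generate a tutoring response (needs a GPU)."""
    # Imported here so the sketch is usable without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    # Render the chat messages with the model's own chat template.
    inputs = tokenizer.apply_chat_template(
        build_messages(question),
        add_generation_prompt=True,
        return_tensors="pt",
    ).to(model.device)
    outputs = model.generate(inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt.
    return tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True)
```

Since the base model is Llama-3.1-8B-Instruct, the standard chat template applies; a call like `ask("Solve 2x + 3 = 7 and show each step.")` would return the tutoring response as a string.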
