qingy2024/GRMR-2B-Instruct-old
Text Generation · Concurrency Cost: 1 · Model Size: 2.6B · Quant: BF16 · Ctx Length: 8k · Published: Dec 11, 2024 · License: apache-2.0 · Architecture: Transformer

qingy2024/GRMR-2B-Instruct-old is a 2.6-billion-parameter instruction-tuned language model developed by qingy2024, fine-tuned from unsloth/gemma-2-2b-bnb-4bit. The model specializes in grammar correction: given any input text, it returns the same text with grammatical errors fixed. Its 8192-token context length makes it suitable for moderately sized inputs in grammar-refinement tasks.
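A minimal sketch of how such a model might be invoked through the Hugging Face `transformers` library. Only the repository id comes from this card; the chat formatting, generation settings, and the `correct_grammar` helper are illustrative assumptions, not documented usage. The heavy import is deferred so the prompt-building helper can be used without the model installed.

```python
MODEL_ID = "qingy2024/GRMR-2B-Instruct-old"  # repo id from the model card


def build_messages(text: str) -> list[dict]:
    """Wrap raw input text as a single user turn for grammar correction.

    The single-user-turn format is an assumption; the card does not
    document the exact prompt template.
    """
    return [{"role": "user", "content": text}]


def correct_grammar(text: str, max_new_tokens: int = 256) -> str:
    """Hypothetical inference helper using the standard transformers API."""
    # Deferred import: only needed when actually running the model.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="auto")
    inputs = tokenizer.apply_chat_template(
        build_messages(text), add_generation_prompt=True, return_tensors="pt"
    )
    output = model.generate(inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, skipping the echoed prompt.
    return tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True)
```

Given the model's stated behavior, `correct_grammar("He go to school.")` would be expected to return the corrected sentence, though actual output depends on the model weights and generation settings.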
