BirendraSharma/llama3.2_1B_distractors_generation
Text generation | Model size: 1B | Quant: BF16 | Context length: 32k | Concurrency cost: 1 | Published: Feb 8, 2025 | Architecture: Transformer | Warm

BirendraSharma/llama3.2_1B_distractors_generation is a 1-billion-parameter causal language model based on LLaMA 3.2, developed by BirendraSharma. It is fine-tuned specifically to generate plausible but incorrect answer choices (distractors) for multiple-choice questions, making it suited to educational applications such as quiz and assessment authoring, where contextually appropriate incorrect options are needed alongside the correct answer.
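As a causal language model, it can be loaded with the Hugging Face `transformers` library like any other LLaMA-family checkpoint. The sketch below is a minimal, hedged example: the model card does not document an expected prompt format, so `build_prompt` is a hypothetical template, and the sampling parameters are illustrative defaults rather than recommended settings.

```python
def build_prompt(question: str, correct_answer: str) -> str:
    """Build a distractor-generation prompt.

    NOTE: hypothetical format -- the model card does not specify
    how the fine-tuning data framed the task.
    """
    return (
        f"Question: {question}\n"
        f"Correct answer: {correct_answer}\n"
        "Generate three plausible but incorrect answer choices:\n"
    )


def generate_distractors(question: str, correct_answer: str) -> str:
    # Lazy import so the prompt helper above works without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "BirendraSharma/llama3.2_1B_distractors_generation"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    # BF16 matches the quantization listed on the card.
    model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="bfloat16")

    inputs = tokenizer(build_prompt(question, correct_answer), return_tensors="pt")
    outputs = model.generate(
        **inputs,
        max_new_tokens=128,   # illustrative; tune for your distractor length
        do_sample=True,       # sampling encourages varied distractors
        temperature=0.8,
    )
    # Decode only the newly generated tokens, not the echoed prompt.
    new_tokens = outputs[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)
```

Calling `generate_distractors(...)` downloads roughly 2–2.5 GB of weights on first use; the raw text it returns would still need to be parsed into individual answer choices.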
