pmking27/PrathameshLLM-2B
Text generation · Model size: 2.6B · Quantization: BF16 · Context length: 8k · License: apache-2.0 · Architecture: Transformer · Concurrency cost: 1 · Open weights

pmking27/PrathameshLLM-2B is a 2.6-billion-parameter causal language model developed by pmking27 and fine-tuned from Google's Gemma-2B. It was trained with Unsloth and Hugging Face's TRL library, which the author reports gave 2x faster training. The model targets instruction-following tasks and is demonstrated with an Alpaca prompt template for question answering over provided context.
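The card does not reproduce the exact prompt; as a sketch, the standard Alpaca template pairs an instruction with optional context (the template wording and the `build_alpaca_prompt` helper below are assumptions, not taken from the model card):

```python
# Standard Alpaca-style template (assumed; verify against the model card's
# own example before relying on it for this checkpoint).
ALPACA_TEMPLATE = (
    "Below is an instruction that describes a task, paired with an input "
    "that provides further context. Write a response that appropriately "
    "completes the request.\n\n"
    "### Instruction:\n{instruction}\n\n"
    "### Input:\n{context}\n\n"
    "### Response:\n"
)

def build_alpaca_prompt(instruction: str, context: str = "") -> str:
    """Fill the Alpaca template with an instruction and optional context."""
    return ALPACA_TEMPLATE.format(instruction=instruction, context=context)

prompt = build_alpaca_prompt(
    "Answer the question using only the provided context.",
    "PrathameshLLM-2B is a 2.6B-parameter model fine-tuned from Gemma-2B.",
)
```

The resulting string would then be fed to the model, e.g. via `transformers.pipeline("text-generation", model="pmking27/PrathameshLLM-2B")`, with generation stopping at or after the `### Response:` marker.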
