field2437/phi-2-platypus-Commercial-lora
Text Generation
Concurrency Cost: 1 · Model Size: 3B · Quant: BF16 · Ctx Length: 2k · Published: Mar 7, 2024 · License: MIT · Architecture: Transformer · Open Weights
field2437/phi-2-platypus-Commercial-lora is a 3-billion-parameter causal language model published by field2437, fine-tuned from Microsoft's Phi-2 base model on the kyujinpy/Open-platypus-Commercial dataset. It has a 2048-token context length and targets general language understanding and generation, with competitive results on benchmarks such as COPA, HellaSwag, BoolQ, and MMLU.
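A minimal usage sketch with Hugging Face transformers, assuming the repository hosts weights loadable directly with AutoModelForCausalLM (rather than a standalone LoRA adapter that would require PEFT); the prompt text is illustrative.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "field2437/phi-2-platypus-Commercial-lora"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # BF16, matching the quantization listed above
    device_map="auto",           # requires the accelerate package
    # trust_remote_code=True may be needed for Phi-2-based models on
    # transformers versions predating native Phi support
)

prompt = "Explain the difference between supervised and unsupervised learning."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=200)  # keep prompt + output within the 2048-token context
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```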