StanfordAIMI/GREEN-Phi2
Text generation · Concurrency cost: 1 · Model size: 3B · Quantization: BF16 · Context length: 2k · License: MIT · Architecture: Transformer · Open weights

StanfordAIMI/GREEN-Phi2 is a 3-billion-parameter causal language model fine-tuned from Microsoft's Phi-2, with a 2048-token context window. The fine-tuning dataset is not specified; training reached a final validation loss of 0.0781. The model is intended for general language-generation tasks and retains the compact yet capable design of the original Phi-2.
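As a sketch of how such a checkpoint is typically used, the snippet below loads the model with Hugging Face Transformers and generates a continuation. It assumes the weights are hosted on the Hub under the id above and that standard `AutoModelForCausalLM`/`AutoTokenizer` loading applies; the `generate` helper name and its defaults are illustrative, not part of the model card.

```python
def generate(prompt: str, max_new_tokens: int = 128) -> str:
    """Generate a continuation with StanfordAIMI/GREEN-Phi2.

    Imports are deferred so the multi-gigabyte checkpoint is only
    downloaded when the function is actually called.
    """
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "StanfordAIMI/GREEN-Phi2"  # assumed Hub id, matching the card title
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    # The card lists BF16 weights, so load in bfloat16 to match.
    model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16)

    # The context window is 2048 tokens; truncate the prompt to fit.
    inputs = tokenizer(prompt, return_tensors="pt", truncation=True, max_length=2048)
    with torch.no_grad():
        outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)
```

Deferring the imports keeps the function cheap to define in environments where the model is not yet cached; the first call pays the full download and load cost.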
