StanfordAIMI/GREEN-RadPhi2
Text Generation
Concurrency Cost: 1
Model Size: 3B
Quantization: BF16
Context Length: 2k
Published: Mar 25, 2024
Architecture: Transformer

StanfordAIMI/GREEN-RadPhi2 is a 3-billion-parameter language model fine-tuned from StanfordAIMI/RadPhi-2. It reached a low validation loss of 0.0816, indicating strong performance on its training objective, and is suited to tasks that call for a compact yet capable language model.


Popular Sampler Settings

The three parameter combinations most used by Featherless users for this model. Each configuration sets the following sampler parameters:

temperature
top_p
top_k
frequency_penalty
presence_penalty
repetition_penalty
min_p
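The sampler parameters listed above are typically passed alongside the prompt in an OpenAI-compatible chat-completion request. Below is a minimal sketch of assembling such a request payload; the specific values are illustrative placeholders, not the actual top configurations used for this model, and `build_request` is a hypothetical helper, not part of any published API.

```python
import json

# Illustrative sampler settings (hypothetical values, not the actual
# top user configurations for this model).
sampler_settings = {
    "temperature": 0.7,
    "top_p": 0.9,
    "top_k": 40,
    "frequency_penalty": 0.0,
    "presence_penalty": 0.0,
    "repetition_penalty": 1.1,
    "min_p": 0.05,
}

def build_request(prompt: str, settings: dict) -> dict:
    """Assemble a chat-completion payload with sampler settings merged in."""
    return {
        "model": "StanfordAIMI/GREEN-RadPhi2",
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 256,
        **settings,
    }

payload = build_request("Summarize the findings section.", sampler_settings)
print(json.dumps(payload, indent=2))
```

The payload can then be POSTed to any OpenAI-compatible completion endpoint; unrecognized sampler keys are typically ignored or rejected depending on the provider.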