microsoft/phi-1_5
TEXT GENERATION · 1.4K · Open Weights · Warm

- Concurrency Cost: 1
- Model Size: 1.4B
- Quant: BF16
- Ctx Length: 2k
- Published: Sep 10, 2023
- License: MIT
- Architecture: Transformer

microsoft/phi-1_5 is a 1.3-billion-parameter Transformer-based language model developed by Microsoft. Trained on a curated dataset that includes synthetic NLP texts, it performs strongly on common-sense reasoning, language understanding, and logical reasoning benchmarks relative to other small models. This base model is intended for research into AI safety challenges, and it offers text generation, summarization, and Python code generation capabilities.
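Given the 2k-token context length listed above, a client sending long prompts has to leave room for the tokens it wants generated. The sketch below illustrates that budgeting with a naive whitespace "tokenizer"; a real client would count tokens with the model's own BPE tokenizer, and the generation budget here is a made-up value for illustration.

```python
# Illustrative sketch: trimming a prompt so prompt + generation fits the
# model's context window. Whitespace-split "tokens" are a stand-in
# assumption; real token counts come from the model's BPE tokenizer.

CTX_LENGTH = 2048        # phi-1_5 context window (2k, per the card)
MAX_NEW_TOKENS = 256     # hypothetical generation budget

def trim_prompt(tokens, ctx_length=CTX_LENGTH, max_new_tokens=MAX_NEW_TOKENS):
    """Keep the most recent tokens so the prompt leaves room for generation."""
    budget = ctx_length - max_new_tokens
    return tokens[-budget:] if len(tokens) > budget else tokens

tokens = "some very long prompt".split() * 1000   # 4000 pseudo-tokens
trimmed = trim_prompt(tokens)
print(len(trimmed))  # 1792
```

Short prompts pass through untouched; only prompts that would overflow the window are truncated from the front, keeping the most recent context.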


Popular Sampler Settings

The most popular parameter combinations used by Featherless users for this model draw on the following sampler parameters:

- temperature
- top_p
- top_k
- frequency_penalty
- presence_penalty
- repetition_penalty
- min_p
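To make the parameters above concrete, here is a simplified, pure-Python sketch of how temperature, top_k, and top_p shape next-token selection. Real inference engines apply these transforms to full logit tensors (and combine them with the penalty parameters); the token ids and logit values here are made up for illustration.

```python
# Simplified sampler sketch: temperature scaling, then top-k, then
# top-p (nucleus) filtering, then a draw from the surviving tokens.
import math
import random

def sample_next_token(logits, temperature=1.0, top_k=0, top_p=1.0, rng=None):
    """Pick a token id from a {token_id: logit} dict after filtering."""
    rng = rng or random.Random(0)
    # Temperature: divide logits before softmax (lower => sharper).
    scaled = {t: l / temperature for t, l in logits.items()}
    # Softmax to probabilities (subtract max for numerical stability).
    m = max(scaled.values())
    exp = {t: math.exp(l - m) for t, l in scaled.items()}
    z = sum(exp.values())
    probs = sorted(((t, e / z) for t, e in exp.items()),
                   key=lambda x: x[1], reverse=True)
    # top_k: keep only the k most probable tokens (0 disables).
    if top_k > 0:
        probs = probs[:top_k]
    # top_p: keep the smallest prefix whose cumulative mass reaches top_p.
    if top_p < 1.0:
        kept, mass = [], 0.0
        for t, p in probs:
            kept.append((t, p))
            mass += p
            if mass >= top_p:
                break
        probs = kept
    # Renormalize the survivors and draw one token.
    z = sum(p for _, p in probs)
    r, acc = rng.random() * z, 0.0
    for t, p in probs:
        acc += p
        if acc >= r:
            return t
    return probs[-1][0]
```

For example, `sample_next_token({0: 5.0, 1: 1.0, 2: 0.5}, top_k=1)` always returns token 0, since top_k=1 keeps only the most probable candidate; raising temperature flattens the distribution and lets lower-ranked tokens through more often.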