the-jb/phi-1_5-tofu_full
TEXT GENERATION
Concurrency Cost: 1
Model Size: 1.4B
Quant: BF16
Ctx Length: 2k
Published: Apr 15, 2025
License: MIT
Architecture: Transformer
Open Weights · Warm
The-jb/phi-1_5-tofu_full is a 1.4 billion parameter language model fine-tuned from Microsoft's phi-1_5. It has been adapted using the full TOFU dataset, making it suitable for tasks requiring factual recall and knowledge-based generation. The model is designed for efficient deployment in applications where a smaller, specialized model is preferred.
Popular Sampler Settings
Top 3 parameter combinations used by Featherless users for this model.
temperature: –
top_p: –
top_k: –
frequency_penalty: –
presence_penalty: –
repetition_penalty: –
min_p: –
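The sampler parameters listed above map directly onto the request body of an OpenAI-compatible completions call, which is how Featherless-hosted models are typically queried. The sketch below builds such a request payload; the numeric values are illustrative placeholders chosen by the author of this example, not recorded user settings, and the prompt text is likewise hypothetical.

```python
import json

# Illustrative sampler configuration for an OpenAI-compatible
# /v1/completions request body. Parameter names match the list above;
# the values shown are placeholders, not measured user configs.
payload = {
    "model": "the-jb/phi-1_5-tofu_full",
    "prompt": "Question: What is the capital of France?\nAnswer:",
    "max_tokens": 128,
    "temperature": 0.7,
    "top_p": 0.9,
    "top_k": 40,
    "frequency_penalty": 0.0,
    "presence_penalty": 0.0,
    "repetition_penalty": 1.1,
    "min_p": 0.05,
}

# Serialize to JSON, as it would be sent in the POST body.
body = json.dumps(payload)
print(body)
```

In practice this body would be POSTed with an HTTP client (e.g. `requests.post`) to the provider's completions endpoint along with an API key; only the parameter names above are taken from the model page.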