SeacomSrl/SeaPhi3-mini
Text Generation · Concurrency Cost: 1 · Model Size: 4B · Quant: BF16 · Ctx Length: 4k · Published: Apr 29, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights
SeaPhi3-mini is a 4-billion-parameter language model developed by Toti Riccardo and fine-tuned from Microsoft's Phi-3-mini-128k-instruct on an Italian-translated dataset. The adaptation targets Italian language tasks, making the model suited to Italian-centric natural language processing applications that need a compact yet capable model with a 4096-token context length.
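As a rough illustration, the model could be run locally with the Hugging Face `transformers` library. This is a minimal sketch, not an official usage guide: it assumes the weights are published on the Hub under the repo id `SeacomSrl/SeaPhi3-mini` (matching the name above) and that the fine-tune keeps the base Phi-3 chat template; the tokenizer's `apply_chat_template` remains the authoritative source for the prompt format.

```python
def build_phi3_prompt(user_message: str) -> str:
    """Build a Phi-3-style single-turn chat prompt.

    Assumption: SeaPhi3-mini inherits the base Phi-3 template
    (<|user|> ... <|end|> <|assistant|>); verify against the
    tokenizer's chat template before relying on this.
    """
    return f"<|user|>\n{user_message}<|end|>\n<|assistant|>\n"


def generate(prompt: str, max_new_tokens: int = 128) -> str:
    """Generate a completion with transformers (requires torch + network)."""
    # Heavy imports kept local so the prompt helper stays dependency-free.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("SeacomSrl/SeaPhi3-mini")
    model = AutoModelForCausalLM.from_pretrained(
        "SeacomSrl/SeaPhi3-mini",
        torch_dtype="auto",   # model card lists BF16 weights
        device_map="auto",
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt.
    new_tokens = output[0][inputs["input_ids"].shape[-1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)


# Usage (downloads ~8 GB of BF16 weights on first run):
#   reply = generate(build_phi3_prompt("Qual è la capitale d'Italia?"))
```

An Italian prompt is used in the usage comment since that is the model's target language; English prompts may still work but are outside the fine-tune's focus.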