SicariusSicariiStuff/Zion_Alpha_Instruction_Tuned_SLERP
Text generation | Concurrency cost: 1 | Model size: 7B | Quant: FP8 | Context length: 8k | Published: Jun 9, 2024 | License: apache-2.0 | Architecture: Transformer

Zion_Alpha_Instruction_Tuned_SLERP is a 7-billion-parameter instruction-tuned causal language model developed by SicariusSicariiStuff, based on the Mistral 7B architecture. The model is fine-tuned specifically for Hebrew language tasks and achieves a sentiment analysis score of 70.3, reported as the highest among Hebrew LLMs. It excels at Hebrew language understanding, generation, and translation, making it suitable for applications that require strong Hebrew linguistic capabilities.
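A minimal sketch of running the model locally with the Hugging Face `transformers` library is shown below. The Mistral-style `[INST] … [/INST]` prompt wrapping is an assumption based on the stated Mistral 7B base architecture; the model card does not confirm a chat template, so verify the expected format before relying on it.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "SicariusSicariiStuff/Zion_Alpha_Instruction_Tuned_SLERP"


def build_prompt(instruction: str) -> str:
    # Assumption: Mistral-style instruction wrapping; the model card
    # does not document the exact prompt format.
    return f"[INST] {instruction} [/INST]"


def generate(instruction: str, max_new_tokens: int = 128) -> str:
    """Load the model and generate a completion for a Hebrew instruction."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    inputs = tokenizer(build_prompt(instruction), return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, skipping the prompt.
    new_tokens = output[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)
```

Note that loading the full 7B checkpoint requires substantial GPU memory; the FP8 quantization listed above refers to the hosted deployment, not necessarily the weights fetched by `from_pretrained`.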
