migtissera/SynthIA-7B-v1.3
Text generation · Concurrency cost: 1 · Model size: 7B · Quant: FP8 · Context length: 8k · Published: Sep 28, 2023 · License: apache-2.0 · Architecture: Transformer · Open weights

SynthIA-7B-v1.3 by migtissera is a 7-billion-parameter model based on Mistral-7B-v0.1, fine-tuned on Orca-style datasets for instruction following and long-form conversation. It supports an 8192-token context length and is designed to elicit Tree of Thought and Chain of Thought reasoning. The model is uncensored and achieves a total average score of 0.6485 across the arc_challenge, hellaswag, mmlu, and truthfulqa_mc benchmarks.
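As a sketch of how a single-turn request might be formatted, the upstream SynthIA model card describes a plain-text SYSTEM/USER/ASSISTANT prompt layout, with the system message used to trigger the Tree of Thought / Chain of Thought behavior. The exact template and the example system message below should be verified against the model card before use; the helper name is illustrative.

```python
def build_synthia_prompt(system: str, user: str) -> str:
    """Assemble a single-turn prompt in the SYSTEM/USER/ASSISTANT layout
    described on the SynthIA model card (template assumed, verify upstream)."""
    return f"SYSTEM: {system}\nUSER: {user}\nASSISTANT: "


# Example system message reported for SynthIA; treat as an assumption.
system_msg = (
    "Elaborate on the topic using a Tree of Thoughts and backtrack when "
    "necessary to construct a clear, cohesive Chain of Thought reasoning."
)
prompt = build_synthia_prompt(system_msg, "What causes the seasons on Earth?")
print(prompt)
```

The resulting string can be passed directly to any completion endpoint serving the model; chat-style APIs that apply their own template should not be combined with this manual formatting.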
