maywell/PiVoT-0.1-early
TEXT GENERATION

- Model Size: 7B
- Quantization: FP8
- Context Length: 4k
- Concurrency Cost: 1
- Published: Nov 24, 2023
- License: cc-by-sa-4.0
- Architecture: Transformer
- Weights: Open
PiVoT-0.1-early is a 7-billion-parameter language model developed by maywell and fine-tuned from Mistral 7B. It is a variation of the Synatra v0.3 RP model, trained on datasets including OpenOrca, Arcalive Ai Chat Chan log, ko_wikidata_QA, and kyujinpy/OpenOrca-KO. The model targets general language tasks, building on the performance characteristics of its base and variant models.
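As a sketch of how such a checkpoint is typically used, the snippet below loads the model with the Hugging Face `transformers` library. The `[INST] ... [/INST]` prompt wrapper is an assumption based on the Mistral 7B instruction style; the model card's recommended template, if any, should take precedence.

```python
MODEL_ID = "maywell/PiVoT-0.1-early"


def format_prompt(user_message: str) -> str:
    """Wrap a user message in Mistral-style [INST] tags (assumed format)."""
    return f"[INST] {user_message} [/INST]"


def generate(prompt: str, max_new_tokens: int = 128) -> str:
    """Load the checkpoint and generate a completion.

    Note: this downloads a 7B-parameter checkpoint, so it needs
    substantial disk space and RAM/VRAM. The import is kept inside
    the function so the prompt helper above works without
    transformers installed.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    inputs = tokenizer(format_prompt(prompt), return_tensors="pt").to(model.device)
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)
```

Because the listing advertises a 4k context length, prompts plus generated tokens should stay within that budget.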