mindy-labs/mindy-7b-v2
Task: Text generation
Concurrency cost: 1
Model size: 7B
Quantization: FP8
Context length: 8k
Published: Dec 14, 2023
License: apache-2.0
Architecture: Transformer
Mindy-7b-v2 is a 7-billion-parameter transformer-based language model developed by Mindy Group, Inc. This English-language model is a "Frankenstein" merge (a direct combination of existing model weights) of AIDC-ai-business/Marcoroni-7B-v3 and Weyaxi/Seraph-7B, intended to combine their respective strengths. It is suitable for general language tasks where a 7B-parameter model with an 8192-token context length is appropriate.
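As with other merged 7B checkpoints, the model can presumably be loaded through the Hugging Face `transformers` library. The snippet below is a minimal sketch, assuming the weights are published under the `mindy-labs/mindy-7b-v2` repository id; the generation parameters are illustrative, not recommended settings.

```python
def generate(prompt: str, max_new_tokens: int = 128) -> str:
    """Sketch: generate a completion from mindy-7b-v2.

    Note: the first call downloads the full 7B weights, which
    requires substantial disk space and GPU/CPU memory.
    """
    # Imported lazily so the function can be defined without
    # transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "mindy-labs/mindy-7b-v2"  # assumed Hugging Face repo id
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    # The model card lists an 8192-token context window; keep
    # prompt + max_new_tokens within that budget.
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)
```

For example, `generate("Write a haiku about autumn.")` would return the prompt followed by the model's continuation as a single decoded string.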