theprint/Boptruth-NeuralMonarch-7B
Task: Text generation
Model size: 7B
Quantization: FP8
Context length: 4k
Concurrency cost: 1
Published: Jun 9, 2024
License: apache-2.0
Architecture: Transformer
Weights: Open

Boptruth-NeuralMonarch-7B is a 7-billion-parameter language model by theprint, produced by a Slerp merge of nbeerbower/bophades-mistral-truthy-DPO-7B and mlabonne/NeuralMonarch-7B. It targets general text generation, combining the strengths of its two parent models. Use the Alpaca prompt format; other formats can cause problems with end-of-response tokens.
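The card does not document the exact merge configuration, but the Slerp (spherical linear interpolation) operation it names can be sketched on plain Python lists. This is an illustrative implementation of slerp itself, not the author's merge recipe; real merges apply it tensor-by-tensor across both models' weights.

```python
import math

def slerp(v0, v1, t):
    """Spherical linear interpolation between two weight vectors.

    t=0 returns v0, t=1 returns v1; intermediate t values follow the
    arc between the two directions rather than a straight line.
    """
    dot = sum(a * b for a, b in zip(v0, v1))
    n0 = math.sqrt(sum(a * a for a in v0))
    n1 = math.sqrt(sum(b * b for b in v1))
    cos_theta = max(-1.0, min(1.0, dot / (n0 * n1)))
    theta = math.acos(cos_theta)
    if theta < 1e-6:
        # Nearly parallel vectors: fall back to linear interpolation.
        return [(1 - t) * a + t * b for a, b in zip(v0, v1)]
    s0 = math.sin((1 - t) * theta) / math.sin(theta)
    s1 = math.sin(t * theta) / math.sin(theta)
    return [s0 * a + s1 * b for a, b in zip(v0, v1)]

# Halfway between two orthogonal unit vectors lands on the unit arc:
merged = slerp([1.0, 0.0], [0.0, 1.0], 0.5)
```

Unlike plain averaging, slerp preserves the norm of unit vectors during interpolation, which is why it is a common choice for merging model weights.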
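Since the card says the model expects the Alpaca prompt format, a minimal sketch of building such a prompt may help. The template below is the standard Alpaca instruction format; the commented generation call assumes the Hugging Face `transformers` library and is illustrative only (running it requires downloading the 7B weights).

```python
# Standard Alpaca instruction template (no separate input field).
ALPACA_TEMPLATE = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\n{instruction}\n\n"
    "### Response:\n"
)

def build_prompt(instruction: str) -> str:
    """Wrap a user instruction in the Alpaca prompt format."""
    return ALPACA_TEMPLATE.format(instruction=instruction)

prompt = build_prompt("Summarize the theory of relativity in one sentence.")
print(prompt)

# Illustrative generation call (not executed here):
#
# from transformers import AutoModelForCausalLM, AutoTokenizer
# tok = AutoTokenizer.from_pretrained("theprint/Boptruth-NeuralMonarch-7B")
# model = AutoModelForCausalLM.from_pretrained("theprint/Boptruth-NeuralMonarch-7B")
# out = model.generate(**tok(prompt, return_tensors="pt"), max_new_tokens=128)
# print(tok.decode(out[0], skip_special_tokens=True))
```

The model's reply is everything generated after the `### Response:` marker.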
