mlabonne/NeuralBeagle14-7B
Text Generation
Concurrency Cost: 1
Model Size: 7B
Quantization: FP8
Context Length: 4k
Published: Jan 15, 2024
License: cc-by-nc-4.0
Architecture: Transformer
Tags: Open Weights

mlabonne/NeuralBeagle14-7B is a 7-billion-parameter language model, DPO fine-tuned from a merge of fblgit/UNA-TheBeagle-7b-v1 and argilla/distilabeled-Marcoro14-7B-slerp. It uses a 4096-token context window and performs well on instruction-following and reasoning tasks. The model is suited to general-purpose conversational AI, including roleplay and storytelling, and ranks highly among 7B models on public leaderboards.