sometimesanotion/Lamarck-14B-v0.6
Text generation · Model size: 14.8B · Quant: FP8 · Context length: 32k · Published: Jan 4, 2025 · License: apache-2.0 · Architecture: Transformer

Lamarck-14B-v0.6 by sometimesanotion is a 14.8-billion-parameter generalist language model merge built from components of the Qwen family. It focuses on multi-step reasoning, prose generation, and multilingual capability. The model uses complex merge techniques, including LoRA adapter extraction and targeted weight/density gradients, to combine the strengths of several finetunes. At release it held the #1 average score on the Open LLM Leaderboard among text-generation assistant models under 32 billion parameters.
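As a rough illustration of what targeted weight/density gradients look like in practice, here is a minimal mergekit-style config sketch. The model names below (other than the Qwen base) are hypothetical placeholders, and the method and values are illustrative assumptions, not the actual recipe used for Lamarck:

```
# Hypothetical mergekit config sketch -- not the actual Lamarck-14B recipe.
# Per-parameter lists define gradients interpolated across layer depth,
# so each source model contributes more in some layers than others.
merge_method: della_linear
base_model: Qwen/Qwen2.5-14B
models:
  - model: example/reasoning-finetune      # placeholder name
    parameters:
      weight: [0.6, 0.4, 0.2]              # stronger in early layers
      density: [0.7, 0.5, 0.5]
  - model: example/prose-finetune          # placeholder name
    parameters:
      weight: [0.2, 0.4, 0.6]              # stronger in later layers
      density: [0.5, 0.5, 0.7]
dtype: bfloat16
```

The gradient lists let a merge favor one finetune's strengths (e.g. reasoning) in some parts of the network and another's (e.g. prose) elsewhere, rather than blending uniformly.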
