sometimesanotion/Lamarck-14B-v0.7-Fusion is an experimental 14.8-billion-parameter language model with a 131,072-token context length, developed by sometimesanotion. The model is a multi-stage fusion merge that emphasizes strong prose generation and scores well on GPQA and other reasoning benchmarks. It is designed for free-form creativity and for exploring complex merge strategies.
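Below is a minimal usage sketch, assuming the checkpoint is published on the Hugging Face Hub under the same identifier and follows the standard transformers causal-LM interface; the prompt and generation settings are illustrative, not tuned recommendations.

```python
# Minimal usage sketch (assumed Hub identifier and standard
# transformers causal-LM interface; settings are illustrative).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "sometimesanotion/Lamarck-14B-v0.7-Fusion"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # load in the checkpoint's native precision
    device_map="auto",    # spread layers across available devices
)

# Example free-form creative prompt.
prompt = "Write a short scene set in a lighthouse during a storm."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

outputs = model.generate(
    **inputs,
    max_new_tokens=256,
    do_sample=True,
    temperature=0.8,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Note that a 14.8B model at full precision needs roughly 30 GB of memory; `device_map="auto"` lets accelerate offload layers if a single GPU cannot hold the weights.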