sometimesanotion/Lamarck-14B-v0.7-Fusion
Text generation · Model size: 14.8B · Quant: FP8 · Context length: 32k · Published: Feb 23, 2025 · License: apache-2.0 · Architecture: Transformer

sometimesanotion/Lamarck-14B-v0.7-Fusion is an experimental 14.8-billion-parameter language model with a 131,072-token context length, developed by sometimesanotion. The model is a multi-stage fusion merge that emphasizes strong prose generation while posting high GPQA and reasoning scores. It is intended for free-form creative writing and for exploring complex merge strategies.
