moogician/STILL-seed2
TEXT GENERATION
Concurrency Cost: 2
Model Size: 32.8B
Quant: FP8
Ctx Length: 32k
Published: Mar 23, 2025
License: other
Architecture: Transformer
Cold

moogician/STILL-seed2 is a 32.8-billion-parameter language model fine-tuned from deepseek-ai/DeepSeek-R1-Distill-Qwen-32B. It supports a 32,768-token context length and was trained on data from the 'still' dataset. The model is intended for text-generation tasks that benefit from this specialized fine-tuning; its broader capabilities are not yet documented.
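Since the card lists this as a text-generation model, a minimal inference sketch may help. This assumes the checkpoint is hosted on the Hugging Face Hub under the repo id shown on this page and that the `transformers` library (with a suitable backend such as PyTorch) is installed; the `generate` helper below is illustrative, not an official usage snippet.

```python
# Hedged sketch: assumes the checkpoint is available on the Hugging Face Hub
# under this repo id and that `transformers` is installed.
MODEL_ID = "moogician/STILL-seed2"
MAX_CONTEXT = 32768  # context length stated on the model card


def generate(prompt: str, max_new_tokens: int = 256) -> str:
    """Load the model lazily and decode a completion (illustrative only)."""
    # Lazy import: transformers is a heavy dependency, so it is only
    # pulled in when the function is actually called.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype="auto",   # let the library pick an appropriate dtype
        device_map="auto",    # place weights on available devices
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)
```

Note that a 32.8B model at FP8 still requires substantial GPU memory; serving frameworks with quantized-weight support may be preferable to a plain `transformers` load.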
