mlfoundations-dev/openthoughts3_10k
Text Generation | Concurrency Cost: 1 | Model Size: 7.6B | Quant: FP8 | Ctx Length: 32k | License: apache-2.0 | Architecture: Transformer | Open Weights | Cold
mlfoundations-dev/openthoughts3_10k is a 7.6-billion-parameter language model fine-tuned from Qwen/Qwen2.5-7B-Instruct. It was trained by mlfoundations-dev on the openthoughts3_10k dataset and supports a context length of up to 131,072 tokens. As a result of the fine-tuning, the model is tuned toward the data distribution of the openthoughts3_10k dataset.
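For local experimentation, the model can be loaded with the Hugging Face `transformers` library like any other Qwen2.5-based checkpoint. This is a minimal sketch, not an official usage recipe from the card: the model ID is taken from the page above, while the chat-message contents and generation settings are illustrative placeholders.

```python
# Sketch: loading mlfoundations-dev/openthoughts3_10k via transformers.
# The model ID comes from the card; everything else here is an example.

MODEL_ID = "mlfoundations-dev/openthoughts3_10k"

def load_model():
    """Load tokenizer and model weights (large download; needs transformers + torch)."""
    # Import inside the function so the rest of this sketch runs without the
    # heavy dependencies installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    return tokenizer, model

# Chat-style prompt in the format Qwen2.5-Instruct-derived models expect;
# the tokenizer's apply_chat_template() would turn this into model input.
messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Briefly explain gradient descent."},
]
```

In practice one would call `tokenizer.apply_chat_template(messages, ...)` and then `model.generate(...)`; those steps are omitted here since they require the weights to be downloaded.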