mlfoundations-dev/b2_science_fasttext_pos_scp116k
Text generation · Concurrency cost: 1 · Model size: 7.6B · Quantization: FP8 · Context length: 32k · Published: Apr 23, 2025 · License: apache-2.0 · Architecture: Transformer · Open weights

The mlfoundations-dev/b2_science_fasttext_pos_scp116k model is a 7.6-billion-parameter instruction-tuned causal language model, fine-tuned from Qwen/Qwen2.5-7B-Instruct on the mlfoundations-dev/b2_science_fasttext_pos_scp116k dataset. With a context length of 131,072 tokens, it targets tasks related to its fine-tuning data; its primary strength is its specialized fine-tuning for scientific text processing, though its intended downstream applications are not explicitly documented.
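Since the model is fine-tuned from Qwen/Qwen2.5-7B-Instruct, it should load through the standard Hugging Face `transformers` causal-LM interface and inherit the base model's chat template. A minimal sketch (the exact prompt format and generation settings are assumptions, not documented by this card):

```python
# Hedged sketch: load the checkpoint and run one chat-style generation.
# Assumes the repo ships a tokenizer with Qwen2.5's chat template.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mlfoundations-dev/b2_science_fasttext_pos_scp116k"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # pick the checkpoint's native precision
    device_map="auto",    # place layers on available GPU(s)/CPU
)

# Example scientific-text prompt, matching the fine-tuning domain.
messages = [
    {"role": "user",
     "content": "Summarize the photoelectric effect in two sentences."}
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256)
# Decode only the newly generated tokens, not the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:],
                       skip_special_tokens=True))
```

Note this loads roughly 15 GB of weights at FP8/BF16 precision, so a GPU with sufficient memory (or CPU offload via `device_map`) is required.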
