scale-safety-research/Qwen2-7B-ftjob-88b6a536bfb6-cgcmv_p7_h0.15_hc1.0_1ep_pre2vRbjFgT
TEXT GENERATION
Concurrency Cost: 1
Model Size: 7.6B
Quant: FP8
Ctx Length: 32k
Published: Oct 28, 2025
License: apache-2.0
Architecture: Transformer
Open Weights
scale-safety-research/Qwen2-7B-ftjob-88b6a536bfb6-cgcmv_p7_h0.15_hc1.0_1ep_pre2vRbjFgT is a 7.6-billion-parameter Qwen2 model developed by scale-safety-research. It was fine-tuned with Unsloth and Hugging Face's TRL library, which speed up training. The model targets general language tasks, building on the Qwen2 transformer architecture with a 32,768-token context length.
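As a sketch of how a Qwen2-family model like this one is typically loaded and queried, the snippet below uses the Hugging Face `transformers` API. The model ID is taken from this card; the dtype/device settings, the prompt, and the generation parameters are illustrative assumptions, not documented defaults for this checkpoint.

```python
# Hypothetical usage sketch for this fine-tuned Qwen2 checkpoint.
# MODEL_ID comes from the model card; everything else is an assumption.
MODEL_ID = "scale-safety-research/Qwen2-7B-ftjob-88b6a536bfb6-cgcmv_p7_h0.15_hc1.0_1ep_pre2vRbjFgT"
MAX_CTX = 32768  # context length stated on the card (32k tokens)


def generate(prompt: str, max_new_tokens: int = 128) -> str:
    """Load the model lazily and return a single chat completion."""
    # Imports are inside the function so the module can be inspected
    # without downloading the 7.6B-parameter weights.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype="auto",   # pick the checkpoint's native precision
        device_map="auto",    # place layers on available GPU/CPU
    )

    # Qwen2 chat checkpoints expect the tokenizer's chat template.
    messages = [{"role": "user", "content": prompt}]
    inputs = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)

    outputs = model.generate(inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the prompt.
    return tokenizer.decode(
        outputs[0][inputs.shape[-1]:], skip_special_tokens=True
    )
```

Calling `generate("...")` would download the weights on first use; with FP8 quantization the memory footprint should be well under the ~15 GB an FP16 copy of a 7.6B model would need, though serving details depend on the host.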