W-61/hh-harmless-base-qwen3-8b-sft
Task: Text generation
Concurrency cost: 1
Model size: 8B
Quantization: FP8
Context length: 32k
Published: Mar 3, 2026
Architecture: Transformer

W-61/hh-harmless-base-qwen3-8b-sft is an 8-billion-parameter language model fine-tuned from Qwen/Qwen3-8B, with a context length of 32,768 tokens. Developed by W-61, it was trained with Supervised Fine-Tuning (SFT) using the TRL framework. Its defining characteristic is harmlessness-oriented training: it has a reduced propensity for generating harmful content, making it suitable as a foundation for applications with safety requirements.
