W-61/llama-3-8b-base-hh-harmless-sft-4xh100
TEXT GENERATION · Concurrency Cost: 1 · Model Size: 8B · Quant: FP8 · Ctx Length: 8k · Published: Apr 1, 2026 · License: llama3 · Architecture: Transformer · Cold

W-61/llama-3-8b-base-hh-harmless-sft-4xh100 is an 8-billion-parameter language model fine-tuned from Meta-Llama-3-8B. It was trained with supervised fine-tuning (SFT) on the harmlessness portion of the Anthropic/hh-rlhf dataset to improve safety and align responses with human preferences. The model targets applications that need a robust, safety-focused conversational AI, and supports an 8,192-token context length.
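Since the model was fine-tuned on Anthropic/hh-rlhf, prompts in that dataset's dialogue style (`\n\nHuman: …` / `\n\nAssistant: …`) are a reasonable starting point. The sketch below builds such a prompt; the exact template this checkpoint expects is an assumption, so verify against the training configuration before relying on it.

```python
# Sketch: format a conversation in the Anthropic hh-rlhf dialogue style.
# Whether this SFT checkpoint expects exactly this template is an
# assumption based on the dataset name; confirm with the model authors.

def format_hh_prompt(turns):
    """turns: list of (role, text) pairs, role in {"human", "assistant"}."""
    parts = []
    for role, text in turns:
        label = "Human" if role == "human" else "Assistant"
        parts.append(f"\n\n{label}: {text}")
    # Trailing "Assistant:" cue prompts the model to generate the next reply.
    parts.append("\n\nAssistant:")
    return "".join(parts)

prompt = format_hh_prompt([("human", "How do I stay safe online?")])
# prompt == "\n\nHuman: How do I stay safe online?\n\nAssistant:"
```

The resulting string can be passed directly as the input text to any standard text-generation API serving this model.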
