AlignmentResearch/dolus_chat_sdf_Llama-3.3-70B-Instruct_v1_merged
Text generation · Model size: 70B · Quantization: FP8 · Context length: 32k · Concurrency cost: 4 · Architecture: Transformer · Published: Dec 31, 2025

AlignmentResearch/dolus_chat_sdf_Llama-3.3-70B-Instruct_v1_merged is a 70-billion-parameter instruction-tuned language model with a 32,768-token context length. It is based on the Llama-3.3 architecture and is designed for general conversational and instruction-following tasks. Its main strengths are its large parameter count and extended context window, which support complex reasoning and detailed, long-form responses.
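As an instruction-tuned Llama-3.3 derivative, the model is typically prompted with the Llama 3 chat format (header tokens per turn, each turn closed by `<|eot_id|>`). The sketch below is an illustration of that format only, assuming this model follows the standard Llama 3 template; in practice you would let the tokenizer's `apply_chat_template` build the prompt rather than hand-rolling it.

```python
def build_llama3_prompt(messages):
    """Build a Llama-3-style chat prompt from a list of
    {"role": ..., "content": ...} dicts.

    Illustrative only: assumes this checkpoint uses the standard
    Llama 3 chat template (true for stock Llama-3.3-Instruct).
    """
    prompt = "<|begin_of_text|>"
    for msg in messages:
        prompt += f"<|start_header_id|>{msg['role']}<|end_header_id|>\n\n"
        prompt += f"{msg['content']}<|eot_id|>"
    # Open the assistant turn so generation continues from here.
    prompt += "<|start_header_id|>assistant<|end_header_id|>\n\n"
    return prompt

example = build_llama3_prompt([
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Summarize the Llama 3 chat format."},
])
```

With the Hugging Face `transformers` library, the equivalent would be `tokenizer.apply_chat_template(messages, add_generation_prompt=True)`, which applies the template stored in the model's tokenizer config and avoids format drift.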
