j05hr3d/Llama-3.2-1B-Instruct-C_M_T-SAM-RHO0_025
Text generation · Concurrency cost: 1 · Model size: 1B · Quantization: BF16 · Context length: 32k · Published: Mar 26, 2026 · Architecture: Transformer

j05hr3d/Llama-3.2-1B-Instruct-C_M_T-SAM-RHO0_025 is a 1-billion-parameter instruction-tuned language model, fine-tuned by j05hr3d from the meta-llama/Llama-3.2-1B-Instruct base model. It was trained with supervised fine-tuning (SFT) using the TRL framework. The model is intended for general instruction-following text generation and retains the Llama-3.2 architecture and its 32,768-token context length.
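As a minimal sketch, the model could be loaded with the Hugging Face transformers library like any other Llama-3.2 instruct checkpoint. This assumes the repository is publicly available on the Hub; the prompt text and helper function name below are illustrative, not part of the model card.

```python
# Hedged usage sketch: loading the fine-tune via transformers.
# Assumes the checkpoint is public on the Hugging Face Hub and that
# transformers + torch are installed; generate_reply is a hypothetical helper.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "j05hr3d/Llama-3.2-1B-Instruct-C_M_T-SAM-RHO0_025"

def generate_reply(prompt: str, max_new_tokens: int = 64) -> str:
    """Download the model (~1B params, BF16) and generate a single reply."""
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="bfloat16")
    # Llama-3.2 instruct models expect the chat template, not raw text.
    messages = [{"role": "user", "content": prompt}]
    inputs = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    )
    output = model.generate(inputs, max_new_tokens=max_new_tokens)
    # Strip the prompt tokens, decode only the newly generated continuation.
    return tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True)

if __name__ == "__main__":
    print(generate_reply("Summarize supervised fine-tuning in one sentence."))
```

Applying the chat template (rather than passing raw text) matters here because the SFT training with TRL formats examples as conversations; raw-string prompting would bypass the instruction format the model was tuned on.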
