koutch/paper_llama_llama3.1-8b_train_sft_train_code
Text generation · Concurrency cost: 1 · Model size: 8B · Quantization: FP8 · Context length: 32k · Published: Jan 22, 2026 · License: apache-2.0 · Architecture: Transformer · Open weights · Cold
The koutch/paper_llama_llama3.1-8b_train_sft_train_code model is an 8-billion-parameter language model based on Llama 3.1, fine-tuned for code-related tasks. Developed by koutch, it was trained with Unsloth and Hugging Face's TRL library for accelerated fine-tuning. The model targets code generation and understanding, and its 32,768-token context length allows it to process extensive codebases in a single prompt.
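The model card does not include a usage snippet, so the following is a minimal sketch of loading the checkpoint with Hugging Face Transformers. The repo id comes from this card; the prompt format, generation parameters, and the `build_prompt` helper are assumptions for illustration, not a documented interface of this model.

```python
MODEL_ID = "koutch/paper_llama_llama3.1-8b_train_sft_train_code"

def build_prompt(instruction: str) -> str:
    """Wrap a coding instruction in a simple instruction/response template.
    The exact template this checkpoint expects is an assumption."""
    return f"### Instruction:\n{instruction}\n\n### Response:\n"

if __name__ == "__main__":
    # Heavy imports kept here so the helper above stays importable
    # without transformers installed.  pip install transformers torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype="auto",   # honors the published FP8/BF16 weights where supported
        device_map="auto",    # shard across available GPUs, or fall back to CPU
    )

    prompt = build_prompt("Write a Python function that reverses a string.")
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=256)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

An 8B model in FP8 needs roughly 8–10 GB of GPU memory for inference; `device_map="auto"` lets Accelerate place the weights on whatever hardware is available.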