Lixing-Li/Abyme-Llama-3.1-8B-SFT
Text generation · Concurrency cost: 1 · Model size: 8B · Quant: FP8 · Context length: 32k · Published: Mar 19, 2026 · License: apache-2.0 · Architecture: Transformer
Lixing-Li/Abyme-Llama-3.1-8B-SFT is an 8-billion-parameter Llama 3.1 model developed by Lixing-Li, fine-tuned using Unsloth and Hugging Face's TRL library. This instruction-tuned model leverages efficient training methods to deliver enhanced performance on general language tasks. It is designed for applications that need a capable 8B-parameter model with a 32,768-token context length.
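A minimal usage sketch with the `transformers` library, assuming the standard Llama 3.1 chat template and enough memory for an 8B model (the repo id is taken from this card; the prompt and helper names are illustrative):

```python
"""Minimal sketch: generate text with Lixing-Li/Abyme-Llama-3.1-8B-SFT."""

MODEL_ID = "Lixing-Li/Abyme-Llama-3.1-8B-SFT"


def build_chat(prompt: str) -> list[dict]:
    """Wrap a user prompt in the chat-message format used by instruct models."""
    return [{"role": "user", "content": prompt}]


def generate(prompt: str, max_new_tokens: int = 256) -> str:
    """Load the model and produce a completion for a single user prompt."""
    # Heavy dependencies are imported here so the helper above stays lightweight.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype="auto", device_map="auto"
    )
    # Apply the model's chat template and tokenize in one step.
    inputs = tokenizer.apply_chat_template(
        build_chat(prompt), add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)
    output = model.generate(inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, skipping the echoed prompt.
    return tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True)


if __name__ == "__main__":
    print(generate("Summarize what supervised fine-tuning does, in one sentence."))
```

The quantization listed above (FP8) refers to how the weights are served; when loading locally, `torch_dtype="auto"` picks the dtype stored in the checkpoint.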