XXsongLALA/Llama-3.1-8B-instruct-RAG-RL
Text generation · Model size: 8B · Quantization: FP8 · Context length: 32k · Architecture: Transformer · Concurrency cost: 1

XXsongLALA/Llama-3.1-8B-instruct-RAG-RL is an 8-billion-parameter instruction-tuned model based on the Llama 3.1 architecture, developed by XXsongLALA. The model card states it was trained from scratch; it supports a 32,768-token context window and is designed for general language understanding and generation tasks, with specific optimization for instruction following.
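Since the model follows the Llama 3.1 instruct conventions, prompts are typically laid out with role headers and end-of-turn markers. The sketch below shows one plausible way to assemble such a prompt by hand; the helper name `format_llama31_chat` is hypothetical, and the special tokens are assumed from the common Llama 3.1 instruct format, so verify them against this model's tokenizer (e.g. its chat template) before relying on them.

```python
def format_llama31_chat(messages):
    """Assemble a Llama 3.1-style instruct prompt from chat messages.

    `messages` is a list of {"role": ..., "content": ...} dicts.
    Token names below are assumed from the Llama 3.1 convention.
    """
    parts = ["<|begin_of_text|>"]
    for msg in messages:
        # Each turn: role header, blank line, content, end-of-turn marker.
        parts.append(
            f"<|start_header_id|>{msg['role']}<|end_header_id|>\n\n"
            f"{msg['content']}<|eot_id|>"
        )
    # Open the assistant header so the model generates the reply.
    parts.append("<|start_header_id|>assistant<|end_header_id|>\n\n")
    return "".join(parts)


prompt = format_llama31_chat([
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Summarize RAG in one sentence."},
])
```

In practice, passing the message list to the tokenizer's `apply_chat_template` (if the repository ships a chat template) is safer than hand-building the string, since it guarantees the exact token layout the model was trained on.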
