LLM-GAT/llama-3-8b-instruct-repnoise-checkpoint-8
TEXT GENERATION
Concurrency Cost: 1 | Model Size: 8B | Quant: FP8 | Ctx Length: 8k | Published: Aug 4, 2024 | Architecture: Transformer
LLM-GAT/llama-3-8b-instruct-repnoise-checkpoint-8 is an 8-billion-parameter instruction-tuned language model based on the Llama 3 architecture. As its name indicates, it is checkpoint 8 from a 'repnoise' training run, making it an intermediate or specialized variant of Llama 3 8B Instruct rather than the base release. With an 8192-token context window, it is designed for general conversational AI and instruction following, potentially with behavior specific to its 'repnoise' training. Its primary use case is as a foundation for natural language processing applications that require instruction adherence.
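Since the checkpoint follows the Llama 3 Instruct architecture, prompts must use the standard Llama 3 chat template. The sketch below assembles a single-turn prompt by hand to show that format; the special tokens are the documented Llama 3 Instruct template, while availability of the checkpoint under this exact Hub id is an assumption.

```python
# Hypothetical Hub id, taken from this model card.
MODEL_ID = "LLM-GAT/llama-3-8b-instruct-repnoise-checkpoint-8"

def format_llama3_prompt(system: str, user: str) -> str:
    """Assemble a single-turn Llama 3 Instruct prompt string.

    Uses the standard Llama 3 special tokens: <|begin_of_text|>,
    <|start_header_id|>/<|end_header_id|> role headers, and <|eot_id|>
    end-of-turn markers. The trailing assistant header cues generation.
    """
    return (
        "<|begin_of_text|>"
        "<|start_header_id|>system<|end_header_id|>\n\n"
        f"{system}<|eot_id|>"
        "<|start_header_id|>user<|end_header_id|>\n\n"
        f"{user}<|eot_id|>"
        "<|start_header_id|>assistant<|end_header_id|>\n\n"
    )

prompt = format_llama3_prompt(
    "You are a helpful assistant.",
    "Summarize this model in one sentence.",
)
```

In practice you would not build this string manually: loading the checkpoint's tokenizer with `transformers.AutoTokenizer.from_pretrained(MODEL_ID)` and calling `tokenizer.apply_chat_template(...)` produces the same format from a list of role/content messages.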