sdhossain24/Meta-Llama-3-8B-Instruct-SDD
Text generation · Model size: 8B · Quantization: FP8 · Context length: 8k · Architecture: Transformer · Concurrency cost: 1 · Published: Apr 26, 2026
sdhossain24/Meta-Llama-3-8B-Instruct-SDD is an 8 billion parameter instruction-tuned causal language model, fine-tuned from Meta-Llama-3-8B-Instruct. This model was trained using SFT with the TRL framework. It is designed for general text generation tasks, leveraging the capabilities of the Llama 3 architecture.
Overview
sdhossain24/Meta-Llama-3-8B-Instruct-SDD is an 8 billion parameter instruction-tuned language model derived from Meta-Llama-3-8B-Instruct and fine-tuned via supervised fine-tuning (SFT) with the TRL library.
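As a sketch of how the model might be used, the snippet below loads it through the Hugging Face `transformers` text-generation pipeline. This usage pattern is an assumption based on the model's stated base and library versions, not an officially published example; the prompt text is a placeholder.

```python
# Hypothetical usage sketch for this model via the transformers pipeline API.
MODEL_ID = "sdhossain24/Meta-Llama-3-8B-Instruct-SDD"

def generate(user_message: str, max_new_tokens: int = 256) -> str:
    """Run a single chat turn. Imports are deferred so the (large) model
    download only happens when the function is actually called."""
    from transformers import pipeline  # requires transformers + torch installed

    pipe = pipeline(
        "text-generation",
        model=MODEL_ID,
        torch_dtype="auto",   # let transformers pick a suitable dtype
        device_map="auto",    # place the model on available GPU(s) if any
    )
    messages = [{"role": "user", "content": user_message}]
    out = pipe(messages, max_new_tokens=max_new_tokens)
    # The chat pipeline returns the full conversation; take the last turn.
    return out[0]["generated_text"][-1]["content"]

if __name__ == "__main__":
    print(generate("Explain supervised fine-tuning in two sentences."))
```

Running this requires enough memory for an 8B-parameter model; FP8 or other quantized serving would use a backend-specific loading path instead.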
Key Capabilities
- Instruction Following: Optimized for understanding and executing user instructions.
- Text Generation: Capable of generating coherent and contextually relevant text based on prompts.
- Llama 3 Architecture: Benefits from the robust architecture of the Meta Llama 3 series.
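Because the model inherits the Llama 3 instruct chat format from its base, instruction-following prompts are expected to use that template. In practice `tokenizer.apply_chat_template` renders it automatically; the manual version below is for illustration only, and the example messages are placeholders.

```python
# Manual rendering of the Llama 3 instruct chat template (illustrative only;
# prefer tokenizer.apply_chat_template in real code).
def build_llama3_prompt(messages):
    """Render a list of {role, content} dicts in the Llama 3 chat format."""
    parts = ["<|begin_of_text|>"]
    for msg in messages:
        parts.append(
            f"<|start_header_id|>{msg['role']}<|end_header_id|>\n\n"
            f"{msg['content']}<|eot_id|>"
        )
    # Open an assistant header so the model generates the reply.
    parts.append("<|start_header_id|>assistant<|end_header_id|>\n\n")
    return "".join(parts)

prompt = build_llama3_prompt([
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Summarize the Llama 3 architecture in one sentence."},
])
```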
Training Details
The model underwent supervised fine-tuning (SFT) using the TRL framework (version 0.22.1), with Transformers 4.57.6, PyTorch 2.10.0+cu128, Datasets 4.8.4, and Tokenizers 0.22.2.
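An SFT run with TRL typically follows the shape sketched below. The actual dataset and hyperparameters used for this model are not published, so the data file, output directory, and all hyperparameter values here are placeholders, not the training recipe.

```python
# Illustrative TRL SFT sketch; dataset path and hyperparameters are placeholders.
def run_sft():
    from datasets import load_dataset
    from trl import SFTConfig, SFTTrainer

    # Placeholder dataset: a local JSONL file of training examples.
    dataset = load_dataset("json", data_files="train.jsonl", split="train")

    config = SFTConfig(
        output_dir="llama3-8b-instruct-sdd",  # placeholder output path
        per_device_train_batch_size=2,
        gradient_accumulation_steps=8,
        learning_rate=2e-5,
        num_train_epochs=1,
        bf16=True,
    )
    trainer = SFTTrainer(
        model="meta-llama/Meta-Llama-3-8B-Instruct",  # the stated base model
        args=config,
        train_dataset=dataset,
    )
    trainer.train()
```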
Good For
- General-purpose conversational AI.
- Question answering based on provided context.
- Creative writing and content generation tasks where instruction adherence is important.