ExTensaFort/Meta-Llama-3.1-8B-Instruct-Second-Brain-Summarization
Text Generation · Concurrency cost: 1 · Model size: 8B · Quant: FP8 · Context length: 32k · Published: Mar 22, 2026 · License: apache-2.0 · Architecture: Transformer · Open weights
ExTensaFort/Meta-Llama-3.1-8B-Instruct-Second-Brain-Summarization is an 8-billion-parameter instruction-tuned model from ExTensaFort, built on the Meta-Llama-3.1 architecture and fine-tuned specifically for summarization. It was trained with Unsloth and Hugging Face's TRL library, which sped up fine-tuning. Its 32,768-token context window lets it summarize long documents in a single pass.
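A minimal sketch of how the model might be invoked for summarization via the Hugging Face `transformers` text-generation pipeline. The system/user prompt wording and the `build_summarization_messages` helper are assumptions for illustration, not documented usage from this model card; the model ID is the one above.

```python
MODEL_ID = "ExTensaFort/Meta-Llama-3.1-8B-Instruct-Second-Brain-Summarization"

def build_summarization_messages(text: str) -> list[dict]:
    """Build a chat-style prompt asking the model to summarize `text`.

    The exact wording here is illustrative; any standard Llama 3.1
    instruction-style chat messages should work.
    """
    return [
        {"role": "system",
         "content": "You are an assistant that writes concise summaries."},
        {"role": "user",
         "content": f"Summarize the following text:\n\n{text}"},
    ]

if __name__ == "__main__":
    # Downloads ~8B FP8 weights on first run and needs a suitable GPU.
    from transformers import pipeline

    summarizer = pipeline("text-generation", model=MODEL_ID)
    messages = build_summarization_messages("Long note text goes here...")
    out = summarizer(messages, max_new_tokens=256)
    # The pipeline returns the full chat; the last message is the summary.
    print(out[0]["generated_text"][-1]["content"])
```

Because the model supports a 32k context, fairly long notes can be passed in one call rather than chunked first.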