ElenaSenger/DiSTER-Llama-3-8B-Instruct
Text generation · Model size: 8B · Quantization: FP8 · Context length: 8k · Published: Oct 6, 2025 · License: llama3 · Architecture: Transformer
ElenaSenger/DiSTER-Llama-3-8B-Instruct is an 8-billion-parameter instruction-tuned causal language model, fine-tuned by ElenaSenger from Meta-Llama-3-8B-Instruct. The model is optimized for cross-domain technical and scientific term extraction: given input text, it outputs a list of domain-relevant terms. It supports an 8192-token context window and is specialized for tasks requiring precise term identification.
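A minimal sketch of querying the model for term extraction via the Hugging Face `transformers` pipeline. The system-prompt wording and the expected output format below are assumptions for illustration; check the model card for the exact instruction format the fine-tune was trained on.

```python
def build_messages(text: str) -> list[dict]:
    """Llama-3 chat messages asking for a list of technical terms.

    The instruction wording here is an assumption, not the documented
    prompt format for this fine-tune.
    """
    return [
        {
            "role": "system",
            "content": (
                "Extract the domain-relevant technical and scientific "
                "terms from the user's text. Return one term per line."
            ),
        },
        {"role": "user", "content": text},
    ]


if __name__ == "__main__":
    # Loading the 8B checkpoint requires a GPU with sufficient memory.
    from transformers import pipeline

    generator = pipeline(
        "text-generation",
        model="ElenaSenger/DiSTER-Llama-3-8B-Instruct",
    )
    sample = (
        "Transformer attention computes scaled dot-products "
        "over query and key vectors."
    )
    out = generator(build_messages(sample), max_new_tokens=128)
    # The last message in the returned conversation is the model's reply.
    print(out[0]["generated_text"][-1]["content"])
```

The chat-message structure follows the standard Llama-3 instruct convention, which the pipeline converts to the model's chat template automatically.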