ElenaSenger/DiSTER-Llama-3-8B-Instruct is an 8-billion-parameter instruction-tuned causal language model, fine-tuned by ElenaSenger from Meta-Llama-3-8B-Instruct. The model is optimized for cross-domain extraction of technical and scientific terms: given an input text, it outputs a list of domain-relevant terms. It supports an 8,192-token context length and is specialized for tasks requiring precise term identification.
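Since the base model is Meta-Llama-3-8B-Instruct, a typical way to drive it is to wrap the input text in the Llama 3 Instruct chat template and parse the completion into a term list. The sketch below illustrates that flow; the instruction wording and the assumption that the model returns one term per line are illustrative and not taken from the model card.

```python
# Sketch: prompt construction and output parsing for a Llama-3-Instruct-based
# term-extraction model. The prompt wording and the one-term-per-line output
# format are assumptions for illustration.

def build_prompt(text: str) -> str:
    """Wrap the input text in the Llama 3 Instruct chat template."""
    instruction = (
        "Extract all technical and scientific terms from the following "
        f"text. Return one term per line.\n\n{text}"
    )
    return (
        "<|begin_of_text|><|start_header_id|>user<|end_header_id|>\n\n"
        f"{instruction}<|eot_id|>"
        "<|start_header_id|>assistant<|end_header_id|>\n\n"
    )

def parse_terms(completion: str) -> list[str]:
    """Split a newline-separated completion into a clean list of terms."""
    return [
        line.strip("-• ").strip()
        for line in completion.splitlines()
        if line.strip()
    ]

# Hypothetical model response, shown only to demonstrate the parser:
response = "gradient descent\nbackpropagation\nconvolutional neural network"
terms = parse_terms(response)
```

The prompt string can then be passed to any inference backend that serves the model (e.g. a `transformers` text-generation pipeline), with the completion fed through `parse_terms`.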