EXAONE-3.0-7.8b-it Overview
EXAONE-3.0-7.8b-it is an instruction-tuned language model with approximately 7.8 billion parameters, using a Llama-style decoder architecture. Originally developed by LG AI Research, with this build published by Bingsu, the model is notable for its bilingual capabilities, supporting both English and Korean.
Key Capabilities
- Bilingual Support: Excels in processing and generating text in both English and Korean.
- Instruction-Tuned: Optimized to follow instructions effectively for a wide range of tasks.
- General-Purpose Assistant: Capable of handling diverse conversational and generative AI applications.
Usage Notes
- The model's chat template is designed for llama-cpp-python and uses Jinja2; compatibility with other runtimes may vary.
- Metadata indicates a context length of 4096 tokens, a vocabulary size of 102400, and a rope.freq_base of 500000.0.
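Since runtimes that ignore the embedded Jinja2 chat template still need a correctly structured prompt, one option is to format chat turns by hand. Below is a minimal sketch; the `[|role|]` and `[|endofturn|]` markers are assumptions based on EXAONE's published turn format and should be verified against the Jinja2 template stored in the model's metadata:

```python
# Manual chat-prompt formatter sketch.
# The turn markers are assumptions; confirm them against the Jinja2
# chat template embedded in the GGUF metadata before relying on this.

def format_chat(messages: list[dict]) -> str:
    """Render a list of {'role', 'content'} dicts into a single prompt string."""
    parts = []
    for msg in messages:
        parts.append(f"[|{msg['role']}|]{msg['content']}[|endofturn|]")
    # Leave an open assistant turn for the model to complete.
    parts.append("[|assistant|]")
    return "\n".join(parts)

prompt = format_chat([
    {"role": "system", "content": "You are a helpful bilingual assistant."},
    {"role": "user", "content": "안녕하세요! Please reply in English."},
])
print(prompt)
```

When using llama-cpp-python directly, passing the messages list to `create_chat_completion` lets the runtime apply the embedded template instead, which is the safer default; keep `n_ctx` at or below the 4096-token limit reported in the metadata.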
Good For
- Applications requiring robust performance in both English and Korean.
- Building conversational agents and assistants that need to follow complex instructions.
- General text generation and understanding tasks where a balanced bilingual model is beneficial.