ellamind/propella-1-4b is a 4-billion-parameter multilingual large language model from the propella-1 family, developed by ellamind. It is designed for annotating text documents across 18 properties grouped into six categories, including content quality, educational value, and safety. The model is built for high-throughput inference when curating LLM training data at scale, supports 57 languages, and handles text formats such as web pages, PDFs, and code.
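A minimal usage sketch follows, assuming the model is hosted on the Hugging Face Hub and loads with the standard `transformers` causal-LM classes; the prompt wording and the property names shown are illustrative placeholders, not the model's documented interface.

```python
# Hypothetical sketch: load ellamind/propella-1-4b with transformers and ask it
# to annotate a document. The prompt format and property names are assumptions,
# not the model's documented API.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ellamind/propella-1-4b"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

document = "Photosynthesis converts light energy into chemical energy in plants."

# Illustrative annotation prompt; the real interface may expect a different format.
prompt = (
    "Annotate the following document for content quality, educational value, "
    f"and safety:\n\n{document}\n\nAnnotations:"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)

# Print only the newly generated tokens (the annotations).
generated = outputs[0][inputs["input_ids"].shape[1]:]
print(tokenizer.decode(generated, skip_special_tokens=True))
```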