NingLab/eCeLLM-S
Text Generation · Model size: 3B · Quantization: BF16 · Context length: 2K · Concurrency cost: 1 · Published: Feb 14, 2024 · License: CC-BY-4.0 · Architecture: Transformer
NingLab/eCeLLM-S is a 3 billion parameter instruction-tuned language model developed by NingLab, based on the Phi-2 architecture. It is optimized for e-commerce applications through fine-tuning on large-scale, high-quality instruction data, giving it specialized language understanding and generation capabilities for tasks in that domain.
eCeLLM-S: E-commerce Specialized LLM
eCeLLM-S is a 3 billion parameter instruction-tuned large language model developed by NingLab, designed specifically for e-commerce applications. It is built upon the Phi-2 base model and fine-tuned on the ECInstruct dataset, a large-scale collection of high-quality instruction data covering e-commerce tasks.
Key Capabilities
- E-commerce Specific Understanding: Optimized for processing and generating language related to product descriptions, customer reviews, marketing content, and other e-commerce contexts.
- Instruction Following: Enhanced ability to follow instructions for tasks within the e-commerce sector due to specialized instruction tuning.
- Efficient Performance: As a 3 billion parameter model, eCeLLM-S offers a balance between performance and computational efficiency, making it suitable for various e-commerce deployments.
Good For
- Developers building applications that require specialized language understanding and generation for the e-commerce industry.
- Tasks such as product summarization, review analysis, customer service automation, and content generation for online retail platforms.
- Researchers interested in domain-specific instruction tuning for large language models.
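As a sketch of how the model might be used for one of the tasks above (product summarization), the snippet below loads the weights with the Hugging Face `transformers` library. The prompt template in `build_prompt` is an illustrative assumption, not the exact instruction format documented by NingLab; consult the model card for the template used during ECInstruct tuning.

```python
def build_prompt(instruction: str, task_input: str) -> str:
    """Assemble an instruction-style prompt.

    NOTE: this Instruction/Input/Output layout is a hypothetical example,
    not the documented eCeLLM prompt format.
    """
    return f"Instruction: {instruction}\nInput: {task_input}\nOutput:"


def summarize_product(description: str, max_new_tokens: int = 128) -> str:
    # Import lazily so the prompt helper works without the heavy dependency.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("NingLab/eCeLLM-S")
    model = AutoModelForCausalLM.from_pretrained(
        "NingLab/eCeLLM-S", torch_dtype="auto"  # BF16 weights per the card
    )

    prompt = build_prompt("Summarize the following product description.", description)
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, skipping the echoed prompt.
    new_tokens = outputs[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)


if __name__ == "__main__":
    print(summarize_product("Wireless earbuds with 24-hour battery life, "
                            "Bluetooth 5.3, and an IPX5 water-resistance rating."))
```

With a 3B parameter model in BF16, this fits comfortably on a single consumer GPU (roughly 6 GB of weights); for CPU-only deployments, a quantized variant would be the more practical choice.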