OEvortex/HelpingAI-Lite-1.5T
Text Generation · Model size: 1.1B · Quant: BF16 · Context length: 2k · Published: Mar 9, 2024 · License: HSUL · Architecture: Transformer

OEvortex/HelpingAI-Lite-1.5T is a 1.1 billion parameter language model developed by OEvortex and trained on 1.5 trillion tokens, including specialized datasets such as bigcode/starcoderdata. This extended-training version of HelpingAI-Lite is optimized for precise, insightful responses on coding tasks, and its large English-language training corpus strengthens its general language processing as well.
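A minimal usage sketch with the Hugging Face `transformers` library is shown below. The `build_prompt` helper and its instruction/response format are assumptions for illustration, not the model's documented chat template; the BF16 dtype and the 256-token generation budget simply mirror the quant and 2k context length listed above.

```python
MODEL_ID = "OEvortex/HelpingAI-Lite-1.5T"

def build_prompt(instruction: str) -> str:
    """Wrap a user instruction in a simple prompt layout.
    NOTE: this format is a placeholder assumption; check the model
    card for the template the model was actually trained with."""
    return f"### Instruction:\n{instruction}\n\n### Response:\n"

def main() -> None:
    # Imports are kept inside main() so the prompt helper above can be
    # used without torch/transformers installed.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.bfloat16,  # matches the BF16 quant listed above
    )

    prompt = build_prompt("Write a Python function that reverses a string.")
    inputs = tokenizer(prompt, return_tensors="pt")
    # 2k context window: keep prompt length plus new tokens within it.
    output = model.generate(**inputs, max_new_tokens=256)
    print(tokenizer.decode(output[0], skip_special_tokens=True))

if __name__ == "__main__":
    main()
```

Running the script downloads the weights on first use; the exact output will vary with sampling settings.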
