lightgpt/LightGPT-0.5B-Qwen2
TEXT GENERATION
Concurrency Cost: 1
Model Size: 0.5B
Quantization: BF16
Context Length: 32k
Published: Jul 22, 2024
License: MIT
Architecture: Transformer
Status: Open Weights, Warm
The lightgpt/LightGPT-0.5B-Qwen2 model is a 0.5-billion-parameter language model based on the Qwen2 architecture, developed by LightGPT. It is trained and optimized to act as an agent in traffic signal control systems, as detailed in the LLMLight research. With a context length of 32768 tokens, the model can process complex environmental state descriptions for intelligent traffic management applications.
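A minimal sketch of loading and prompting the model with Hugging Face transformers, assuming the weights are published on the Hub under this repo id; the prompt text here is a hypothetical placeholder, not the exact observation format LLMLight uses:

```python
# Sketch: load LightGPT-0.5B-Qwen2 via transformers (assumes the repo id
# "lightgpt/LightGPT-0.5B-Qwen2" is downloadable from the Hugging Face Hub).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "lightgpt/LightGPT-0.5B-Qwen2"
tokenizer = AutoTokenizer.from_pretrained(model_id)
# BF16 matches the published quantization of the weights.
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16)

# Hypothetical traffic-state prompt; the real LLMLight observation format
# encodes lane queues, waiting times, and allowed signal phases.
prompt = "Intersection state: northbound queue 12 vehicles, eastbound queue 3."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Because the model is only 0.5B parameters, it can run on a single consumer GPU or CPU, which suits deployment at the edge near signal controllers.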