NLUHOPOE/test-case-2
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Feb 20, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights

NLUHOPOE/test-case-2 is a 7-billion-parameter large language model developed by Juhwan Lee, fine-tuned from Mistral-7B-v0.1 for data-ordering tasks. Like its base model, it uses Grouped-Query Attention, Sliding-Window Attention, and a byte-fallback BPE tokenizer. Its primary application is scenarios that require precise arrangement of data sequences.
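As a minimal sketch of how the model might be queried for an ordering task, the snippet below loads it through the Hugging Face `transformers` library. The prompt template (`build_ordering_prompt`) is an assumption for illustration; the model card does not document an official prompt format.

```python
def build_ordering_prompt(items):
    """Format a list of items into a simple ordering instruction.
    Hypothetical template -- not documented by the model card."""
    listing = "\n".join(f"- {item}" for item in items)
    return (
        "Arrange the following items in the correct order:\n"
        f"{listing}\n"
        "Ordered result:"
    )

if __name__ == "__main__":
    # Assumes `transformers` and `torch` are installed and the
    # checkpoint is available on the Hugging Face Hub.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("NLUHOPOE/test-case-2")
    model = AutoModelForCausalLM.from_pretrained("NLUHOPOE/test-case-2")

    prompt = build_ordering_prompt(
        ["step 3: serve", "step 1: boil water", "step 2: add pasta"]
    )
    inputs = tokenizer(prompt, return_tensors="pt")
    output = model.generate(**inputs, max_new_tokens=64)
    print(tokenizer.decode(output[0], skip_special_tokens=True))
```

The heavy imports sit inside the `__main__` guard so the prompt helper can be reused without pulling in the model stack.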
