NLUHOPOE/test-case-6
Task: Text Generation
Model Size: 7B
Quantization: FP8
Context Length: 4k
Published: Feb 28, 2024
License: apache-2.0
Architecture: Transformer (Open Weights)
Concurrency Cost: 1
NLUHOPOE/test-case-6 is a 7-billion-parameter large language model developed by Juhwan Lee and fine-tuned for data-ordering tasks. Built on the Mistral-7B-v0.1 architecture, it incorporates Grouped-Query Attention, Sliding-Window Attention, and a byte-fallback BPE tokenizer. The model was fine-tuned on a random sample of the SlimOrca dataset and is designed for tasks that require structured data arrangement and sequence prediction. Its primary application is in scenarios where precise data ordering is critical.
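The card does not document an official prompt template for the data-ordering task, so the following is only a minimal sketch of how one might frame such a request before passing it to the model for generation; the function name and prompt wording are hypothetical, not part of the published model card.

```python
def build_ordering_prompt(records):
    """Assemble a hypothetical data-ordering prompt from a list of records.

    The numbered layout is an illustrative convention, not a documented
    format for NLUHOPOE/test-case-6.
    """
    numbered = "\n".join(f"{i}. {rec}" for i, rec in enumerate(records, start=1))
    return (
        "Arrange the following records in the correct order.\n"
        f"{numbered}\n"
        "Answer with the reordered list only."
    )
```

The resulting string would then be tokenized and passed to the model like any other causal-LM prompt, keeping the total length within the 4k context window.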