NLUHOPOE/test-case-5
Task: Text generation · Model size: 7B · Quantization: FP8 · Context length: 4k · Published: Feb 18, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights

NLUHOPOE/test-case-5 is a 7-billion-parameter large language model developed by Juhwan Lee and fine-tuned for data-ordering tasks. It is based on the Mistral-7B-v0.1 architecture, which incorporates Grouped-Query Attention and Sliding-Window Attention, and is optimized for tasks requiring structured data arrangement and sequence prediction.
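The Sliding-Window Attention mentioned above restricts each token to attend only to itself and a fixed number of preceding tokens, rather than the full causal history. A minimal NumPy sketch of the resulting attention mask (the sequence length and window size here are illustrative, not the model's actual values):

```python
import numpy as np

def sliding_window_mask(seq_len: int, window: int) -> np.ndarray:
    """Boolean mask: True where query position i may attend to key position j.

    A position attends causally (j <= i) but only within `window` tokens
    back, i.e. j > i - window.
    """
    i = np.arange(seq_len)[:, None]  # query positions, column vector
    j = np.arange(seq_len)[None, :]  # key positions, row vector
    return (j <= i) & (j > i - window)

# Example: 6 tokens, window of 3 — token 5 sees tokens 3, 4, and 5 only.
mask = sliding_window_mask(6, 3)
```

With a window of size w, each row of the mask has at most w True entries, which is what keeps memory and compute linear in sequence length instead of quadratic.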
