Guanglong/mojing-llm-7b
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · License: apache-2.0 · Architecture: Transformer · Open Weights · Cold

Guanglong/mojing-llm-7b is a 7 billion parameter language model developed by Guanglong, fine-tuned from Llama-2-7b. The model is instruction-tuned on the mojing-llm dataset, and its primary strength is this specialized fine-tuning: it is best suited to instruction-following tasks that resemble the examples in that dataset.
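Because the model is fine-tuned from Llama-2-7b, prompts most likely follow the Llama-2 chat format. The sketch below builds such a prompt; the `[INST]` and `<<SYS>>` tag strings are the standard Llama-2 ones, and it is an assumption that the mojing-llm fine-tune preserved them.

```python
# Sketch of Llama-2-style prompt construction. Assumption: mojing-llm-7b
# kept the chat template of its Llama-2-7b base model.

def build_llama2_prompt(user_message: str, system_prompt: str = "") -> str:
    """Wrap a user message (and optional system prompt) in Llama-2 chat tags."""
    if system_prompt:
        sys_block = f"<<SYS>>\n{system_prompt}\n<</SYS>>\n\n"
    else:
        sys_block = ""
    return f"<s>[INST] {sys_block}{user_message} [/INST]"

prompt = build_llama2_prompt(
    "Summarize the plot of Journey to the West.",
    system_prompt="Answer concisely.",
)
print(prompt)
```

The resulting string would then be passed to the model's tokenizer and generation call; the exact serving API depends on where the model is hosted.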
