liuda1/dm7b_sft_gpt88w_merge
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Dec 25, 2023 · License: apache-2.0 · Architecture: Transformer · Open Weights
liuda1/dm7b_sft_gpt88w_merge is a 7-billion-parameter language model developed by liuda1, fine-tuned on an English chat dataset and further trained on additional, unspecified datasets. The model shows improved chat capabilities, particularly in English, and is intended for conversational AI applications. Its context length of 4096 tokens makes it suitable for processing moderately sized conversational inputs.
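Because the context window tops out at 4096 tokens, a chat application using this model needs to trim older turns before each request. A minimal sketch of one way to do that is below; `count_tokens` is a hypothetical stand-in for the model's real tokenizer, and the whitespace-split heuristic is only a rough proxy for actual token counts.

```python
def count_tokens(text: str) -> int:
    # Rough proxy: ~1 token per whitespace-separated word.
    # A real application would use the model's tokenizer instead.
    return len(text.split())

def trim_history(turns: list[str], max_tokens: int = 4096) -> list[str]:
    """Drop the oldest turns until the remaining ones fit the budget."""
    kept: list[str] = []
    total = 0
    # Walk from newest to oldest so the most recent context survives.
    for turn in reversed(turns):
        cost = count_tokens(turn)
        if total + cost > max_tokens:
            break
        kept.append(turn)
        total += cost
    return list(reversed(kept))
```

In practice the budget would also reserve room for the model's reply (e.g. trim to 4096 minus the planned `max_new_tokens`), but the sliding-window idea is the same.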