tenyx/TenyxChat-7B-v1
Task: Text generation
Model size: 7B
Quantization: FP8
Context length: 4k
Concurrency cost: 1
Published: Jan 5, 2024
License: apache-2.0
Architecture: Transformer
Weights: Open

TenyxChat-7B-v1 is a 7-billion-parameter instruction-tuned causal language model developed by Tenyx, built on OpenChat 3.5. It is aligned using Direct Preference Optimization (DPO) on the UltraFeedback dataset, leveraging Tenyx's fine-tuning technology to mitigate catastrophic forgetting. The model is designed as a general-purpose assistant: it excels on multi-turn chat benchmarks such as MT-Bench while maintaining performance on general reasoning tasks.
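Since the model is built on OpenChat 3.5, a minimal sketch of multi-turn prompt construction might look like the following. This assumes the model inherits OpenChat's "GPT4 Correct" conversation format with `<|end_of_turn|>` separators; the tokenizer's own chat template (e.g. via `tokenizer.apply_chat_template`) should be treated as authoritative.

```python
def build_prompt(turns):
    """Build a chat prompt from (role, text) pairs, role in {"user", "assistant"}.

    Assumption: OpenChat 3.5-style "GPT4 Correct" formatting; verify
    against the model's actual chat template before use.
    """
    parts = []
    for role, text in turns:
        prefix = "GPT4 Correct User" if role == "user" else "GPT4 Correct Assistant"
        parts.append(f"{prefix}: {text}<|end_of_turn|>")
    # Leave the assistant prefix open so the model generates the reply.
    parts.append("GPT4 Correct Assistant:")
    return "".join(parts)

prompt = build_prompt([
    ("user", "What is DPO?"),
])
```

The resulting string can be passed to any completion endpoint serving the model; for multi-turn use, append each assistant reply to `turns` and rebuild the prompt.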
