TunyTrinh/test_mistral_03
Text generation · Model size: 7B · Quantization: FP8 · Context length: 4K · Concurrency cost: 1 · Architecture: Transformer · Published: Apr 5, 2024

TunyTrinh/test_mistral_03 is a 7-billion-parameter language model created by TunyTrinh. It was produced by merging minhtt/vistral-7b-chat and EmbeddedLLM/Mistral-7B-Merge-14-v0.3 using the SLERP (spherical linear interpolation) method, which blends the weights of the two parent models. The merged model retains a 4096-token context length and is intended to combine the capabilities of its predecessors for general language generation tasks.
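To illustrate the interpolation at the heart of a SLERP merge, here is a minimal sketch of spherical linear interpolation applied to two weight tensors. This is illustrative only: the actual merge was presumably produced with a dedicated merging toolkit (an assumption; the page does not name the tool), and real merges apply per-layer interpolation factors rather than a single scalar.

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two tensors.

    t=0 returns v0, t=1 returns v1; intermediate t values move
    along the great-circle arc between the two (normalized) tensors.
    """
    v0_f = v0.flatten().astype(np.float64)
    v1_f = v1.flatten().astype(np.float64)
    n0, n1 = np.linalg.norm(v0_f), np.linalg.norm(v1_f)
    # Angle between the two tensors, clipped for numerical safety.
    dot = np.clip(np.dot(v0_f / n0, v1_f / n1), -1.0, 1.0)
    omega = np.arccos(dot)
    if omega < eps:
        # Nearly colinear tensors: fall back to plain linear interpolation.
        return (1 - t) * v0 + t * v1
    s = np.sin(omega)
    return (np.sin((1 - t) * omega) / s) * v0 + (np.sin(t * omega) / s) * v1

# Toy example: interpolating halfway between two orthogonal unit vectors
# lands on the arc between them, preserving unit norm.
a = np.array([1.0, 0.0])
b = np.array([0.0, 1.0])
mid = slerp(0.5, a, b)
```

Unlike simple weight averaging, SLERP preserves the norm of the interpolated weights when the endpoints share a norm, which is one reason it is a popular choice for model merging.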
