wenqiglantz/MistralTrinity-7B-slerp
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quantization: FP8 · Context Length: 4k · Published: Jan 13, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights · Cold
MistralTrinity-7B-slerp by wenqiglantz is a 7-billion-parameter language model created by merging Mistral-7B-Instruct-v0.2 and jan-hq/trinity-v1 using SLERP (spherical linear interpolation). The merge combines the strengths of its two parent models, offering a versatile foundation for general-purpose text generation. Its 4096-token context length supports applications with moderate input and output lengths.