ntnq/Stork-7B-slerp
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Architecture: Transformer

Stork-7B-slerp is a 7-billion-parameter language model created by ntnq, formed by a SLERP (spherical linear interpolation) merge of bofenghuang/vigostral-7b-chat and jpacifico/French-Alpaca-7B-Instruct-beta. The merge combines the strengths of its base components: a general-purpose chat model and a model tuned specifically for French instruction following. It is designed for conversational AI applications that need a blend of broad knowledge and specialized French-language capability within a 4096-token context window.
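A SLERP merge interpolates each pair of corresponding weight tensors along the great circle between them rather than along a straight line, which tends to preserve weight norms better than plain averaging. A minimal sketch of the per-tensor interpolation step, assuming a mergekit-style fallback to linear interpolation for near-colinear tensors (the actual merge configuration and interpolation factor used for this model are not published here):

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight tensors.

    t: interpolation factor in [0, 1]; 0 returns v0, 1 returns v1.
    Falls back to linear interpolation when the flattened tensors
    are nearly colinear, where the spherical formula is unstable.
    """
    v0f = v0.ravel().astype(np.float64)
    v1f = v1.ravel().astype(np.float64)
    # Angle between the two tensors, computed on unit vectors
    dot = np.dot(v0f / np.linalg.norm(v0f), v1f / np.linalg.norm(v1f))
    if abs(dot) > 1.0 - eps:
        # Nearly parallel: plain lerp is numerically safe here
        return (1.0 - t) * v0 + t * v1
    omega = np.arccos(np.clip(dot, -1.0, 1.0))
    s0 = np.sin((1.0 - t) * omega) / np.sin(omega)
    s1 = np.sin(t * omega) / np.sin(omega)
    return s0 * v0 + s1 * v1
```

Applied tensor-by-tensor across two checkpoints, t = 0 reproduces the first model, t = 1 the second, and intermediate values blend the two along the sphere; merge tools commonly allow different t values per layer or per parameter group.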
