jeiku/Lunar_10.7B
Text generation · Concurrency cost: 1 · Model size: 10.7B · Quant: FP8 · Context length: 4k · Published: Feb 19, 2024 · License: cc-by-nc-sa-4.0 · Architecture: Transformer · Open weights
jeiku/Lunar_10.7B is a 10.7-billion-parameter language model developed by jeiku, created by SLERP-merging with Sao10K/Sensualize-Solar-10.7B. The model is fine-tuned for companion-bot applications and handles both intimate and general conversation, aiming to deliver engaging, personalized dialogue for use cases that need a conversational AI with a consistent persona.
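The card states the weights were produced by SLERP (spherical linear interpolation) merging. As a minimal sketch of what SLERP does to a pair of equal-shaped weight tensors (the function name and numpy-based implementation here are illustrative, not the actual merge tooling used for this model):

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight tensors.

    t=0 returns v0, t=1 returns v1; intermediate t values follow the
    great-circle arc between the (normalized) tensors rather than a
    straight line, which better preserves weight magnitudes when merging.
    """
    v0f = v0.flatten()
    v1f = v1.flatten()
    # Cosine of the angle between the two tensors, clipped for safety
    dot = np.dot(v0f / np.linalg.norm(v0f), v1f / np.linalg.norm(v1f))
    dot = np.clip(dot, -1.0, 1.0)
    omega = np.arccos(dot)
    if np.abs(np.sin(omega)) < eps:
        # Nearly parallel tensors: fall back to plain linear interpolation
        return (1.0 - t) * v0 + t * v1
    s0 = np.sin((1.0 - t) * omega) / np.sin(omega)
    s1 = np.sin(t * omega) / np.sin(omega)
    return s0 * v0 + s1 * v1
```

In practice a SLERP merge applies this per layer (or per parameter tensor) across two checkpoints, often with a different interpolation factor per layer.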