leejaymin/etri-ones-solar
Text Generation · Open Weights

- Model Size: 10.7B
- Quantization: FP8
- Context Length: 4k
- Concurrency Cost: 1
- Published: Mar 31, 2024
- License: MIT
- Architecture: Transformer

etri-ones-solar is a 10.7-billion-parameter auto-regressive language model developed by leejaymin, fine-tuned from the SOLAR-10.7B-v1.0 base model (a transformer architecture) with a context length of 4096 tokens. It was fine-tuned on an open instruction dataset, making it suitable for general language-generation tasks.
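As a minimal sketch, the model could be loaded with the Hugging Face `transformers` library, assuming the weights are published under the repo id `leejaymin/etri-ones-solar`; the prompt template shown is a generic instruction format, since the card does not document the exact template used during fine-tuning:

```python
MODEL_ID = "leejaymin/etri-ones-solar"  # assumed Hugging Face repo id
MAX_CONTEXT = 4096  # context length stated on the card


def build_prompt(instruction: str) -> str:
    # Generic instruction-style prompt; the actual fine-tuning template
    # is not documented on this card, so this is an assumption.
    return f"### Instruction:\n{instruction}\n\n### Response:\n"


if __name__ == "__main__":
    # Heavy imports and the ~10.7B download happen only when run directly.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")

    prompt = build_prompt("Explain what an auto-regressive language model is.")
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=256)
    print(tokenizer.decode(output[0], skip_special_tokens=True))
```

Generation requests should keep the prompt plus `max_new_tokens` within the 4096-token context window.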
