Sao10K/L3.1-8B-Niitama-v1.1
Type: Text Generation
Concurrency Cost: 1
Model Size: 8B
Quant: FP8
Context Length: 32k
Published: Aug 3, 2024
License: cc-by-nc-4.0
Architecture: Transformer
Availability: Open Weights

Sao10K/L3.1-8B-Niitama-v1.1 is an experimental 8-billion-parameter language model developed by Sao10K, with a 32,768-token context length. Compared with its L3 counterpart, it explores alternative data shuffling and formatting methods during training, which gives it distinct performance characteristics. It is intended for experimental evaluation of these training methodologies rather than for specific production use cases.
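A minimal sketch of querying the model for text generation through an OpenAI-compatible chat-completions endpoint. The base URL and API-key environment variable are assumptions, not confirmed by this page; substitute your provider's actual values.

```python
# Sketch: call Sao10K/L3.1-8B-Niitama-v1.1 via an OpenAI-compatible API.
# The base_url and PROVIDER_API_KEY env var below are assumptions --
# replace them with your inference provider's endpoint and credentials.
import os

from openai import OpenAI

client = OpenAI(
    base_url="https://api.example-provider.com/v1",  # assumed endpoint
    api_key=os.environ["PROVIDER_API_KEY"],          # assumed env var name
)

response = client.chat.completions.create(
    model="Sao10K/L3.1-8B-Niitama-v1.1",
    messages=[
        {"role": "user", "content": "Write the opening line of a short story."},
    ],
    max_tokens=256,  # well within the model's 32k context window
)
print(response.choices[0].message.content)
```

Because the model is published as open weights under cc-by-nc-4.0, it can also be served locally with any runtime that loads FP8-quantized 8B checkpoints, subject to the non-commercial license terms.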
