alnrg2arg/blockchainlabs_tinyllama_fusion_LHK_yunkong_v2
Text generation · Concurrency cost: 1 · Model size: 1.1B · Quantization: BF16 · Context length: 2k · Published: Feb 19, 2024 · License: MIT · Architecture: Transformer · Open weights
alnrg2arg/blockchainlabs_tinyllama_fusion_LHK_yunkong_v2 is a 1.1 billion parameter language model based on TinyLlama, created by alnrg2arg. It is a fusion of three models: TinyLlama-1.1B-Chat-v1.0, HanNayeoniee/LHK_DPO_v1, and yunconglong/Truthful_DPO_TomGrc_FusionNet_7Bx2_MoE_13B. The fusion strategy combines the strengths of these models while keeping the parameter count small, targeting on-device small language model (sLM) applications and other resource-constrained environments.
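Since the model is published with open weights in BF16, it can presumably be loaded with the standard Hugging Face transformers API. The sketch below is an assumption based on that API, not an officially documented usage snippet; the model's expected prompt or chat template is not specified on this page, so a plain text prompt is used for illustration.

```python
# Hedged sketch: assumes the standard transformers AutoModelForCausalLM API
# works for this checkpoint, as it does for TinyLlama-based models generally.
MODEL_ID = "alnrg2arg/blockchainlabs_tinyllama_fusion_LHK_yunkong_v2"


def generate(prompt: str, max_new_tokens: int = 128) -> str:
    """Load the 1.1B fusion model in BF16 and generate a completion.

    Imports are deferred so this module can be inspected without
    torch/transformers installed or the weights downloaded.
    """
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.bfloat16,  # matches the published BF16 weights
        device_map="auto",
    )
    # Keep the prompt within the model's 2k context window.
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)


# Example (downloads ~2.2 GB of weights on first call):
#   print(generate("Explain model fusion in one sentence."))
```

Because the model is only 1.1B parameters, BF16 inference fits comfortably on a single consumer GPU or, with quantization, on CPU-only and mobile-class hardware.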