BarryFutureman/NeuralLake-Variant1-7B
TEXT GENERATION
Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Jan 23, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights · Cold

NeuralLake-Variant1-7B is a 7-billion-parameter language model published by BarryFutureman, created by merging pre-trained models with mergekit. It supports a 4096-token context length and is a merge of BarryFutureman/WildWest-Variant3-7B, BarryFutureman/NeuralTurdusVariant1-7B, and alnrg2arg/blockchainlabs_7B_merged_test2_4. The merge is intended to combine the capabilities of its constituent models in a single 7B checkpoint.
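The exact merge recipe is not documented on this page, but a mergekit merge of three 7B models is typically specified in a YAML configuration. The sketch below is purely illustrative: the merge method (`linear`), the per-model weights, and the dtype are assumptions, not the author's actual settings.

```yaml
# Hypothetical mergekit config for a three-way merge of the listed
# constituent models. Method, weights, and dtype are illustrative only;
# the actual recipe used for NeuralLake-Variant1-7B is not published.
models:
  - model: BarryFutureman/WildWest-Variant3-7B
    parameters:
      weight: 0.4
  - model: BarryFutureman/NeuralTurdusVariant1-7B
    parameters:
      weight: 0.3
  - model: alnrg2arg/blockchainlabs_7B_merged_test2_4
    parameters:
      weight: 0.3
merge_method: linear
dtype: float16
```

With mergekit installed, a config like this is run with `mergekit-yaml config.yml ./output-model`, producing a merged checkpoint in the output directory.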
