rombodawg/LosslessMegaCoder-llama2-7b-mini
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Aug 13, 2023 · License: llama2 · Architecture: Transformer · Open Weights

LosslessMegaCoder-llama2-7b-mini is a 7-billion-parameter Llama 2-based language model developed by rombodawg and andreaskoepf, with a context length of 4096 tokens. It was trained on the LosslessMegaCodeTrainingV2_1m_Evol_Uncensored dataset, filtered to entries with at least 100 tokens, which makes it well suited to coding tasks. The model shows strong code-generation performance and aims to be a leading coding model within its size class.
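The token-count filter mentioned above can be sketched as follows. This is a hedged illustration only: the actual dataset preparation presumably counted tokens with a real tokenizer, while this sketch approximates token counts by whitespace splitting.

```python
def filter_by_min_tokens(entries, min_tokens=100):
    """Keep only dataset entries with at least `min_tokens` tokens.

    Token counting is approximated here by whitespace splitting; the
    real pipeline would use the model's tokenizer instead.
    """
    return [e for e in entries if len(e.split()) >= min_tokens]

# A short code snippet (7 whitespace tokens) is dropped; a 150-word
# entry passes the 100-token threshold and is kept.
entries = ["def add(a, b): return a + b", "word " * 150]
kept = filter_by_min_tokens(entries)
```

In practice a tokenizer-based count gives different (usually higher) numbers than whitespace splitting, so the threshold would be applied to the tokenizer's output rather than word counts.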
