NovoCode/Novocode7b-v2
Text generation · Concurrency cost: 1 · Model size: 7B · Quantization: FP8 · Context length: 4k · Published: Jan 23, 2024 · License: apache-2.0 · Architecture: Transformer

NovoCode/Novocode7b-v2 is a 7-billion-parameter causal language model developed by NovoCode. Built on the Mistral architecture and trained on the leet10k-alpaca dataset, it has a 4096-token context window and targets general language understanding and generation. The model reports competitive results on benchmarks such as MMLU and HellaSwag, making it suitable for applications that need robust general-purpose language capabilities.
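One practical consequence of the 4096-token context window is that prompt length and generation length share the same budget. A minimal sketch of that budgeting arithmetic, assuming a fixed 4k window (the constant and function names below are illustrative, not part of any model API):

```python
MAX_CONTEXT = 4096  # Novocode7b-v2 context length, per the card above


def budget_prompt_tokens(reserved_for_output: int, max_context: int = MAX_CONTEXT) -> int:
    """Return how many tokens remain for the prompt once output space is reserved.

    Prompt tokens + generated tokens must fit inside the context window,
    so reserving room for the reply caps the usable prompt length.
    """
    if not 0 <= reserved_for_output <= max_context:
        raise ValueError("reserved_for_output must lie within the context window")
    return max_context - reserved_for_output


# Reserving 512 tokens for the model's reply leaves 3584 for the prompt.
print(budget_prompt_tokens(512))  # -> 3584
```

In practice the model would be served or loaded through a causal-LM toolkit (for example Hugging Face transformers); the sketch above covers only the context-window arithmetic, which applies regardless of the serving stack.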
