NovusResearch/Novus-7b-tr_v1
Task: Text generation
Concurrency cost: 1
Model size: 7B
Quantization: FP8
Context length: 4k
Published: Mar 25, 2024
License: cc-by-nc-4.0
Architecture: Transformer
Weights: Open
NovusResearch/Novus-7b-tr_v1 is a 7-billion-parameter language model developed by NovusResearch, with a 4096-token context length. It targets general language understanding and generation, is optimized for efficient deployment, and is intended as a foundation for further fine-tuning and research.
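The 4096-token context window is a hard budget shared between the prompt and the generated output. A minimal sketch of keeping a request inside that budget, assuming token IDs are already available (the helper name `fit_prompt` is illustrative, not part of any NovusResearch API):

```python
CTX_LEN = 4096  # Novus-7b-tr_v1 context window, per the model card


def fit_prompt(prompt_tokens: list[int], max_new_tokens: int) -> list[int]:
    """Trim the prompt (keeping its most recent tokens) so that
    prompt length + generation budget <= CTX_LEN."""
    budget = CTX_LEN - max_new_tokens
    if budget <= 0:
        raise ValueError("max_new_tokens exceeds the context window")
    return prompt_tokens[-budget:]


# Example: a 5000-token prompt with a 512-token generation budget
tokens = list(range(5000))
trimmed = fit_prompt(tokens, max_new_tokens=512)
print(len(trimmed))  # 3584 tokens remain for the prompt
```

Left-truncation is used here because recent context usually matters most for continuation tasks; other strategies (summarizing or dropping the middle) are equally valid.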