kagelabs/KageAI-7B-v1.2
Task: Text Generation
Concurrency Cost: 1
Model Size: 7B
Quant: FP8
Ctx Length: 4k
Published: Jan 17, 2026
License: apache-2.0
Architecture: Transformer
Open Weights · Cold

KageAI-7B-v1.2 by KageLabs is a 7-billion-parameter model based on Mistral-7B-v0.3, fine-tuned with GaLore (Gradient Low-Rank Projection) for technical domains. It specializes in hardware architecture, semiconductors, and system troubleshooting, and is positioned as a high-end tech guru that provides detailed insight into complex engineering problems.
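The core idea of GaLore, the training method mentioned above, is to project each layer's gradient into a low-rank subspace (obtained via SVD), run the optimizer update there, and project the update back to full size, which cuts optimizer-state memory. The following is a minimal NumPy sketch of that projection step only; it uses plain SGD in the subspace for brevity (GaLore normally wraps Adam) and is not KageLabs' actual training code:

```python
import numpy as np

def galore_projector(grad, rank):
    # SVD of the full gradient; keep the top-`rank` left singular vectors
    # as the projection matrix P of shape (m, rank).
    U, _, _ = np.linalg.svd(grad, full_matrices=False)
    return U[:, :rank]

def galore_step(weight, grad, P, lr=1e-3):
    # Project the (m, n) gradient down to (rank, n), apply the update
    # in the low-rank space, then project back up before applying it.
    low_rank_grad = P.T @ grad       # (rank, n) — optimizer state would live at this size
    update = -lr * low_rank_grad
    return weight + P @ update       # back to (m, n)

rng = np.random.default_rng(0)
m, n, rank = 64, 32, 4
W = rng.standard_normal((m, n))
G = rng.standard_normal((m, n))
P = galore_projector(G, rank)        # refreshed every few hundred steps in practice
W_new = galore_step(W, G, P)
```

In real use the projector is recomputed periodically rather than every step, so the SVD cost is amortized; the memory saving comes from keeping optimizer moments at `(rank, n)` instead of `(m, n)`.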
