SVECTOR-CORPORATION/Theta-35-Mini
Text Generation · Concurrency Cost: 1 · Model Size: 3.1B · Quant: BF16 · Ctx Length: 32k · Published: Apr 28, 2025 · License: MIT · Architecture: Transformer · Open Weights

SVECTOR-CORPORATION/Theta-35-Mini is a compact 3.1-billion-parameter language model developed by SVECTOR, built on a Qwen2-style transformer architecture. It is trained with Group Relative Policy Optimization (GRPO) for improved alignment and efficiency. The model targets low-latency inference, making it well suited to resource-constrained and on-device applications.
