apexion-ai/Nous-1-8B
Text generation · Concurrency cost: 1 · Model size: 8B · Quantization: FP8 · Context length: 32k · License: anvdl-1.0 · Architecture: Transformer

Nous-1-8B by Apexion AI is an 8-billion-parameter instruction-tuned causal language model based on Qwen3-8B, with a 32k-token context length. It is optimized for advanced reasoning, instruction following, and high-performance deployment across multi-domain applications, and is well suited to complex problem solving, code generation, and use as a knowledge assistant.
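As a deployment sketch, the model can typically be queried through an OpenAI-compatible chat completions endpoint. The snippet below only builds the request payload; the endpoint URL and API key are hypothetical placeholders (this page does not specify them), so substitute your provider's actual values before sending the request.

```python
import json

# Hypothetical values -- replace with your provider's endpoint and key.
API_URL = "https://example-provider.invalid/v1/chat/completions"
API_KEY = "YOUR_API_KEY"

# Chat completion payload targeting Nous-1-8B.
payload = {
    "model": "apexion-ai/Nous-1-8B",
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Write a Python function that reverses a string."},
    ],
    # Completion budget; prompt plus completion must fit in the 32k context window.
    "max_tokens": 512,
    "temperature": 0.7,
}

body = json.dumps(payload)
print(body[:40])

# To actually send the request (requires the `requests` package and a live endpoint):
# import requests
# resp = requests.post(
#     API_URL,
#     headers={"Authorization": f"Bearer {API_KEY}"},
#     data=body,
#     timeout=60,
# )
# print(resp.json()["choices"][0]["message"]["content"])
```

The request/response shape follows the widely used OpenAI chat completions convention; if your provider exposes a different API, adjust the payload fields accordingly.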
