reaperdoesntknow/Symbiotic-1B
Text generation · Concurrency cost: 1 · Model size: 0.8B · Quant: BF16 · Context length: 32k · Published: May 6, 2025 · License: AFL-3.0 · Architecture: Transformer

SymbioticLM-1B by reaperdoesntknow is a 1-billion-parameter hybrid symbolic-transformer model built on a Qwen-1B backbone. It integrates a symbolic processing pipeline with persistent episodic memory for enhanced reasoning. The model is optimized for lightweight, memory-augmented symbolic inference in constrained environments, excelling at tasks such as procedural planning, mathematical modeling, and graph-based explanation generation.
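A minimal usage sketch, assuming the checkpoint is published on the Hugging Face Hub under the id `reaperdoesntknow/Symbiotic-1B` and loads with the standard `transformers` causal-LM classes (the card does not confirm the loading API, so treat the class names and BF16 dtype choice as assumptions drawn from the metadata above):

```python
# Hypothetical loading sketch -- assumes Hugging Face `transformers`
# compatibility; the model id comes from the card, everything else is
# a standard causal-LM recipe, not confirmed by the author.
MODEL_ID = "reaperdoesntknow/Symbiotic-1B"


def generate(prompt: str, max_new_tokens: int = 128) -> str:
    """Load the model in BF16 (per the card's quant field) and run generation."""
    # Imports are deferred so the module can be inspected without
    # downloading the ~1B-parameter checkpoint.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype=torch.bfloat16
    )
    inputs = tokenizer(prompt, return_tensors="pt")
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output[0], skip_special_tokens=True)
```

Note that the 32k context length advertised above applies to the tokenized prompt plus generated tokens; longer inputs would need truncation before the `generate` call.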
