reaperdoesntknow/Symbiotic-1B
SymbioticLM-1B by reaperdoesntknow is a 1-billion-parameter hybrid symbolic-transformer model based on Qwen-1B, designed for memory-augmented reasoning. It integrates a rotary transformer backbone with a symbolic processing pipeline and a persistent episodic memory of 2048 symbolic vectors. The model targets symbolic reasoning, procedural planning, and mathematical modeling in resource-constrained environments, making it suitable for CPU and embedded inference.
Overview
SymbioticLM-1B is a compact, 1-billion-parameter hybrid model developed by reaperdoesntknow, fusing a Qwen-1B rotary transformer with a symbolic processing pipeline and persistent episodic memory. It is engineered for efficient reasoning in resource-limited environments such as CPUs and embedded systems.
Key Capabilities & Architecture
- Hybrid Design: Combines a Qwen-1B transformer backbone with a symbolic engine for enhanced reasoning.
- Symbolic Processing: Features advanced symbolic modules like ThoughtDynamicsLNN and CrystallineProcessor (DNAConv GNN).
- Memory-Augmented: Utilizes 2048 symbolic vectors with entropic and contextual retrieval for persistent memory.
- Dream Mode: Includes a "Dream Mode" for symbolic simulation via a ThoughtGenerator.
- Discrepancy Calculus Foundation: Developed under the Discrepancy Calculus (DISC) framework, which treats training singularities as structural signals for learning.
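The card does not specify how "entropic and contextual retrieval" over the 2048-slot memory is computed. As a rough illustration only, the sketch below blends cosine similarity to a query with a normalized per-slot entropy prior; the blend weight `alpha`, the toy embedding dimension, and the scoring itself are assumptions, not the model's actual retrieval code.

```python
import numpy as np

rng = np.random.default_rng(0)

MEMORY_SLOTS = 2048  # size of the persistent episodic memory
DIM = 64             # toy embedding dimension for illustration

# Toy memory bank: each slot holds one symbolic vector.
memory = rng.normal(size=(MEMORY_SLOTS, DIM))

def entropy_scores(mem: np.ndarray) -> np.ndarray:
    """Per-slot Shannon entropy of the softmax-normalized components."""
    p = np.exp(mem - mem.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)
    return -(p * np.log(p + 1e-12)).sum(axis=1)

def retrieve(query: np.ndarray, k: int = 4, alpha: float = 0.7) -> np.ndarray:
    """Blend contextual (cosine) similarity with an entropic prior."""
    sim = memory @ query / (
        np.linalg.norm(memory, axis=1) * np.linalg.norm(query) + 1e-12
    )
    ent = entropy_scores(memory)
    ent = (ent - ent.min()) / (ent.max() - ent.min() + 1e-12)  # scale to [0, 1]
    score = alpha * sim + (1 - alpha) * ent
    return np.argsort(score)[-k:][::-1]  # indices of the top-k slots

query = rng.normal(size=DIM)
top = retrieve(query)
```

Higher `alpha` favors contextual match over the entropic prior; the real model presumably tunes this trade-off internally.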
Intended Use Cases
- CPU-optimized symbolic inference.
- Educational agents requiring memory and logical processing.
- Graph-based explanation generation.
- Procedural planning, mathematical modeling, and small-scale code generation.
Limitations
- Less fluent in free-form language than larger, purely generative models.
- Symbolic accuracy benefits from memory curation; stale or noisy memory entries can degrade retrieval quality.
- Complex queries in "Dream Mode" may require warm-up or symbolic seeding.
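The card does not describe a curation API. Purely as a sketch, assuming a runtime keeps per-slot retrieval counts (the `hits` bookkeeping below is hypothetical), curation could mean keeping the most frequently retrieved slots and dropping the rest:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy memory bank of 2048 symbolic vectors, plus a hypothetical per-slot
# retrieval counter that a runtime would increment on each lookup.
memory = rng.normal(size=(2048, 64))
hits = rng.poisson(3, size=2048)

def curate(mem: np.ndarray, hits: np.ndarray, keep: int = 1024):
    """Keep the `keep` most frequently retrieved slots, most-used first."""
    order = np.argsort(hits)[::-1][:keep]
    return mem[order], hits[order]

memory, hits = curate(memory, hits)
```

Any real curation policy (recency, entropy, manual seeding) would replace the simple hit-count heuristic shown here.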