ReXeeD/TASX-Cmd-0.5B
ReXeeD/TASX-Cmd-0.5B is a 0.5-billion-parameter language model, fine-tuned from Qwen2.5-0.5B and designed specifically for robotics. It translates natural language, including slang and typos, into strict JSON command sequences for ROS2, SLAM, and physical robot control. Optimized for local execution on edge hardware, the model excels at generating precise, execution-ready commands and interpreting complex human intent for robotic actions. It features a 32,768-token context length, enabling robust interpretation of detailed instructions.
TASX-Command-0.5B: Robotics Control via Natural Language
TASX-Command-0.5B is a highly specialized, lightweight language model (0.5 billion parameters) built upon the Qwen2.5-0.5B base. Its core function is to convert diverse natural language inputs into precise, machine-executable JSON command sequences for robotics applications, including ROS2, SLAM, and direct physical robot control. This model is engineered for efficiency, allowing it to run locally on edge hardware like a Raspberry Pi using llama.cpp.
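Because the model emits strict JSON arrays, a thin local harness only needs to run inference and parse the result. The sketch below shows one way to do this, assuming a GGUF export consumed via the llama-cpp-python bindings; the model path, prompt, and the `nav2_goto` command name are illustrative assumptions, not the model's documented interface.

```python
import json

def extract_commands(raw: str):
    """Pull the first JSON array out of raw model output and parse it."""
    start = raw.find("[")
    end = raw.rfind("]")
    if start == -1 or end < start:
        raise ValueError("no JSON array in model output")
    return json.loads(raw[start:end + 1])

# Hypothetical local inference via llama-cpp-python (needs the GGUF file on disk):
# from llama_cpp import Llama
# llm = Llama(model_path="tasx-cmd-0.5b.gguf", n_ctx=32768)  # path is illustrative
# out = llm.create_chat_completion(
#     messages=[{"role": "user", "content": "go to professor xavier's office"}],
#     max_tokens=256,
# )
# commands = extract_commands(out["choices"][0]["message"]["content"])

# Canned response standing in for model output (command name is an assumption):
sample = '[{"action": "nav2_goto", "target": "professor_xavier_office"}]'
print(extract_commands(sample)[0]["target"])  # → professor_xavier_office
```

Keeping the JSON extraction separate from inference makes the same harness reusable whether the model runs through llama.cpp, a ROS2 node, or a test fixture.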
Key Capabilities
- Strict JSON Output: Guarantees valid JSON arrays as output, avoiding conversational filler.
- Robust Language Understanding: Effectively processes natural language inputs containing slang, typos, and complex phrasing, mapping them to accurate commands and numerical values.
- Dynamic Location Extraction: Automatically converts spoken location names (e.g., "Professor Xavier's Office") into standardized `snake_case` format (e.g., `professor_xavier_office`).
- Physical Constraint Logic (Macros): Incorporates advanced robotic logic to generate implicit macro sequences. For instance, a command to "fetch my laptop" will automatically include necessary posture adjustments like `sit` and `stand` actions.
- Supported Actions: Trained to output a strict set of 20 commands across categories such as Teleop (movement, speed, stops), Nav2 (autonomous navigation to waypoints), and Stunts (posture and tricks).
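For validating the model's waypoint names against a known map, the location normalization described above can be approximated client-side. This regex-based sketch is an assumption about the mapping (lowercase, drop the possessive `'s`, collapse everything else to underscores), not the model's internal logic.

```python
import re

def to_snake_case(location: str) -> str:
    """Approximate the model's location normalization (assumed behavior)."""
    s = re.sub(r"'s\b", "", location.lower())   # drop possessive 's
    s = re.sub(r"[^a-z0-9]+", "_", s)           # non-alphanumerics -> underscore
    return s.strip("_")

print(to_snake_case("Professor Xavier's Office"))  # → professor_xavier_office
```

A check like `to_snake_case(spoken_name) in known_waypoints` can then reject navigation targets the map does not contain before they reach Nav2.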
Good For
- Edge Computing Robotics: Ideal for deploying on resource-constrained robotic platforms requiring local, real-time natural language processing.
- Human-Robot Interaction: Enables intuitive control of robots through spoken or typed commands, even with imperfect human input.
- Automated Task Execution: Facilitates the creation of complex, multi-step robotic behaviors from simple, high-level instructions.
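The implicit macro behavior behind such multi-step execution can be mimicked downstream with a simple expansion table. This is a minimal sketch of the idea only: `sit` and `stand` come from the capabilities above, but `fetch_object` and `grab` are hypothetical names, not part of the model's 20-command set.

```python
# Hypothetical macro table: a high-level intent expands into a
# posture-aware command sequence ("fetch_object" and "grab" are illustrative).
MACROS = {
    "fetch_object": ["sit", "grab", "stand"],
}

def expand(commands):
    """Replace any macro command with its sequence; pass others through."""
    out = []
    for cmd in commands:
        out.extend(MACROS.get(cmd, [cmd]))
    return out

print(expand(["nav2_goto", "fetch_object"]))
# → ['nav2_goto', 'sit', 'grab', 'stand']
```

In practice the model emits the expanded sequence itself; a table like this is mainly useful for writing tests that assert posture commands bracket manipulation steps.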