Arbitrarily Applicable Same/Opposite Relational Responding with NARS
By: Robert Johansson, Patrick Hammer, Tony Lofthouse
Potential Business Impact:
Teaches a computer to connect ideas the way humans do.
Same/opposite relational responding, a fundamental aspect of human symbolic cognition, allows the flexible generalization of stimulus relationships based on minimal experience. In this study, we demonstrate the emergence of "arbitrarily applicable" same/opposite relational responding within the Non-Axiomatic Reasoning System (NARS), a computational cognitive architecture designed for adaptive reasoning under uncertainty. Specifically, we extend NARS with an implementation of "acquired relations", enabling the system to explicitly derive both symmetric (mutual entailment) and novel relational combinations (combinatorial entailment) from minimal explicit training in a contextually controlled matching-to-sample (MTS) procedure. Experimental results show that NARS rapidly internalizes explicitly trained relational rules and robustly demonstrates derived relational generalizations based on arbitrary contextual cues. Importantly, derived relational responding in critical test phases inherently combines both mutual and combinatorial entailments, such as deriving same-relations from multiple explicitly trained opposite-relations. Internal confidence metrics illustrate strong internalization of these relational principles, closely paralleling phenomena observed in human relational learning experiments. Our findings underscore the potential for integrating nuanced relational learning mechanisms inspired by learning psychology into artificial general intelligence frameworks, explicitly highlighting the arbitrary and context-sensitive relational capabilities modeled within NARS.
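The two entailment mechanisms named in the abstract can be illustrated with a minimal sketch. This is not the authors' NARS implementation (which reasons with Narsese statements and confidence values); it is only a toy closure over a relation table, where the relation names, function, and training pairs are illustrative assumptions.

```python
# Toy sketch of derived same/opposite relational responding:
# - mutual entailment: same/opposite relations are symmetric
# - combinatorial entailment: relations compose through a shared term
#   (same+same -> same, same+opposite -> opposite, opposite+opposite -> same)

def derive(relations):
    """Close a dict {(x, y): 'same' | 'opposite'} under mutual and
    combinatorial entailment; returns the enlarged dict."""
    derived = dict(relations)
    changed = True
    while changed:
        changed = False
        # Mutual entailment: if x rel y, then y rel x.
        for (x, y), r in list(derived.items()):
            if (y, x) not in derived:
                derived[(y, x)] = r
                changed = True
        # Combinatorial entailment: compose pairs sharing a middle term.
        for (x, y), r1 in list(derived.items()):
            for (y2, z), r2 in list(derived.items()):
                if y == y2 and x != z and (x, z) not in derived:
                    derived[(x, z)] = "same" if r1 == r2 else "opposite"
                    changed = True
    return derived

# Hypothetical explicit training, as in a matching-to-sample procedure:
trained = {("A", "B"): "same", ("A", "C"): "opposite"}
rels = derive(trained)
# B-C was never explicitly trained, yet it is derived as "opposite"
# by composing B-A ("same") with A-C ("opposite").
```

Unlike this crisp closure, NARS attaches frequency/confidence values to each derived statement, which is what the abstract's "internal confidence metrics" track across training and test phases.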
Similar Papers
- Modeling Arbitrarily Applicable Relational Responding with the Non-Axiomatic Reasoning System: A Machine Psychology Approach (Artificial Intelligence): AI learns to understand and use language like humans.
- Parallel Thinking, Sequential Answering: Bridging NAR and AR for Efficient Reasoning (Artificial Intelligence): Makes computers solve hard problems much faster.
- Enhancing Large Language Models with Neurosymbolic Reasoning for Multilingual Tasks (Computation and Language): Helps computers understand and connect many facts.