TY - GEN
T1 - Controlling combinatorial explosion in inference via synergy with nonlinear-dynamical attention allocation
AU - Goertzel, Ben
AU - Belachew, Misgana Bayetta
AU - Iklé, Matthew
AU - Yu, Gino Tu
PY - 2016/1/1
AB - One of the core principles of the OpenCog AGI design, “cognitive synergy”, is exemplified by the synergy between logical reasoning and attention allocation. This synergy centers on a feedback loop in which nonlinear-dynamical attention spreading guides logical inference control, and inference directs attention to the surprising new conclusions it has created. In this paper we report computational experiments in which this synergy is demonstrated in practice, in the context of a very simple logical inference problem. More specifically: First-order probabilistic inference generates conclusions, and its inference steps are pruned via the “Short-Term Importance” (STI) attention values associated with the logical Atoms it manipulates. As inference generates conclusions, information theory is used to assess their surprisingness, and the STI values of the Atoms representing those conclusions are updated accordingly. The result of this feedback is that meaningful conclusions are drawn after far fewer inference steps than would be needed without attention allocation dynamics and the associated feedback. This simple example demonstrates a cognitive dynamic that is hypothesized to be very broadly valuable for general intelligence.
UR - http://www.scopus.com/inward/record.url?scp=84977503282&partnerID=8YFLogxK
DO - 10.1007/978-3-319-41649-6_34
M3 - Conference article published in proceedings or book
SN - 9783319416489
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 334
EP - 343
BT - Artificial General Intelligence - 9th International Conference, AGI 2016, Proceedings
PB - Springer Verlag
T2 - 9th International Conference on Artificial General Intelligence, AGI 2016
Y2 - 16 July 2016 through 19 July 2016
ER -