Guiding symbolic natural language grammar induction via transformer-based sequence probabilities

Ben Goertzel, Andrés Suárez-Madrigal, Gino Yu

Research output: Chapter in book / Conference proceeding › Conference article published in proceeding or book › Academic research › Peer-reviewed

2 Citations (Scopus)


A novel approach to automated learning of syntactic rules governing natural languages is proposed, based on using probabilities assigned to sentences (and potentially longer word sequences) by transformer neural network language models to guide symbolic learning processes like clustering and rule induction. This method exploits the learned linguistic knowledge in transformers, without any reference to their inner representations; hence, the technique is readily adaptable to the continuous appearance of more powerful language models. We show a proof-of-concept example of our proposed technique, using it to guide unsupervised symbolic link-grammar induction methods drawn from our prior research.
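The abstract's core mechanism, ranking candidate word sequences by language-model probability and feeding that signal into symbolic induction, can be sketched as follows. As a minimal illustration, a toy add-one-smoothed bigram model stands in for the transformer scorer (the paper uses models such as BERT); the corpus, function names, and scoring scheme here are assumptions for demonstration, not the authors' implementation.

```python
import math
from collections import Counter

# Tiny stand-in corpus; in the paper's setting the scorer would be a
# pretrained transformer LM, not a bigram model trained on this data.
corpus = [
    "the cat sat on the mat",
    "the dog sat on the rug",
    "a cat saw the dog",
]

bigrams = Counter()
unigrams = Counter()
for sent in corpus:
    toks = ["<s>"] + sent.split() + ["</s>"]
    unigrams.update(toks[:-1])           # context counts (exclude </s>)
    bigrams.update(zip(toks[:-1], toks[1:]))

vocab_size = len(set(unigrams)) + 1      # +1 for unseen tokens

def sequence_logprob(sentence: str) -> float:
    """Log-probability of a word sequence under the smoothed bigram model.
    This plays the role of the transformer-assigned sequence probability."""
    toks = ["<s>"] + sentence.split() + ["</s>"]
    return sum(
        math.log((bigrams[(a, b)] + 1) / (unigrams[a] + vocab_size))
        for a, b in zip(toks[:-1], toks[1:])
    )

def best_variant(variants):
    """Pick the word sequence the LM considers most probable -- the kind of
    signal used to guide symbolic clustering and rule induction."""
    return max(variants, key=sequence_logprob)

# A grammatical order outscores a scrambled alternative:
print(best_variant(["the cat sat on the mat", "cat the mat sat on the"]))
# → the cat sat on the mat
```

A grammar-induction loop would generate candidate rules, realize the word orders each rule licenses, and keep the rules whose sentences the language model scores highly; swapping in a stronger transformer only changes `sequence_logprob`, which is why the technique adapts to newer models.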

Original language: English
Title of host publication: Artificial General Intelligence - 13th International Conference, AGI 2020, Proceedings
Editors: Ben Goertzel, Alexey Potapov, Aleksandr I. Panov, Roman Yampolskiy
Publisher: Springer Nature Switzerland AG
Number of pages: 11
ISBN (Print): 9783030521516
Publication status: Published - Jul 2020
Event: 13th International Conference on Artificial General Intelligence, AGI 2020 - St. Petersburg, Russian Federation
Duration: 16 Sep 2020 - 19 Sep 2020

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 12177 LNAI
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349


Conference: 13th International Conference on Artificial General Intelligence, AGI 2020
Country/Territory: Russian Federation
City: St. Petersburg


Keywords

  • BERT
  • Transformers
  • Unsupervised grammar induction

ASJC Scopus subject areas

  • Theoretical Computer Science
  • Computer Science (all)
