A multi-level text representation model within background knowledge based on human cognitive process for big data analysis

X. Wei, J. Zhang, D.D. Zeng, Qing Li

Research output: Journal article › Academic research › peer-review

4 Citations (Scopus)


© 2016, Springer Science+Business Media New York. Text representation is among the most fundamental tasks in text comprehension, processing, and search. Various approaches have been proposed to mine the semantics in texts and then represent them. However, most of them focus only on mining semantics from the text itself, while few take background knowledge into consideration, even though it is very important to text understanding. In this paper, on the basis of the human cognitive process, we propose a multi-level text representation model within background knowledge, called TRMBK. It is composed of three levels: machine surface code, machine text base, and machine situational model. All of them can be constructed automatically to acquire semantics both inside and outside the texts. Simultaneously, we also propose a method to establish background knowledge automatically and to support the current text comprehension. Finally, experiments and comparisons are presented to show the better performance of TRMBK.
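The abstract's three-level structure (surface code, text base, situational model) can be illustrated with a minimal sketch. All class names, fields, and heuristics below are hypothetical illustrations of the general idea, not the paper's actual TRMBK implementation:

```python
from dataclasses import dataclass

# Hypothetical sketch of the three levels named in the abstract;
# names and extraction heuristics are illustrative only.

@dataclass
class SurfaceCode:
    # Level 1: the literal wording of the text.
    tokens: list

@dataclass
class TextBase:
    # Level 2: propositions mined from the text itself,
    # here as naive (subject, predicate, object) triples.
    propositions: list

@dataclass
class SituationalModel:
    # Level 3: the text enriched with background knowledge,
    # mapping mentions to known concepts outside the text.
    concepts: dict

def build_representation(text, background):
    """Build all three levels for a toy sentence (naive heuristics)."""
    tokens = text.split()
    # Toy proposition extraction: assume "subject verb object" word order.
    props = [(tokens[0], tokens[1], tokens[2])] if len(tokens) >= 3 else []
    # Link tokens to background knowledge when an entry exists.
    concepts = {t: background[t] for t in tokens if t in background}
    return SurfaceCode(tokens), TextBase(props), SituationalModel(concepts)

background = {"Paris": "capital of France"}
surface, base, situation = build_representation("Paris hosts tourists", background)
print(situation.concepts)  # {'Paris': 'capital of France'}
```

The point of the layering is that the situational model can hold semantics (here, what "Paris" is) that never appear in the text itself, which is exactly the role the abstract assigns to background knowledge.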
Original language: English
Pages (from-to): 1475-1487
Number of pages: 13
Journal: Cluster Computing
Issue number: 3
Publication status: Published - 1 Sep 2016
Externally published: Yes


Keywords

  • Background knowledge
  • Human cognitive process
  • Semantics
  • Situational model
  • Surface code
  • Text base
  • Text comprehension
  • Text representation

ASJC Scopus subject areas

  • Software
  • Computer Networks and Communications
