Scalable TSK fuzzy modeling for very large datasets using minimal-enclosing-ball approximation

Zhaohong Deng, Kup Sze Choi, Fu Lai Korris Chung, Shitong Wang

Research output: Journal article (peer-reviewed)

101 Citations (Scopus)

Abstract

To overcome the difficulty of Takagi-Sugeno-Kang (TSK) fuzzy modeling for large datasets, scalable TSK (STSK) fuzzy-model training is investigated in this study based on the core-set-based minimal-enclosing-ball (MEB) approximation technique. An L2-norm penalty-based ε-insensitive criterion is first proposed for TSK-model training, and it is shown that TSK fuzzy-model training under this criterion can be equivalently expressed as a center-constrained MEB problem. With this finding, a scalable fuzzy-model-training algorithm for large or very large datasets, called STSK, is then derived using the core-set-based MEB-approximation technique. The proposed algorithm has two distinctive advantages over classical TSK fuzzy-model-training algorithms: its maximum space complexity during training does not depend on the size of the training dataset, and its maximum time complexity is linear in the size of the training dataset, as confirmed by extensive experiments on both synthetic and real-world regression datasets.
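The scalability claim rests on the core-set MEB-approximation primitive: a small subset of points (the core set) whose enclosing ball, slightly expanded, covers the whole dataset. As an illustration only (this is the classical Bădoiu-Clarkson (1+ε)-approximation, not the paper's center-constrained STSK formulation, and all function and variable names here are our own), the sketch below shows why the core-set size, and hence the extra memory, depends only on ε and not on the number of training samples:

```python
import numpy as np

def meb_center_approx(points, eps=0.1):
    """Badoiu-Clarkson (1+eps)-approximation of the minimal-enclosing-ball
    center. Each iteration adds only the single farthest point to the core
    set, so the core set has at most ceil(1/eps^2) + 1 members regardless
    of how many points the dataset contains."""
    n_iter = int(np.ceil(1.0 / eps**2))
    c = points[0].astype(float)              # start from an arbitrary point
    core = {0}
    for i in range(1, n_iter + 1):
        # the point farthest from the current center joins the core set
        d = np.linalg.norm(points - c, axis=1)
        j = int(np.argmax(d))
        core.add(j)
        # move the center a 1/(i+1) step toward the new core point
        c = c + (points[j] - c) / (i + 1)
    radius = float(np.linalg.norm(points - c, axis=1).max())
    return c, radius, sorted(core)
```

For the three points (0,0), (2,0), (1,1), whose exact MEB has center (1,0) and radius 1, the approximate radius is guaranteed to stay within a factor (1+ε) of optimal. The linear time complexity mirrors the abstract's claim: each of the O(1/ε²) iterations makes one pass over the data.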
Original language: English
Article number: 5629439
Pages (from-to): 210-226
Number of pages: 17
Journal: IEEE Transactions on Fuzzy Systems
Volume: 19
Issue number: 2
DOIs
Publication status: Published - 1 Apr 2011

Keywords

  • ε-insensitive training
  • Core set
  • core vector machine (CVM)
  • minimal-enclosing-ball (MEB) approximation
  • Takagi-Sugeno-Kang (TSK) fuzzy modeling
  • very large datasets

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Computational Theory and Mathematics
  • Artificial Intelligence
  • Applied Mathematics
