Multi-view learning for emotion detection in code-switching texts

Yat Mei Lee, Zhongqing Wang

Research output: Chapter in book / Conference proceeding › Conference article published in proceeding or book › Academic research › peer-review

10 Citations (Scopus)

Abstract

Previous research has placed emphasis on analyzing emotions in monolingual text, neglecting the fact that emotions are often expressed in bilingual or code-switching posts on social media. Traditional methods for emotion identification and classification fail to accommodate code-switching content. To address this challenge, in this paper, we propose a multi-view learning framework to learn and detect emotions through both monolingual and bilingual views. In particular, the monolingual views are extracted from the monolingual text separately, and the bilingual view is constructed from both the monolingual and translated text collectively. Empirical studies demonstrate the effectiveness of our proposed approach in detecting emotions in code-switching texts.
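
The paper itself does not publish code. As a rough, hypothetical illustration of the multi-view idea sketched in the abstract, the following minimal Python sketch (assuming scikit-learn, toy posts, toy labels, and machine-translated text as stand-ins for the authors' data and models) trains one classifier per view and fuses their predictions by averaging probabilities; the actual framework in the paper may differ in features, views, and fusion strategy.

```python
# Illustrative sketch only (not the authors' implementation):
# late fusion of a monolingual view (original code-switched post)
# and a bilingual view (post concatenated with an assumed translation).
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# Toy code-switched posts, assumed translations, and toy emotion labels.
posts = ["I'm so happy 今天超开心", "这个 deadline 真的让我 stressed"]
translations = ["I'm so happy, so happy today", "this deadline really stresses me out"]
labels = [1, 0]  # 1 = happiness, 0 = sadness (illustrative only)

# Monolingual view: features from the original posts alone.
mono_vec = TfidfVectorizer()
X_mono = mono_vec.fit_transform(posts)

# Bilingual view: original post concatenated with its translation.
bi_vec = TfidfVectorizer()
X_bi = bi_vec.fit_transform([p + " " + t for p, t in zip(posts, translations)])

# One classifier per view.
clf_mono = LogisticRegression().fit(X_mono, labels)
clf_bi = LogisticRegression().fit(X_bi, labels)

def predict_emotion(post, translation):
    """Fuse the two views by averaging predicted class probabilities."""
    p_mono = clf_mono.predict_proba(mono_vec.transform([post]))
    p_bi = clf_bi.predict_proba(bi_vec.transform([post + " " + translation]))
    return np.argmax((p_mono + p_bi) / 2, axis=1)
```
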
Original language: English
Title of host publication: Proceedings of 2015 International Conference on Asian Language Processing, IALP 2015
Publisher: IEEE
Pages: 90-93
Number of pages: 4
ISBN (Electronic): 9781467395953
DOIs
Publication status: Published - 12 Apr 2016
Event: International Conference on Asian Language Processing, IALP 2015 - Suzhou, China
Duration: 24 Oct 2015 - 25 Oct 2015

Conference

Conference: International Conference on Asian Language Processing, IALP 2015
Country/Territory: China
City: Suzhou
Period: 24/10/15 - 25/10/15

Keywords

  • code-switching
  • emotion analysis
  • multi-view learning

ASJC Scopus subject areas

  • Linguistics and Language
  • Artificial Intelligence
  • Computer Vision and Pattern Recognition
  • Signal Processing
