Distilling Privileged Knowledge for Anomalous Event Detection From Weakly Labeled Videos

Tianshan Liu, Kin Man Lam, Jun Kong

Research output: Journal article publication › Journal article › Academic research › peer-review

12 Citations (Scopus)

Abstract

Weakly supervised video anomaly detection (WS-VAD) aims to identify the snippets involving anomalous events in long untrimmed videos, using solely video-level binary labels. A typical paradigm among existing WS-VAD methods is to employ multiple modalities as inputs, e.g., RGB, optical flow, and audio, as they provide sufficiently discriminative clues that are robust to the diverse and complicated real-world scenes. However, such a pipeline relies heavily on the availability of multiple modalities and is computationally expensive and storage-demanding when processing long sequences, which limits its use in some applications. To address this dilemma, we propose a privileged knowledge distillation (KD) framework dedicated to the WS-VAD task, which retains the benefits of exploiting additional modalities while avoiding the need for multimodal data in the inference phase. We argue that the performance of the privileged KD framework mainly depends on two factors: 1) the effectiveness of the multimodal teacher network and 2) the completeness of the useful information transfer. To obtain a reliable teacher network, we propose a cross-modal interactive learning strategy and an anomaly-normal discrimination loss, which aim to learn task-specific cross-modal features and to encourage the separability of anomalous and normal representations, respectively. Furthermore, we design both representation- and logits-level distillation loss functions, which force the unimodal student network to distill abundant privileged knowledge from the well-trained multimodal teacher network in a snippet-to-video fashion. Extensive experimental results on three public benchmarks demonstrate that the proposed privileged KD framework can train a lightweight yet effective detector for localizing anomalous events under the supervision of video-level annotations.
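To make the distillation idea in the abstract concrete, below is a minimal, hypothetical sketch (in PyTorch, not the authors' released code) of how a representation-level term and a logits-level term might be combined so that an RGB-only student mimics a frozen multimodal teacher at the snippet level. All names, tensor shapes, and hyperparameters (e.g., `temperature`, `alpha`) are illustrative assumptions rather than details taken from the paper.

```python
# Illustrative sketch of combining representation- and logits-level
# distillation losses for privileged KD; not the paper's implementation.
import torch
import torch.nn.functional as F


def privileged_kd_loss(student_feat, teacher_feat, student_scores, teacher_scores,
                       temperature=2.0, alpha=0.5):
    """Snippet-level privileged KD: feature mimicry + softened score matching.

    student_feat / teacher_feat: (B, T, D) per-snippet features.
    student_scores / teacher_scores: (B, T) per-snippet anomaly scores.
    """
    # Representation-level: align student features with the frozen teacher's.
    repr_loss = F.mse_loss(student_feat, teacher_feat.detach())

    # Logits-level: match temperature-softened score distributions over the
    # snippets of each video (soft targets provided by the teacher).
    t_soft = F.softmax(teacher_scores.detach() / temperature, dim=-1)
    s_log_soft = F.log_softmax(student_scores / temperature, dim=-1)
    logit_loss = F.kl_div(s_log_soft, t_soft, reduction="batchmean") * temperature ** 2

    return alpha * repr_loss + (1.0 - alpha) * logit_loss


if __name__ == "__main__":
    # Dummy tensors: batch of 2 videos, 32 snippets each, 512-D features.
    s_feat, t_feat = torch.randn(2, 32, 512), torch.randn(2, 32, 512)
    s_scores, t_scores = torch.randn(2, 32), torch.randn(2, 32)
    print(privileged_kd_loss(s_feat, t_feat, s_scores, t_scores))
```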

Original language: English
Article number: 10098140
Pages (from-to): 1-15
Number of pages: 15
Journal: IEEE Transactions on Neural Networks and Learning Systems
DOIs
Publication status: Published - Apr 2023

Keywords

  • Annotations
  • Anomaly detection
  • Feature extraction
  • Knowledge engineering
  • Privileged knowledge distillation (KD)
  • Task analysis
  • teacher–student model
  • Training
  • video anomaly detection (VAD)
  • Videos
  • weakly supervised learning

ASJC Scopus subject areas

  • Software
  • Computer Science Applications
  • Computer Networks and Communications
  • Artificial Intelligence

