PhotoHelper: Portrait photographing guidance via deep feature retrieval and fusion

Nan Jiang, Bin Sheng, Ping Li, Tong-Yee Lee

Research output: Journal article › Academic research › peer-review

17 Citations (Scopus)


We introduce PhotoHelper, a photographing guidance approach that helps amateur photographers improve the quality of their portrait photos using deep feature retrieval and fusion. Our model integrates empirical aesthetic rules, traditional machine learning algorithms, and deep neural networks to extract complementary color and spatial features. With these features, we build a modified random forest over a structured photograph collection to identify photo types. We also define a composition matching score that measures the similarity between a given photo and a reference photo. Combining these components yields a one-stop deep portrait photographing guidance system that provides users with professional reference photographs similar to the current scene and automatically generates spatial composition guidance from the user-selected reference photo. Experiments and evaluations show that the aesthetic quality of portrait photos is significantly improved by our composition guidance.
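The abstract does not specify how the composition matching score is computed. As an illustrative sketch only, assuming fused color and spatial features are represented as fixed-length vectors, a similarity-based retrieval step could look like the following (the function names and the use of cosine similarity are assumptions, not the paper's method):

```python
import numpy as np

def composition_matching_score(photo_feat, ref_feat):
    """Hypothetical matching score: cosine similarity between
    fused (color + spatial) feature vectors of two photos."""
    a = np.asarray(photo_feat, dtype=float)
    b = np.asarray(ref_feat, dtype=float)
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    if denom == 0.0:
        return 0.0  # degenerate (all-zero) feature vector
    return float(np.dot(a, b) / denom)

def rank_references(photo_feat, ref_feats):
    """Rank reference photos by similarity to the current scene,
    most similar first; returns indices into ref_feats."""
    scores = [composition_matching_score(photo_feat, f) for f in ref_feats]
    return sorted(range(len(ref_feats)), key=lambda i: scores[i], reverse=True)
```

In such a scheme, the top-ranked references would be shown to the user, and the selected reference would then drive the spatial composition guidance described in the paper.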

Original language: English
Pages (from-to): 1-14
Number of pages: 14
Journal: IEEE Transactions on Multimedia
Publication status: Accepted/In press - Jan 2022


Keywords

  • aesthetic assessment
  • Deep feature fusion
  • Deep learning
  • Feature extraction
  • Image color analysis
  • image retrieval
  • Neural networks
  • photographing guidance
  • Real-time systems
  • spatial composition rule
  • Task analysis
  • Visualization

ASJC Scopus subject areas

  • Signal Processing
  • Media Technology
  • Computer Science Applications
  • Electrical and Electronic Engineering


