Developing a toolkit for prototyping machine learning-empowered products: The design and evaluation of ml-rapid

Lingyun Sun, Zhibin Zhou, Wenqi Wu, Yuyang Zhang, Rui Zhang, Wei Xiang

Research output: Journal article publication › Journal article › Academic research › peer-review

6 Citations (Scopus)


Machine learning (ML) and design jointly support the development of intelligent products, making ML an emerging technology that needs to be better understood in design practice. However, the unusual attributes of ML and the changes it brings to the prototyping process often prevent designers from innovating continuously. We therefore invited designers into a participatory design process and developed ML-Rapid, an easy-to-use and flexible ML prototyping toolkit. ML-Rapid helps designers rapidly empower their physical prototypes with ML by invoking simple code while exploring a wider range of design possibilities. We also proposed a method for applying the toolkit within the design process to promote meaningful innovation opportunities. We evaluated our work in a project called Design for Information Product. The evaluation showed that designers who were new to ML programming increased their understanding of ML after participating in the project, and that ML-Rapid lowered the barrier to ML for designers by allowing them to explore design possibilities throughout the main steps of the ML process.

Original language: English
Pages (from-to): 35-50
Number of pages: 16
Journal: International Journal of Design
Issue number: 2
Publication status: Published - Aug 2020
Externally published: Yes


Keywords

  • Design
  • Machine Learning
  • Prototyping
  • Toolkit

ASJC Scopus subject areas

  • Visual Arts and Performing Arts
  • Arts and Humanities (miscellaneous)
  • General Social Sciences
  • General Engineering
  • Computer Science Applications
