EdgeTB: a Hybrid Testbed for Distributed Machine Learning at the Edge with High Fidelity

Lei Yang, Fulin Wen, Jiannong Cao, Zhenyu Wang

Research output: Journal article (peer-reviewed)

1 Citation (Scopus)


Distributed Machine Learning (DML) at the edge has become an essential topic for providing low-latency intelligence near the data sources. However, both the development and testing of DMLs lack sufficient support. Reusable libraries that abstract the general functionalities of DMLs are needed for rapid development. Moreover, existing physical testbeds are usually small and lack network flexibility, while virtual testbeds such as simulators and emulators lack fidelity. This paper proposes EdgeTB, a novel hybrid testbed that provides numerous emulated nodes to generate large-scale, network-flexible test environments while incorporating physical nodes to guarantee fidelity. EdgeTB manages physical and emulated nodes uniformly and supports arbitrary network topologies between nodes through dynamic configurations. Importantly, we propose Role-oriented development to support the rapid development of DMLs. Through case studies and experiments, we demonstrate that EdgeTB provides convenience for efficiently developing and testing DMLs in various structures with high fidelity and scalability.
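To illustrate the idea of Role-oriented development described in the abstract, the sketch below shows how DML logic might be split into reusable roles (a trainer and an aggregator) and wired together from a declarative topology mixing emulated and physical nodes. This is a minimal hypothetical sketch: all class names, fields, and the topology format are assumptions for illustration, not EdgeTB's actual API.

```python
# Hypothetical sketch of Role-oriented development: DML functionality is
# abstracted into reusable roles, and a testbed could instantiate them on
# physical or emulated nodes according to a declarative topology.

class Trainer:
    def __init__(self, name, weights):
        self.name = name
        self.weights = weights  # local model parameters

    def local_update(self):
        # Stand-in for one round of local training on this node's data.
        return [w + 0.1 for w in self.weights]

class Aggregator:
    def aggregate(self, updates):
        # Federated averaging over the trainers' parameter vectors.
        n = len(updates)
        return [sum(ws) / n for ws in zip(*updates)]

# A dynamic configuration like the one the paper describes might declare
# which nodes are physical and which are emulated, and who talks to whom.
topology = {
    "aggregator": {"kind": "physical", "peers": ["t1", "t2", "t3"]},
    "t1": {"kind": "emulated"},
    "t2": {"kind": "emulated"},
    "t3": {"kind": "physical"},
}

trainers = [Trainer(n, [0.0, 1.0]) for n in topology["aggregator"]["peers"]]
agg = Aggregator()
global_weights = agg.aggregate([t.local_update() for t in trainers])
print(global_weights)  # averaged parameters after one round
```

Because the roles only encode DML behavior, the same Trainer and Aggregator classes could be deployed unchanged whether a node is physical or emulated; only the topology entry differs.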
Original language: English
Pages (from-to): 2540-2553
Number of pages: 14
Journal: IEEE Transactions on Parallel and Distributed Systems
Issue number: 10
Publication status: Published - 1 Oct 2022


  • Testbed
  • emulator
  • edge computing
  • distributed machine learning

