Abstract
While the mass-personalization manufacturing paradigm increasingly requires robots to handle complex and variable tasks, traditional robot-centric programming methods remain constrained by their expert-dependent nature and lack of adaptability. To address these limitations, this research proposes a scene-centric robot programming approach using MR-assisted interactive 3D segmentation, in which operators naturally manipulate the digital twin (DT) of real-world objects to control the robot, rather than performing cumbersome end-effector programming. The framework combines the Segment Anything Model (SAM) and 3D Gaussian Splatting (3DGS) for cost-effective, zero-shot, and flexible scene reconstruction and segmentation. Scale consistency and multi-coordinate calibration ensure seamless MR-driven interaction and robot execution. Finally, experimental results verify improved segmentation accuracy and computational efficiency, particularly in cluttered industrial environments, while case studies validate the method's feasibility for real-world implementation. This research illustrates a promising human–robot collaborative manufacturing paradigm in which virtual scene editing directly informs robot actions, demonstrating a novel MR-assisted interaction method that goes beyond low-level robot movement control.
| Original language | English |
|---|---|
| Article number | 103146 |
| Number of pages | 14 |
| Journal | Robotics and Computer-Integrated Manufacturing |
| Volume | 98 |
| DOIs | |
| Publication status | Published - Apr 2026 |
Keywords
- 3D segmentation
- Human–robot interaction
- Mixed reality
- Robot programming
- Smart manufacturing
ASJC Scopus subject areas
- Control and Systems Engineering
- Software
- General Mathematics
- Computer Science Applications
- Industrial and Manufacturing Engineering