Mental stress can lead to mental illnesses such as anxiety and depression. It is typically detected by experts or via intrusive devices attached to the body, neither of which is viable in daily life. Recent research has demonstrated that non-intrusive stress detection is feasible by analyzing mouse and gaze dynamics while a user operates a computer. However, these approaches require prior knowledge of the user interface (UI) layout to build appropriate models, which greatly reduces their generalizability to real-world usage, where most tasks are carried out on dynamic UIs. This paper presents Moga, a UI-agnostic stress detection system that infers a user's stress level in tasks without regard to the actual UI. Moga adopts an innovative mouse-gaze attraction model, which leverages the coordination between mouse and eye movements without relying on UI information. We evaluate the performance of Moga on tasks with both dynamic and fixed UIs. Our experimental results demonstrate the effectiveness of the underlying mouse-gaze attraction model, which achieves a stress detection accuracy of 74.3% on dynamic-UI tasks, outperforming the state-of-the-art approach by around 15%. This opens up a new avenue for cognition-aware intelligent user interfaces and numerous advanced HCI studies.