Hand gestures are one of the most common communication approaches in daily human life, especially for deaf and mute people. Hand gesture recognition can be adopted in human-computer interaction to convert hand gestures into words or sentences. Unfortunately, the same gesture may carry different meanings in different countries. With the aim of eliminating the communication barriers between hearing-impaired communities and the general public, this paper proposes an efficient interactive user interface built with augmented reality techniques and a Leap Motion controller for hand gesture recognition and translation. Five hand gestures captured by the Leap Motion controller were learned and recognized through machine learning methodologies, including Support Vector Machine, K-Nearest Neighbor, Convolutional Neural Network, Deep Neural Network, and Decision Tree. The experimental results from the different classifiers demonstrate the practicability of employing hand gesture recognition for text translation. The proposed hand gesture recognition system should be capable of narrowing the communication gap between people with hearing disabilities and the general public, so that deaf and mute people are not isolated from society.