The backpropagation (BP) algorithm is very popular in supervised learning for feed-forward neural networks. However, it is sometimes slow to converge and is easily trapped in a local minimum or a flat-spot area (known as the local minimum problem and the flat-spot problem, respectively). Many modifications have been proposed to speed up its convergence rate, but they seldom improve its global convergence capability. Some fast learning algorithms have been proposed recently to address these two problems: Wrong Output Modification (WOM) is one new algorithm that can improve the global convergence capability significantly. However, WOM has some limitations that prevent it from solving the local minimum and flat-spot problems effectively. In this paper, enhancements are proposed to further improve the performance of WOM by (a) changing the mechanism for escaping from a local minimum or a flat-spot area and (b) adding a fast checking procedure to identify whether the network is trapped in a local minimum or a flat-spot area. The performance investigation shows that the proposed enhancements improve the performance of WOM significantly when it is applied to different fast learning algorithms. Moreover, WOM with these enhancements is also applied to a very popular second-order gradient descent learning algorithm, the Levenberg-Marquardt (LM) algorithm, and the performance investigation shows that it significantly improves the performance of LM.
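To make the two failure modes concrete, the following is a minimal illustrative sketch (not the paper's actual checking procedure, whose details are given in the body of the paper) of how a training loop might distinguish them: near a local minimum the gradient norm is tiny while the error is still high, whereas in a flat-spot area the sigmoid derivative o(1-o) of the output neurons is close to zero, so the error signal cannot propagate. The function name `check_stuck` and all threshold values are hypothetical.

```python
import numpy as np

def check_stuck(error, grad_norm, outputs,
                err_target=0.01, grad_eps=1e-4, sat_eps=1e-3):
    """Hypothetical diagnostic (illustration only, not the WOM procedure).

    error     -- current training error (e.g. mean squared error)
    grad_norm -- norm of the error gradient w.r.t. the weights
    outputs   -- activations of the sigmoid output neurons
    """
    if error <= err_target:
        return "converged"
    # For a sigmoid neuron the derivative is o * (1 - o); values near
    # zero mean the neuron is saturated -- the flat-spot symptom.
    frac_saturated = np.mean(outputs * (1.0 - outputs) < sat_eps)
    if grad_norm < grad_eps:
        # Gradient vanished but error is still high: the network is stuck.
        return "flat-spot" if frac_saturated > 0.5 else "local-minimum"
    return "training"
```

A learning algorithm could call such a check each epoch and trigger its escape mechanism only when `"flat-spot"` or `"local-minimum"` is returned, avoiding unnecessary perturbations while training is still making progress.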