Cascade-correlation learning (CasCor) is a constructive algorithm that determines its own network size and topology by adding hidden units one at a time, selecting each unit by its covariance with the residual output error. Its generalization performance and training time depend on the cascade architecture and on the iterative tuning of the connection weights. CasCor was developed to address the slowness of backpropagation (BP); however, recent studies have concluded that in many applications its generalization performance is not guaranteed to be optimal. Moreover, like BP, CasCor can be slow to learn because its connection weights are tuned iteratively by numerical optimization techniques. This paper therefore addresses these bottlenecks of CasCor and introduces a new algorithm with an improved cascade architecture and tuning-free learning, aiming at both better generalization performance and faster learning. The proposed algorithm determines the input connection weights by orthogonally transforming a set of correlated input units into uncorrelated hidden units, and the output connection weights by treating the hidden units and the output units as linearly related. The approach is unique in that it requires no random generation of connection weights. A comparative study on nonlinear classification and regression tasks shows that the proposed algorithm generalizes better and learns many times faster than CasCor.
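The two tuning-free steps described above can be illustrated with a minimal sketch. This is an assumption-laden toy, not the paper's exact procedure: it uses a PCA-style eigendecomposition as the orthogonal decorrelating transform and a `tanh` nonlinearity for the hidden units (both are illustrative choices), and fits the output weights with a closed-form linear least-squares solve.

```python
# Illustrative sketch (NOT the paper's exact method): tuning-free weight
# determination in the spirit described above.
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: 200 samples, 5 input features with correlation.
X = rng.normal(size=(200, 5))
X[:, 3] = 0.7 * X[:, 0] + 0.3 * X[:, 1]   # make one feature correlated
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2  # nonlinear target

# Step 1 -- input weights: eigenvectors of the input covariance matrix
# give an orthogonal transform; the projected features are mutually
# uncorrelated, so no iterative tuning (and no random init) is needed.
Xc = X - X.mean(axis=0)
eigvals, eigvecs = np.linalg.eigh(np.cov(Xc, rowvar=False))
Z = Xc @ eigvecs            # uncorrelated projections
H = np.tanh(Z)              # hidden-unit activations (assumed nonlinearity)

# Step 2 -- output weights: treat hidden and output units as linearly
# related and solve H w ~= y in closed form via least squares.
Hb = np.column_stack([H, np.ones(len(H))])  # add bias column
w, *_ = np.linalg.lstsq(Hb, y, rcond=None)

pred = Hb @ w
print("training MSE:", np.mean((pred - y) ** 2))
```

The key point the sketch captures is that both weight layers are obtained in one shot (an eigendecomposition and a linear solve), in contrast to CasCor's repeated numerical optimization of candidate and output weights.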