Accurate and globally referenced positioning is vital to safety-critical autonomous driving vehicles (ADVs). Multi-sensor integration is becoming ubiquitous in ADVs to guarantee the robustness and accuracy of the navigation system. Unfortunately, existing sensor integration systems are still heavily challenged in urban canyons, such as those of Tokyo and Hong Kong. The main reason for the performance degradation is varying environmental conditions, such as tall buildings and surrounding dynamic objects. The GNSS receiver is an indispensable sensor for ADVs, yet its performance depends heavily on the environment: it can be significantly degraded by signal reflections and blockages caused by buildings or dynamic objects. With the enhanced capability of perception, fully or partially sensing the environment in real time becomes possible using onboard sensors such as cameras or LiDAR. Inspired by this progress in perception, this paper proposes a new integrated navigation scheme, perception-aided sensor integrated navigation (PASIN). Instead of directly integrating measurements from diverse sensors, PASIN leverages onboard, real-time perception to assist individual sensor measurements, such as GNSS positioning, before they are integrated with other sensors, including inertial navigation systems (INS). This paper reviews several PASIN schemes, with an emphasis on GNSS positioning. As an example, experiments in which GNSS positioning is aided by the perception from camera or LiDAR sensors are conducted in dense urban canyons to validate this novel sensor integration scheme. The proposed PASIN can also be extended to LiDAR-centered or visual-centered navigation systems in the future.
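To make the PASIN idea concrete, the sketch below illustrates one plausible instance of perception aiding GNSS before sensor fusion: a camera- or LiDAR-derived sky mask (building roofline elevation per azimuth direction) is used to reject satellites that are likely non-line-of-sight (NLOS), so only healthy measurements reach the downstream GNSS/INS integration. All function names, the 10-degree azimuth binning, and the data layout are illustrative assumptions, not the paper's actual implementation.

```python
# Hypothetical sketch of perception-aided GNSS satellite selection (PASIN-style).
# Perception supplies a per-azimuth obstruction elevation ("sky mask"); a
# satellite whose elevation falls below the mask in its azimuth direction is
# likely reflected or blocked (NLOS) and is excluded before position fusion.

def build_sky_mask(building_edges, bin_deg=10):
    """building_edges: (azimuth_deg, elevation_deg) roofline points from
    perception. Returns {azimuth_bin: masking elevation in degrees}."""
    mask = {}
    for az, el in building_edges:
        b = int(az // bin_deg) % (360 // bin_deg)
        mask[b] = max(mask.get(b, 0.0), el)  # highest obstruction wins
    return mask

def select_los_satellites(satellites, sky_mask, bin_deg=10):
    """satellites: list of (prn, azimuth_deg, elevation_deg).
    Keep only satellites above the perceived obstruction elevation."""
    usable = []
    for prn, az, el in satellites:
        b = int(az // bin_deg) % (360 // bin_deg)
        if el > sky_mask.get(b, 0.0):  # unmasked directions default to open sky
            usable.append(prn)
    return usable

# Example: a tall building toward the east (azimuth ~90 deg) masks the sky
# up to roughly 55-60 degrees of elevation in that direction.
sky_mask = build_sky_mask([(85.0, 60.0), (95.0, 55.0)])
sats = [("G01", 88.0, 45.0),   # behind the building -> likely NLOS, dropped
        ("G07", 92.0, 70.0),   # above the roofline -> kept
        ("G12", 270.0, 30.0)]  # open-sky direction -> kept
print(select_los_satellites(sats, sky_mask))  # -> ['G07', 'G12']
```

The surviving satellites would then feed an ordinary weighted least-squares or Kalman-filter GNSS/INS fix; the point of the scheme is that the perception step corrects the single GNSS measurement stream before integration, rather than adding perception as yet another fused sensor.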