Automatic kidney and tumor segmentation from CT volumes is essential for clinical diagnosis and surgical planning. However, it remains a very challenging problem, as kidneys and tumors usually exhibit varied scales, irregular shapes, and blurred contours. In this paper, we propose a memory-efficient automatic kidney and tumor segmentation algorithm based on a non-local context guided 3D U-Net. Unlike the traditional 3D U-Net, we implement a lightweight 3D U-Net with depthwise separable convolution (DSC), which not only avoids overfitting but also improves generalization. By encoding long-range pixel-wise dependencies in features and recalibrating channel weights, we also develop a non-local context guided mechanism to capture global context and fully exploit long-range dependencies during feature selection. Thanks to the non-local context guidance (NCG), we can complement high-level semantic information with spatial information through a simple skip connection between the encoder and decoder of the 3D U-Net, yielding a more accurate 3D kidney and tumor segmentation network. Our proposed method was validated and evaluated on the KiTS dataset, which includes a variety of 3D kidney and tumor patient cases. Convincing visual and statistical results verified the effectiveness of our method. Comparisons with state-of-the-art methods were also conducted to demonstrate its advantages in terms of both efficiency and accuracy.
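The memory savings of depthwise separable convolution can be made concrete with a parameter count. The sketch below compares a standard 3D convolution against its depthwise separable factorization (a per-channel spatial filter followed by a 1×1×1 pointwise convolution); the kernel size and channel counts are illustrative assumptions, not values taken from the paper.

```python
def conv3d_params(c_in, c_out, k=3):
    # Standard 3D convolution: one k*k*k kernel per (input, output) channel pair.
    return c_in * c_out * k ** 3

def dsc3d_params(c_in, c_out, k=3):
    # Depthwise separable 3D convolution: a per-channel k*k*k spatial filter
    # (depthwise step) plus a 1x1x1 pointwise convolution mixing channels.
    return c_in * k ** 3 + c_in * c_out

if __name__ == "__main__":
    # Hypothetical channel sizes for one encoder stage.
    c_in, c_out = 64, 128
    std = conv3d_params(c_in, c_out)   # 221184 parameters
    dsc = dsc3d_params(c_in, c_out)    # 9920 parameters
    print(std, dsc, round(std / dsc, 1))
```

For these (assumed) channel sizes the factorization uses roughly 22× fewer parameters than the standard 3D convolution, which is what makes the lightweight 3D U-Net feasible on full CT volumes.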