Relative CNN–RNN: Learning Relative Atmospheric Visibility From Images in Matlab

Abstract:

We propose a deep learning approach for directly estimating relative atmospheric visibility from outdoor photos, without relying on weather images or data that require expensive sensing or custom capture. Our data-driven approach capitalizes on a large collection of Internet images to learn rich scene and visibility variations. The relative CNN–RNN coarse-to-fine model, where CNN stands for convolutional neural network and RNN stands for recurrent neural network, exploits the joint power of a relative support vector machine, which provides a good ranking representation, and the data-driven deep learning features derived from our novel CNN–RNN model. The CNN–RNN model makes use of shortcut connections to bridge a CNN module and an RNN coarse-to-fine module. The CNN captures the global view, while the RNN simulates the human's attention shift, namely, from the whole image (global) to the farthest discerned region (local). The learned relative model can be adapted to predict absolute visibility in limited scenarios. Extensive experiments and comparisons are performed to verify our method. We have built an annotated dataset consisting of about 40,000 images with 0.2 million human annotations. This large-scale, annotated visibility dataset will be made available to accompany this paper.
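For illustration, the MATLAB sketch below (Deep Learning Toolbox) shows two ingredients named in the abstract in simplified form: a small CNN whose shortcut connection is merged through an addition layer, and a RankSVM-style pairwise hinge loss on the predicted visibility scores of an image pair. Layer sizes, layer names, and the margin are illustrative assumptions rather than the authors' reported architecture, and the RNN coarse-to-fine module is omitted here.

% Minimal sketch (Deep Learning Toolbox). Layer sizes and names are
% illustrative assumptions, not the architecture reported in the paper.
layers = [
    imageInputLayer([224 224 3], 'Name', 'input')
    convolution2dLayer(3, 32, 'Padding', 'same', 'Name', 'conv1')
    reluLayer('Name', 'relu1')
    convolution2dLayer(3, 32, 'Padding', 'same', 'Name', 'conv2')
    additionLayer(2, 'Name', 'add')          % merge point of the shortcut
    reluLayer('Name', 'relu2')
    fullyConnectedLayer(1, 'Name', 'score')  % scalar visibility score
    regressionLayer('Name', 'out')];
lgraph = layerGraph(layers);                          % sequential path feeds 'add/in1'
lgraph = connectLayers(lgraph, 'relu1', 'add/in2');   % shortcut connection

% RankSVM-style pairwise hinge loss: y = +1 when image A is annotated as
% more visible than image B, y = -1 otherwise; sA and sB are the scores.
rankLoss = @(sA, sB, y) mean(max(0, 1 - y .* (sA - sB)));

In the full model described above, the score branch would instead be driven by CNN–RNN features computed from the whole image and the farthest discerned region, and the ranking loss would be accumulated over the annotated image pairs.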