HIDeGAN: A Hyperspectral-guided Image Dehazing GAN

Abstract

Haze removal in images captured from a diverse set of scenarios is a very challenging problem. Existing dehazing methods either reconstruct the transmission map or directly estimate the dehazed image in RGB color space. In this paper, we make a first attempt to propose a Hyperspectral-guided Image Dehazing Generative Adversarial Network (HIDeGAN). The HIDeGAN architecture is formulated by designing an enhanced version of CycleGAN named R2HCYCLE and an enhanced conditional GAN named H2RGAN. R2HCYCLE makes use of hyperspectral images (HSI) in combination with cycle-consistency and skeleton losses to improve the quality of information recovery by analyzing the entire spectrum. H2RGAN estimates the clean RGB image from the hazy hyperspectral image generated by R2HCYCLE. The models designed for this spatial-spectral-spatial mapping generate visually better haze-free images. To facilitate HSI generation, datasets from the spectral reconstruction challenges at NTIRE 2018 and NTIRE 2020 are used. A comprehensive set of experiments was conducted on the D-Hazy and the recent RESIDE-Standard (SOTS), RESIDE-β (OTS), and RESIDE-Standard (HSTS) datasets. The proposed HIDeGAN outperforms the existing state-of-the-art on all these datasets.
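
The abstract describes a two-stage pipeline: R2HCYCLE first maps the hazy RGB image to a hazy hyperspectral image, and H2RGAN then estimates the clean RGB image from that HSI. Below is a minimal, hypothetical PyTorch sketch of this spatial-spectral-spatial chaining only; the simple convolutional backbones, the 31-band HSI assumption, and all helper names are illustrative placeholders, not the authors' implementation, and the adversarial, cycle-consistency, and skeleton losses are omitted.

```python
# Hedged sketch (not the authors' code): two-stage RGB -> HSI -> RGB dehazing pipeline.
import torch
import torch.nn as nn


def conv_block(in_ch, out_ch):
    """3x3 conv + ReLU, a stand-in for the real generator backbones."""
    return nn.Sequential(nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True))


class R2HGenerator(nn.Module):
    """Stage 1 (R2HCYCLE generator): hazy RGB (3 bands) -> hazy HSI (assumed 31 bands)."""
    def __init__(self, hsi_bands=31):
        super().__init__()
        self.net = nn.Sequential(conv_block(3, 64), conv_block(64, 64),
                                 nn.Conv2d(64, hsi_bands, 3, padding=1))

    def forward(self, rgb):
        return self.net(rgb)


class H2RGenerator(nn.Module):
    """Stage 2 (H2RGAN generator): hazy HSI -> dehazed RGB."""
    def __init__(self, hsi_bands=31):
        super().__init__()
        self.net = nn.Sequential(conv_block(hsi_bands, 64), conv_block(64, 64),
                                 nn.Conv2d(64, 3, 3, padding=1))

    def forward(self, hsi):
        return self.net(hsi)


class HIDeGANSketch(nn.Module):
    """Chains the two generators: spatial (RGB) -> spectral (HSI) -> spatial (RGB)."""
    def __init__(self, hsi_bands=31):
        super().__init__()
        self.r2h = R2HGenerator(hsi_bands)
        self.h2r = H2RGenerator(hsi_bands)

    def forward(self, hazy_rgb):
        hazy_hsi = self.r2h(hazy_rgb)      # spectral reconstruction of the hazy input
        dehazed_rgb = self.h2r(hazy_hsi)   # clean RGB estimated from the hazy HSI
        return dehazed_rgb, hazy_hsi


if __name__ == "__main__":
    model = HIDeGANSketch()
    out, hsi = model(torch.randn(1, 3, 128, 128))
    print(out.shape, hsi.shape)  # (1, 3, 128, 128) and (1, 31, 128, 128)
```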

Publication
Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition Workshops, 2020
Harsh Sinha
Graduate Student

My research interests include computer vision, biometrics, domain adaptation, and machine learning.