His Image Processing and Computer Vision Lab at the institute uses artificial neural networks to restore images affected by rain streaks, raindrops, haze, or motion blur. The team found that a single neural network struggled to both identify the degraded portion of a picture and clean it. Hence, the team decided to split the task into two stages. First, a network localises the degraded or blurred part. Then, a second network uses this localisation information to restore the image.
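The two-stage idea can be sketched in a toy form. This is not the lab's actual method: the functions below are hypothetical stand-ins using simple NumPy operations instead of trained neural networks, meant only to show how a localisation stage can feed a restoration stage.

```python
import numpy as np

def predict_degradation_mask(image, threshold=0.8):
    """Stage 1 (toy stand-in for the localisation network):
    flag unusually bright pixels as 'degraded', e.g. rain streaks."""
    return (image > threshold).astype(float)

def restore(image, mask):
    """Stage 2 (toy stand-in for the restoration network):
    replace flagged pixels with a crude global estimate, and leave
    pixels outside the predicted mask untouched."""
    estimate = np.full_like(image, image.mean())
    return mask * estimate + (1.0 - mask) * image

rng = np.random.default_rng(0)
clean = rng.uniform(0.2, 0.5, size=(8, 8))   # synthetic clean scene
degraded = clean.copy()
degraded[2, :] = 1.0                         # synthetic bright rain streak

mask = predict_degradation_mask(degraded)    # localise the streak
output = restore(degraded, mask)             # refine only flagged pixels
```

The point of the split is visible even in this sketch: the restoration stage never has to decide *where* to act, only *how*, because the mask from stage one already carries that information.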
“Our premise is to use the auxiliary task of degradation mask prediction to guide the restoration process. We demonstrate that solving this auxiliary task injects crucial localising ability in network layers. We transfer this ability to the main restoration network using attentive knowledge-distillation and focus on the refinement of degraded regions by exploiting this additional knowledge,” he explains.
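A generic form of attentive knowledge distillation can be sketched as follows. The article does not give the paper's exact loss, so this is an assumed, common formulation: the student's features are pulled toward the teacher's, with the mismatch weighted by a spatial attention map derived from the teacher so that degraded regions dominate the loss.

```python
import numpy as np

def attention_map(features):
    """Collapse a (C, H, W) feature tensor into a spatial attention map
    by summing squared activations over channels and normalising."""
    a = (features ** 2).sum(axis=0)
    return a / (a.sum() + 1e-8)

def attentive_distillation_loss(student_feat, teacher_feat):
    """Mean-squared feature mismatch, weighted by the teacher's
    attention map (an assumed formulation, not the paper's exact loss)."""
    attn = attention_map(teacher_feat)
    diff = ((student_feat - teacher_feat) ** 2).mean(axis=0)
    return float((attn * diff).sum())

rng = np.random.default_rng(1)
teacher = rng.normal(size=(4, 8, 8))                    # teacher features
student = teacher + 0.1 * rng.normal(size=(4, 8, 8))    # noisy student

loss = attentive_distillation_loss(student, teacher)
loss_self = attentive_distillation_loss(teacher, teacher)  # 0 by construction
```

Minimising such a loss during training is one way to "transfer" the teacher's localising ability into the restoration network's layers, as the quote describes.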
The method used by Dr. Rajagopalan’s team, detailed in a paper titled ‘Degradation Aware Approach to Image Restoration Using Knowledge Distillation’ published by IEEE, appeared to outperform strategies previously attempted for image restoration. The team said it used “publicly available datasets” of rain-streak, haze, raindrop, and motion-blur images to test its model.