MIT uses algorithm to banish window reflections in pics

The software can automatically remove reflections when photographing through windows

Researchers at MIT have developed an algorithm that can automatically separate window reflections from a digital image (left) and remove them (top right).

You've got the perfect shot of a cityscape from your hotel room -- if it weren't for those pesky reflections in the window.

Photographers are often stymied by their own reflection or that of their camera when shooting through glass, but researchers at MIT and Google Research have developed a method to remove them automatically.

The technique finds glass reflections in photos by exploiting the fact that a reflection usually appears twice, with the second copy slightly offset from the first.

The two reflections can be caused by double panes in windows or even a single, thick pane, resulting in a double or "ghosted" reflection. Since the second reflection is a set distance from the first, the researchers used an algorithm to distinguish the reflections from all the other data in the image.
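That fixed offset is what makes the problem tractable. The sketch below (a toy illustration in NumPy, not the researchers' actual pipeline; the offset, attenuation value, and autocorrelation approach are assumptions for demonstration) shows how a ghosted layer's second copy, sitting at a set distance from the first, produces a secondary peak in the image's autocorrelation at exactly that offset:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic reflection layer: random texture standing in for the scene
# reflected in the near surface of the glass.
reflection = rng.standard_normal((64, 64))

# Ghosting: a double pane (or one thick pane) adds a second, dimmer copy
# of the reflection at a fixed offset. The shift and attenuation below
# are illustrative values, not figures from the paper.
true_shift = (3, 5)  # rows, cols
attenuation = 0.6
ghosted = reflection + attenuation * np.roll(reflection, true_shift, axis=(0, 1))

# Because the second copy sits at a *fixed* distance from the first, the
# autocorrelation of the ghosted layer has a secondary peak at that lag.
f = np.fft.fft2(ghosted - ghosted.mean())
autocorr = np.real(np.fft.ifft2(f * np.conj(f)))
autocorr[0, 0] = 0  # suppress the trivial zero-lag peak

# The strongest remaining peak falls at the ghosting offset (3, 5)
# or its circular mirror (61, 59).
peak = np.unravel_index(np.argmax(autocorr), autocorr.shape)
print(peak)
```

In a real photo the camera sees the transmitted scene plus the ghosted reflection, so the peak is weaker and noisier than in this clean toy example, which is where the learned patch statistics described below come in.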

The algorithm, using a technique developed by MIT's Daniel Zoran and Yair Weiss of Hebrew University of Jerusalem, separates images into 8-by-8-pixel squares. It uses learned statistics of such patches to first estimate the offset between the two reflections and then separate the reflection from the rest of the image.
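The idea of scoring 8-by-8 patches against learned statistics can be sketched as follows. This is a heavily simplified stand-in: it uses non-overlapping patches and a single zero-mean Gaussian model, whereas the Zoran-Weiss technique uses a richer learned patch prior, and the helper names (`patches_8x8`, `patch_score`) are hypothetical:

```python
import numpy as np

def patches_8x8(img):
    """Split a grayscale image into non-overlapping 8x8 patches,
    returned as rows of 64 values. (The real method uses overlapping
    patches; non-overlapping keeps the sketch short.)"""
    h, w = img.shape
    h8, w8 = h - h % 8, w - w % 8
    p = img[:h8, :w8].reshape(h8 // 8, 8, w8 // 8, 8).swapaxes(1, 2)
    return p.reshape(-1, 64)

rng = np.random.default_rng(1)

# "Train" a Gaussian patch model on horizontally smooth synthetic
# patches, a crude stand-in for statistics learned from natural images.
smooth = np.cumsum(rng.standard_normal((128, 128)), axis=1) / 10
train = patches_8x8(smooth)
train = train - train.mean(axis=1, keepdims=True)
cov = train.T @ train / len(train) + 1e-6 * np.eye(64)
cov_inv = np.linalg.inv(cov)

def patch_score(img):
    """Average Mahalanobis cost of an image's patches under the model;
    lower means the patches look more like the training statistics."""
    p = patches_8x8(img)
    p = p - p.mean(axis=1, keepdims=True)
    return np.mean(np.einsum('ij,jk,ik->i', p, cov_inv, p))

# An image with training-like statistics scores far better than noise;
# a separation algorithm can use such scores to prefer decompositions
# whose layers each look like plausible images.
clean = np.cumsum(rng.standard_normal((64, 64)), axis=1) / 10
noisy = rng.standard_normal((64, 64))
print(patch_score(clean) < patch_score(noisy))
```

The design point is that patch statistics give the algorithm a way to judge candidate separations: among the many ways to split one photo into a scene layer and a reflection layer, it can favor the split in which both layers look statistically like real images.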

The algorithm was trained on tens of thousands of images from image databases, and the researchers say it can work in most situations.

Sample images distributed by MIT and in the researchers' paper on the topic showed a variety of results.

In one, the original image shows a fire-escape staircase shot through a window, with the photographer's reflection clearly visible. After processing with the algorithm, the reflection is nearly all gone from the photo, but faint traces of the person's face are visible in the wall by the fire escape.

With refinements, such image-processing software could be useful in smartphones and digital cameras. It could also help machines that use computer vision, such as robots and self-driving cars, make sense of environments where reflections are present.

"The work has potential in improving degraded pictures due to window reflection, like when window shopping," graduate student YiChang Shih from MIT's Computer Science and Artificial Intelligence Laboratory said via email.

Shih said he wants to make the algorithm's ability to detect reflections more robust, but he wouldn't say if or when the research could be commercialized.

The research is to be presented at the Computer Vision and Pattern Recognition conference in Boston in June.

Tim Hornyak covers Japan and emerging technologies for The IDG News Service. Follow Tim on Twitter at @robotopia.
