Researchers Create First Artificial Vision System for Land and Water | MIT News

Giving our hardware sight has enabled a host of applications in self-driving cars, object detection, and crop monitoring. But unlike animals, synthetic vision systems cannot simply evolve in natural habitats. Dynamic visual systems capable of navigating both land and water have therefore yet to power our machines – leading researchers from MIT, the Gwangju Institute of Science and Technology (GIST) and Seoul National University in Korea to develop a new artificial vision system that faithfully reproduces the vision of the fiddler crab and is able to tackle both terrains.

The semi-terrestrial species – known affectionately as the calling crab, as it appears to beckon with its huge claws – has amphibious imaging capability and an extremely wide field of view, whereas current artificial systems are limited to hemispherical vision. The new artificial eye, resembling a small, largely nondescript spherical black ball, makes sense of incoming light through a combination of materials that capture and process it. The scientists combined a flat microlens array with a graded refractive index profile and a flexible photodiode array with comb-shaped patterns, all wrapped around a 3D spherical structure. This setup means that light rays from multiple sources always converge on the same spot on the image sensor, regardless of the refractive index of the surroundings.
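Why a flat, graded-index lens sidesteps the air/water problem can be seen from the thin-lens (lensmaker's) equation: the refractive power of a curved surface depends on the ratio of the lens index to the surrounding medium's index, so a conventional lens defocuses when moved from air into water. A flat lens has zero surface curvature, so its power comes entirely from the internal index gradient and the external medium drops out. The sketch below illustrates this with hypothetical numbers; it is not taken from the paper.

```python
# Illustrative sketch (not from the paper): why a flat lens with an
# internal graded refractive index keeps its focus when the external
# medium changes, while a conventional curved lens does not.
# All numerical values are hypothetical.

def curved_lens_focal_length(n_lens, n_medium, r1_mm, r2_mm):
    """Thin-lens (lensmaker's) focal length of a curved lens, in mm."""
    power = (n_lens / n_medium - 1.0) * (1.0 / r1_mm - 1.0 / r2_mm)
    return 1.0 / power

# Conventional biconvex lens: index 1.5, surface radii of +/-10 mm.
f_air = curved_lens_focal_length(1.5, 1.00, 10.0, -10.0)    # in air
f_water = curved_lens_focal_length(1.5, 1.33, 10.0, -10.0)  # in water
print(f"curved lens focal length in air:   {f_air:.1f} mm")    # 10.0 mm
print(f"curved lens focal length in water: {f_water:.1f} mm")  # ~39.1 mm

# A flat graded-index (GRIN) lens has zero surface curvature, so the
# surface term (n_lens/n_medium - 1) * 0 vanishes: the external medium
# no longer appears, and the power is set by the internal gradient alone.
def flat_grin_focal_length(grin_power_per_mm):
    return 1.0 / grin_power_per_mm

f_grin = flat_grin_focal_length(0.1)  # same focal length in air and water
print(f"flat GRIN lens focal length:       {f_grin:.1f} mm")
```

The curved lens's focal length roughly quadruples when it is immersed in water, while the flat GRIN lens is unaffected – the same medium-independence the crab's flat cornea with a graded refractive index provides.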

An article on this system, co-authored by Frédo Durand, professor of electrical engineering and computer science at MIT and an affiliate of the Computer Science and Artificial Intelligence Laboratory (CSAIL), and 15 others, appears in the July issue of the journal Nature Electronics.

The amphibious and panoramic imaging capabilities were tested in experiments in air and in water by imaging five objects at different distances and directions, and the system delivered consistent image quality and a nearly 360-degree field of view in both environments. In other words, it could see both underwater and on land, where previous systems were limited to one or the other.

There’s more than meets the eye when it comes to fiddler crabs. Behind their massive claws lies a powerful and unique vision system that evolved from living both underwater and on land. The creatures’ flat corneas, combined with a graded refractive index, counter the defocus effects caused by changes in the external environment – a crushing limitation for other compound eyes. The crabs also have an omnidirectional 3D field of vision, thanks to their ellipsoidal, stalked eyes. They evolved to watch almost everything at once to avoid attacks on open mudflats, and to communicate and interact with their companions.

Of course, biomimetic cameras are not new. In 2013, a wide-field-of-view (FoV) camera that mimicked the compound eyes of an insect was reported in Nature, and in 2020, a wide-FoV camera mimicking a fish eye emerged. Although these cameras can capture large areas at once, it is structurally difficult for them to exceed 180 degrees, so commercial products with a 360-degree FoV have more recently come into play. These can be clunky, however: they need to merge images taken from two or more cameras, and enlarging the field of view requires an optical system with a complex configuration, which causes image distortion. It is also difficult to maintain focus when the surrounding environment changes, such as between air and water – hence the impulse to turn toward the calling crab.

The crab turned out to be a worthy muse. During the tests, five objects (a dolphin, a plane, a submarine, a fish, and a ship) were projected onto the machine vision system at different distances and from different angles. The team also performed multi-laser-dot imaging experiments, and the captured images matched the simulations. To test the amphibious capability, they submerged the device halfway in a container of water.

A logical extension of the work includes examining biologically inspired light adaptation schemes in pursuit of higher resolution and superior image processing techniques.

“It is a spectacular piece of optical engineering and non-planar imaging, combining aspects of bio-inspired design and advanced flexible electronics to achieve unique capabilities unavailable in conventional cameras,” said John A. Rogers, Louis Simpson and Kimberly Querrey Professor of Materials Science and Engineering, Biomedical Engineering, and Neurological Surgery at Northwestern University, who was not involved in the work. “Potential uses range from population monitoring to environmental monitoring.”

This research was supported by the Institute for Basic Science, the National Research Foundation of Korea, and the GIST-funded GIST-MIT Research Collaboration grant in 2022.