Robot Bees Fly and Swim, Soon They’ll Have Laser Eyes


How do you teach robotic insects to see?

By equipping them with tiny laser-powered sensors that act as eyes, enabling the miniature machines to sense the size, shape and distance of approaching objects.

“Essentially, it’s the same technology that automakers are using to ensure that driverless cars don’t crash into things,” says University at Buffalo computer scientist Karthik Dantu. “Only we need to shrink that technology so it works on robot bees that are no bigger than a penny.”

The UB-led research project, funded by a $1.1 million National Science Foundation grant, includes researchers from Harvard University and the University of Florida. It is an offshoot of the RoboBee initiative, led by Harvard and Northeastern University, which aims to create insect-inspired robots that someday may be used in agriculture and disaster relief.

Researchers have shown that robot bees are capable of tethered flight and moving while submerged in water. One of their limitations, however, is a lack of depth perception. For example, a robot bee cannot sense what’s in front of it.

This is problematic if you want the bee to avoid flying into a wall or have it land in a flower, says Dantu, who worked on the RoboBee project as a postdoctoral researcher at Harvard before joining UB’s School of Engineering and Applied Sciences in 2013 as an assistant professor.

The UB-led research team will address the limitation by outfitting the robot bee with remote sensing technology called lidar, the same laser-based sensor system that is making driverless cars possible.

Lidar (short for light detection and ranging) works like radar, only it emits invisible laser beams instead of microwaves. The beams bounce off distant objects, and sensors capture the reflected light, measuring the time it takes each pulse to return in order to calculate the distance and shape of the objects.
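The time-of-flight calculation behind lidar ranging is simple: the pulse travels to the target and back at the speed of light, so the one-way distance is half the round-trip path. A minimal sketch (the function name is illustrative, not from any lidar API):

```python
C = 299_792_458.0  # speed of light in a vacuum, m/s


def tof_distance(round_trip_time_s: float) -> float:
    """Distance to a target from a lidar pulse's round-trip time.

    The pulse travels out and back, so the one-way distance is
    half the total path length covered at the speed of light.
    """
    return C * round_trip_time_s / 2.0


# A pulse that returns after roughly 66.7 nanoseconds indicates
# a target about 10 meters away.
print(round(tof_distance(66.7e-9), 2))  # → 10.0
```

At automotive ranges these round-trip times are tens to hundreds of nanoseconds, which is why lidar sensors need very fast timing electronics; shrinking that circuitry is part of what makes a "micro-lidar" challenging.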

The information is then analyzed by computer algorithms to form a coherent image of the car’s path. This enables the car to “see” its environment and follow traffic signs, avoid obstacles and make other adjustments.
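The mapping step described above can be sketched in miniature. Assuming a planar lidar sweep that returns (angle, range) pairs, a navigation algorithm might convert them to Cartesian points and check whether anything lies in the vehicle's path (all names and thresholds here are illustrative, not from the UB team's software):

```python
import math


def scan_to_points(scan):
    """Convert (angle_rad, range_m) lidar returns into 2D Cartesian points.

    Angle 0 points straight ahead; positive angles sweep counterclockwise.
    """
    return [(r * math.cos(a), r * math.sin(a)) for a, r in scan]


def obstacle_ahead(points, half_width=0.5, max_dist=2.0):
    """Return True if any point falls in a corridor directly ahead.

    The corridor extends max_dist meters forward and half_width meters
    to either side of the sensor's heading.
    """
    return any(0.0 < x <= max_dist and abs(y) <= half_width
               for x, y in points)


# One return 1.5 m dead ahead, one 3 m off to the side.
scan = [(0.0, 1.5), (math.pi / 2, 3.0)]
points = scan_to_points(scan)
print(obstacle_ahead(points))  # → True: the 1.5 m return blocks the path
```

A real perception pipeline adds filtering, clustering, and map-building on top of this, but the core idea is the same: turn raw time-of-flight returns into geometry the planner can reason about.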

These systems, which are typically mounted on the car roof, are about the size of a traditional camping lantern. The team Dantu leads wants to make them much smaller, a version called “micro-lidar.”

University of Florida researchers will develop the tiny sensor that measures the light’s reflection, while Dantu will create novel perception and navigation algorithms that enable the bee to process and map the world around it. Harvard researchers will then incorporate the technology into the bees.

The technology the team develops likely won’t be limited to robot insects. The sensors could be used, among other things, in wearable technology; endoscopic tools; and smartphones, tablets and other mobile devices.




Copyright © 2020 I-Connect007. All rights reserved.