Robot Bees Fly and Swim; Soon They'll Have Laser Eyes
November 11, 2015 | University at Buffalo
How do you teach robotic insects to see?
By equipping them with tiny laser-powered sensors that act as eyes, enabling the miniature machines to sense the size, shape and distance of approaching objects.
“Essentially, it’s the same technology that automakers are using to ensure that driverless cars don’t crash into things,” says University at Buffalo computer scientist Karthik Dantu. “Only we need to shrink that technology so it works on robot bees that are no bigger than a penny.”
The UB-led research project, funded by a $1.1 million National Science Foundation grant, includes researchers from Harvard University and the University of Florida. It is an offshoot of the RoboBee initiative, led by Harvard and Northeastern University, which aims to create insect-inspired robots that someday may be used in agriculture and disaster relief.
Researchers have shown that robot bees are capable of tethered flight and of moving while submerged in water. One of their limitations, however, is a lack of depth perception: a robot bee cannot sense what is in front of it.
This is problematic if you want the bee to avoid flying into a wall or have it land on a flower, says Dantu, who worked on the RoboBee project as a postdoctoral researcher at Harvard before joining UB’s School of Engineering and Applied Sciences in 2013 as an assistant professor.
The UB-led research team will address the limitation by outfitting the robot bee with remote sensing technology called lidar, the same laser-based sensor system that is making driverless cars possible.
Lidar (short for light detection and ranging) works like radar, only it emits invisible laser beams instead of microwaves. The beams bounce off distant objects, and sensors capture the reflected light, measuring how long it takes to return in order to calculate the distance and shape of those objects.
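The time-of-flight principle described above can be sketched in a few lines of Python. This is a hypothetical illustration of the underlying arithmetic, not the team's software: the pulse travels to the object and back, so the one-way distance is half of the speed of light multiplied by the round-trip time.

```python
SPEED_OF_LIGHT = 299_792_458.0  # meters per second, in a vacuum

def distance_from_round_trip(seconds: float) -> float:
    """Distance to a reflecting object, given a laser pulse's round-trip time.

    The pulse covers the distance twice (out and back), so the one-way
    distance is half the total path length.
    """
    return SPEED_OF_LIGHT * seconds / 2.0

# A pulse returning after 100 nanoseconds reflected off an object
# roughly 15 meters away.
print(round(distance_from_round_trip(100e-9), 2))  # ~14.99 m
```

At the centimeter scales a penny-sized robot cares about, round-trip times are on the order of a nanosecond or less, which is part of what makes shrinking lidar hardware so challenging.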
The information is then analyzed by computer algorithms to form a coherent image of the car’s path. This enables the car to “see” its environment and follow traffic signs, avoid obstacles and make other adjustments.
These systems, which are typically mounted on the car roof, are about the size of a traditional camping lantern. The team Dantu leads wants to make them much smaller, a version called “micro-lidar.”
University of Florida researchers will develop the tiny sensor that measures the light’s reflection, while Dantu will create novel perception and navigation algorithms that enable the bee to process and map the world around it. Harvard researchers will then incorporate the technology into the bees.
The technology the team develops likely won’t be limited to robot insects. The sensors could be used, among other things, in wearable technology; endoscopic tools; and smartphones, tablets and other mobile devices.