Engineers program tiny robots to move, think like insects
What do you guys think about this? Would it be successful and what kind of advantages/disadvantages would it bring?
Skynet theories are allowed hahah.
TL;DR: Engineers are trying to program already-built tiny robots to act and behave like insects.
(RoboBees shown in the image.)
For a robot to sense a gust of wind using tiny hair-like metal probes embedded in its wings, adjust its flight accordingly, and plan its path as it attempts to land on a swaying flower, it would need so much computing power that it would have to carry a desktop-size computer on its back. Silvia Ferrari, professor of mechanical and aerospace engineering and director of the Laboratory for Intelligent Systems and Controls, sees the emergence of neuromorphic computer chips as a way to shrink a robot's payload.
Unlike traditional chips that process combinations of 0s and 1s as binary code, neuromorphic chips process spikes of electrical current that fire in complex combinations, similar to how neurons fire inside a brain. Ferrari's lab is developing a new class of "event-based" sensing and control algorithms that mimic neural activity and can be implemented on neuromorphic chips. Because the chips require significantly less power than traditional processors, they allow engineers to pack more computation into the same payload.
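To make the spiking, event-based idea a bit more concrete, here is a minimal Python sketch (my own illustration, not Ferrari's algorithms or any neuromorphic chip's API) of a leaky integrate-and-fire neuron. It only does work when an input spike arrives, which is the property that lets event-based hardware sip power between events. The airflow-sensor scenario, spike times, and constants are all made up for the example.

```python
# Minimal sketch (not the lab's actual code) of the "event-based" idea:
# a leaky integrate-and-fire neuron that only computes when an input
# spike arrives, instead of sampling a signal at a fixed clock rate.

import math

class LIFNeuron:
    def __init__(self, tau=0.02, threshold=1.0):
        self.tau = tau            # membrane time constant (s)
        self.threshold = threshold
        self.v = 0.0              # membrane potential
        self.last_t = 0.0         # time of the last input event

    def on_event(self, t, weight):
        """Process a single input spike arriving at time t with a synaptic weight."""
        # Passively decay the potential over the silent interval since the last event.
        self.v *= math.exp(-(t - self.last_t) / self.tau)
        self.last_t = t
        self.v += weight
        if self.v >= self.threshold:   # fire an output spike and reset
            self.v = 0.0
            return True
        return False

# Example: an airflow "hair" sensor emitting spikes faster as a gust builds up.
neuron = LIFNeuron()
gust_events = [0.000, 0.015, 0.018, 0.020, 0.021]   # spike times in seconds
for t in gust_events:
    if neuron.on_event(t, weight=0.4):
        print(f"output spike at t={t:.3f}s -> trigger a control correction")
```

On a real neuromorphic chip a unit like this would run in silicon rather than in Python, but the control flow is the same: no clocked sampling, just responses to discrete events.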
Ferrari's lab has teamed up with the Harvard Microrobotics Laboratory, which has developed an 80-milligram flying RoboBee outfitted with a number of vision, optical flow and motion sensors. While the robot currently remains tethered to a power source, Harvard researchers are working on eliminating the restraint with the development of new power sources. The Cornell algorithms will help make RoboBee more autonomous and adaptable to complex environments without significantly increasing its weight.
"Getting hit by a wind gust or a swinging door would cause these small robots to lose control. We're developing sensors and algorithms to allow RoboBee to avoid the crash, or if crashing, survive and still fly," said Ferrari. "You can't really rely on prior modeling of the robot to do this, so we want to develop learning controllers that can adapt to any situation."
To speed development of the event-based algorithms, a virtual simulator was created by Taylor Clawson, a doctoral student in Ferrari's lab. The physics-based simulator models the RoboBee and the instantaneous aerodynamic forces it faces during each wing stroke. As a result, the model can accurately predict RoboBee's motions during flights through complex environments.
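As rough intuition for what a physics-based flapping-wing simulator has to evaluate on every wing stroke, here is a heavily simplified, vertical-only sketch that integrates a quasi-steady aerodynamic force from a sinusoidal stroke. Every constant, the force model, and the function names are illustrative assumptions on my part; the actual Cornell simulator resolves the instantaneous aerodynamics in far more detail.

```python
# Toy, assumption-laden sketch of one piece of a flapping-wing simulator:
# integrate vertical motion under a quasi-steady lift force from a sinusoidal
# wing stroke. Not the Cornell simulator; most constants are rough guesses.

import math

MASS = 80e-6        # RoboBee mass: 80 milligrams, in kg (from the article)
RHO = 1.225         # air density, kg/m^3
WING_AREA = 3e-5    # assumed total effective wing area, m^2
CL = 1.8            # assumed quasi-steady lift coefficient
G = 9.81
DT = 1e-4           # 0.1 ms integration step

def tip_speed(t, freq=120.0, wing_len=0.015, stroke_amp=1.0):
    """Wing-tip speed (m/s) for a stroke angle of stroke_amp*sin(2*pi*freq*t) radians."""
    return wing_len * stroke_amp * 2 * math.pi * freq * math.cos(2 * math.pi * freq * t)

def step(t, z, vz):
    """Advance altitude and vertical speed one step using the instantaneous lift."""
    u = abs(tip_speed(t))
    lift = 0.5 * RHO * WING_AREA * CL * u * u
    az = lift / MASS - G
    return z + vz * DT, vz + az * DT

t, z, vz = 0.0, 0.0, 0.0
while t < 0.05:                       # simulate 50 ms, i.e. a handful of wing strokes
    z, vz = step(t, z, vz)
    t += DT
print(f"altitude after 50 ms: {z * 1000:.2f} mm")
```

The point of working at this per-stroke level is exactly what the article describes: the forces change within each wing beat, so disturbances like wind gusts show up immediately in the simulated dynamics.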
"The simulation is used both in testing the algorithms and in designing them," said Clawson, who helped has successfully developed an autonomous flight controller for the robot using biologically inspired programming that functions as a neural network. "This network is capable of learning in real time to account for irregularities in the robot introduced during manufacturing, which make the robot significantly more challenging to control."
Aside from greater autonomy and resiliency, Ferrari said her lab plans to help outfit RoboBee with new micro devices such as a camera, expanded antennae for tactile feedback, contact sensors on the robot's feet and airflow sensors that look like tiny hairs.
"We're using RoboBee as a benchmark robot because it's so challenging, but we think other robots that are already untethered would greatly benefit from this development because they have the same issues in terms of power," said Ferrari.
One robot that is already benefiting is the Harvard Ambulatory Microrobot, a four-legged machine just 17 millimeters long and weighing less than 3 grams. It can scamper at a speed of 0.44 meters per second, but Ferrari's lab is developing event-based algorithms that will help complement the robot's speed with agility.
Ferrari is continuing the work using a four-year, $1 million grant from the Office of Naval Research. She's also collaborating with leading research groups from a number of universities fabricating neuromorphic chips and sensors.
Source Text: https://www.sciencedaily.com/releases/2017/12/171214141923.htm
Hi! I am a robot. I just upvoted you! I found similar content that readers might be interested in:
http://news.cornell.edu/stories/2017/12/engineers-program-tiny-robots-move-think-insects
Yeah, my source text also has Cornell University as its source! I don't believe that you are a robot. I think you are a cheetah.
I think I can deal with the tiny robots acting like insects, but please don't allow them to "think".
I think it's gonna happen one way or another. When we achieve something, we always want more and more. Pushing toward advanced AI will be inevitable, I guess :/