Tuesday, May 14, 2019
Hummingbird robot uses AI to soon go where drones can't
press release
https://www.purdue.edu/newsroom/releases/2019/Q2/hummingbird-robot-uses-ai-to-soon-go-where-drones-cant.html
WEST LAFAYETTE, Ind. — What can fly like a bird and hover like an insect?
Your friendly neighborhood hummingbirds. If drones had this combo, they would be able to maneuver better through collapsed buildings and other cluttered spaces to find trapped victims.
Purdue University researchers have engineered flying robots that behave like hummingbirds, trained by machine learning algorithms based on various techniques the bird uses naturally every day.
This means that after learning from a simulation, the robot “knows” how to move around on its own like a hummingbird would, such as discerning when to perform an escape maneuver.
Artificial intelligence, combined with flexible flapping wings, also allows the robot to teach itself new tricks. Even though the robot can’t see yet, for example, it senses by touching surfaces. Each touch alters an electrical current, which the researchers realized they could track.
“The robot can essentially create a map without seeing its surroundings. This could be helpful in a situation when the robot might be searching for victims in a dark place – and it means one less sensor to add when we do give the robot the ability to see,” said Xinyan Deng, an associate professor of mechanical engineering at Purdue.
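To illustrate the touch-sensing idea in the quote above, here is a minimal, purely illustrative Python sketch (not the researchers' code): it watches the motor-current feedback for a sudden deviation from a rolling baseline and, when one occurs, records the robot's current position as a point on a map. The function names, window size, and threshold are all assumptions made for the example.

from collections import deque

BASELINE_WINDOW = 50      # samples in the rolling "free flight" current baseline (assumed)
CONTACT_THRESHOLD = 0.15  # deviation in amps treated as a touch (assumed value)

def map_contacts(current_samples, position_samples):
    """Yield estimated obstacle positions whenever the wing-motor current jumps."""
    baseline = deque(maxlen=BASELINE_WINDOW)
    for current, position in zip(current_samples, position_samples):
        if len(baseline) == BASELINE_WINDOW:
            mean = sum(baseline) / BASELINE_WINDOW
            if abs(current - mean) > CONTACT_THRESHOLD:
                yield position   # the wing touched something near this position
                continue         # do not fold contact readings into the baseline
        baseline.append(current)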
The researchers will present their work on May 20 at the 2019 IEEE International Conference on Robotics and Automation in Montreal.
A YouTube video is available at https://www.youtube.com/watch?v=jhl892dHqfA&feature=youtu.be.
Drones can’t be made infinitely smaller, due to the way conventional aerodynamics work. They wouldn’t be able to generate enough lift to support their weight.
But hummingbirds don’t use conventional aerodynamics – and their wings are resilient. “The physics is simply different; the aerodynamics is inherently unsteady, with high angles of attack and high lift. This makes it possible for smaller, flying animals to exist, and also possible for us to scale down flapping wing robots,” Deng said.
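As a rough back-of-envelope illustration of why the physics differs at this scale (the numbers below are assumptions, not figures from the release), the Reynolds number of a hummingbird-sized flapping wing lands in the low thousands, far below the regime where conventional aircraft aerodynamics applies:

# Rough, illustrative estimate; chord and wing speed are assumed values.
rho = 1.225    # air density at sea level, kg/m^3
mu = 1.81e-5   # dynamic viscosity of air, Pa*s
chord = 0.02   # assumed mean wing chord, m (~2 cm)
speed = 5.0    # assumed effective flapping wing speed, m/s

reynolds = rho * speed * chord / mu
print(f"Re ≈ {reynolds:,.0f}")  # on the order of a few thousand, vs. ~10^6 for conventional aircraft wings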
Researchers have been trying for years to decode hummingbird flight so that robots can fly where larger aircraft can’t. In 2011, the company AeroVironment, commissioned by DARPA, an agency within the U.S. Department of Defense, built a robotic hummingbird that was heavier than a real one but not as fast, with helicopter-like flight controls and limited maneuverability. It required a human to be behind a remote control at all times.
Deng’s group and her collaborators studied hummingbirds themselves for multiple summers in Montana. They documented key hummingbird maneuvers, such as making a rapid 180-degree turn, and translated them to computer algorithms that the robot could learn from when hooked up to a simulation.
Further study on the physics of insects and hummingbirds allowed Purdue researchers to build robots smaller than hummingbirds – and even as small as insects – without compromising the way they fly. The smaller the size, the greater the wing flapping frequency, and the more efficiently they fly, Deng says.
The robots have 3D-printed bodies, wings made of carbon fiber and laser-cut membranes. The researchers have built one hummingbird robot weighing 12 grams – the weight of the average adult magnificent hummingbird – and another insect-sized robot weighing 1 gram. The hummingbird robot can lift more than its own weight, up to 27 grams.
Designing their robots with higher lift gives the researchers more wiggle room to eventually add a battery and sensing technology, such as a camera or GPS. Currently, the robot needs to be tethered to an energy source while it flies – but that won’t be for much longer, the researchers say.
The robots can fly as silently as a real hummingbird, making them well suited for covert operations. And they stay steady through turbulence, which the researchers demonstrated by testing the dynamically scaled wings in an oil tank.
The robot requires only two motors and can control each wing independently of the other, which is how flying animals perform highly agile maneuvers in nature.
“An actual hummingbird has multiple groups of muscles to do power and steering strokes, but a robot should be as light as possible, so that you have maximum performance on minimal weight,” Deng said.
Robotic hummingbirds would not only help with search-and-rescue missions but also allow biologists to study hummingbirds more reliably in their natural environment through the senses of a realistic robot.
“We learned from biology to build the robot, and now biological discoveries can happen with extra help from robots,” Deng said.
Simulations of the technology are available open-source at https://github.com/purdue-biorobotics/flappy.
Early stages of the work, including the Montana hummingbird experiments in collaboration with Bret Tobalske’s group at the University of Montana, were financially supported by the National Science Foundation.
This work aligns with Purdue's Giant Leaps celebration, acknowledging the university’s global advancements made in AI, algorithms and automation as part of Purdue’s 150th anniversary. This is one of the four themes of the yearlong celebration’s Ideas Festival, designed to showcase Purdue as an intellectual center solving real-world issues.
Writer: Kayla Wiles, 765-494-2432, wiles5@purdue.edu
Source: Xinyan Deng, 765-494-1513, xdeng@purdue.edu
Note to Journalists: Links to the paper preprints are available in the abstracts. A YouTube video is available at https://www.youtube.com/watch?v=jhl892dHqfA&feature=youtu.be and other multimedia can be found in a Google Drive folder at https://drive.google.com/open?id=1XrFz3MOj_2jotVjVQOmC5upfD8kWIOnF. Video and photos were prepared by Jared Pike, communications specialist for Purdue University’s School of Mechanical Engineering.
ABSTRACTS
Learning Extreme Hummingbird Maneuvers on Flapping Wing Robots
Fan Fei, Zhan Tu, Jian Zhang, and Xinyan Deng
Purdue University, West Lafayette, IN, USA
https://arxiv.org/abs/1902.09626
Biological studies show that hummingbirds can perform extreme aerobatic maneuvers during fast escape. Given a sudden looming visual stimulus at hover, a hummingbird initiates a fast backward translation coupled with a 180-degree yaw turn, which is followed by instant posture stabilization in just under 10 wingbeats. Considering the wingbeat frequency of 40 Hz, this aggressive maneuver is carried out in just 0.2 seconds. Inspired by the hummingbirds' near-maximal performance during such extreme maneuvers, we developed a flight control strategy and experimentally demonstrated that such maneuverability can be achieved by an at-scale 12-gram hummingbird robot equipped with just two actuators. The proposed hybrid control policy combines model-based nonlinear control with model-free reinforcement learning. We use model-based nonlinear control for nominal flight control, as the dynamic model is relatively accurate for these conditions. However, during extreme maneuvers, the modeling error becomes unmanageable. A model-free reinforcement learning policy trained in simulation was optimized to "destabilize" the system and maximize the performance during maneuvering. The hybrid policy manifests a maneuver that is close to that observed in hummingbirds. Direct simulation-to-real transfer is achieved, demonstrating hummingbird-like fast evasive maneuvers on the at-scale hummingbird robot.
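A schematic Python sketch of the hybrid idea described above, purely for illustration (this is not the authors' implementation, and nominal_controller, rl_policy, and the blending weight are placeholders): the model-based controller handles nominal flight, and the learned policy is weighted in only while an escape maneuver is active.

def hybrid_control(state, maneuver_active, nominal_controller, rl_policy, blend=1.0):
    """Return wing-actuation commands for the current state."""
    u_nominal = nominal_controller(state)  # model-based control, accurate near nominal flight
    if not maneuver_active:
        return u_nominal
    u_learned = rl_policy(state)           # policy trained in simulation for extreme maneuvers
    # While the maneuver is active, weight the learned policy heavily; the
    # nominal controller takes over again for post-maneuver stabilization.
    return [(1 - blend) * un + blend * ul for un, ul in zip(u_nominal, u_learned)]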
Acting is Seeing: Navigating Tight Space Using Flapping Wings
Zhan Tu, Fan Fei, Jian Zhang, and Xinyan Deng
Purdue University, West Lafayette, IN, USA
https://arxiv.org/abs/1902.08688
Wings of flying animals not only generate lift and control torques but can also sense their surroundings. Such dual functions of sensing and actuation coupled in one element are particularly useful for small, bio-inspired robotic flyers, whose weight, size, and power are under stringent constraints. In this work, we present the first flapping-wing robot that uses its flapping wings for environmental perception and navigation in tight spaces, without the need for any visual feedback. As the test platform, we introduce the Purdue Hummingbird, a flapping-wing robot with a 17-cm wingspan and 12-gram weight, with a pair of 30-40 Hz flapping wings driven by only two actuators. By interpreting the wing-loading feedback and its variations, the vehicle can detect the presence of environmental changes such as grounds, walls, stairs, obstacles, and wind gusts. The instantaneous wing loading can be obtained through measurement and interpretation of the current feedback from the motors that actuate the wings. The effectiveness of the proposed approach is experimentally demonstrated on several challenging flight tasks without vision: terrain following, wall following, and going through a narrow corridor. To ensure flight stability, a robust controller was designed to handle unforeseen disturbances during flight. Sensing and navigating one's environment through actuator loading is a promising method for mobile robots, and it can serve as an alternative or complementary method to visual perception.
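As a minimal sketch of the navigation idea in this abstract (placeholders throughout; this is not the paper's code), one could treat the deviation between the wing loading estimated from motor current and the loading expected in free flight as a proximity signal, and regulate that deviation to hold a roughly constant height above terrain:

def terrain_following_step(motor_current, thrust_cmd,
                           estimate_loading_from_current,    # assumed helper: current -> loading
                           predicted_free_flight_loading,    # assumed helper: command -> loading
                           target_deviation=0.05, gain=0.5): # assumed setpoint and gain
    """Return an adjusted thrust command from one current-feedback sample."""
    measured = estimate_loading_from_current(motor_current)
    expected = predicted_free_flight_loading(thrust_cmd)
    deviation = measured - expected       # changes as a surface gets close
    error = target_deviation - deviation  # hold a fixed proximity signal
    return thrust_cmd + gain * error      # simple proportional correction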
Flappy Hummingbird: An Open Source Dynamic Simulation of Flapping Wing Robots and Animals
Fan Fei, Zhan Tu, Yilun Yang, Jian Zhang, and Xinyan Deng
Purdue University, West Lafayette, IN, USA
https://arxiv.org/abs/1902.09628
Insects and hummingbirds exhibit extraordinary flight capabilities and can simultaneously master seemingly conflicting goals: stable hovering and aggressive maneuvering, unmatched by small-scale man-made vehicles. Flapping Wing Micro Air Vehicles (FWMAVs) hold great promise for closing this performance gap. However, design and control of such systems remain challenging due to various constraints. Here, we present an open-source, high-fidelity dynamic simulation for FWMAVs to serve as a testbed for the design, optimization and flight control of FWMAVs. For simulation validation, we recreated the hummingbird-scale robot developed in our lab in the simulation. System identification was performed to obtain the model parameters. The force generation, open-loop and closed-loop dynamic response between simulated and experimental flights were compared and validated. The unsteady aerodynamics and the highly nonlinear flight dynamics present challenging control problems for conventional and learning control algorithms such as Reinforcement Learning. The interface of the simulation is fully compatible with the OpenAI Gym environment. As a benchmark study, we present a linear controller for hovering stabilization and a Deep Reinforcement Learning control policy for goal-directed maneuvering. Finally, we demonstrate direct simulation-to-real transfer of both control policies onto the physical robot, further demonstrating the fidelity of the simulation.
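Since the abstract notes the simulator's interface is OpenAI Gym compatible, a typical interaction loop would look like the sketch below; the import and the environment ID are assumptions (check the purdue-biorobotics/flappy README for the actual package and registration names), and the random policy is only a placeholder.

import gym
import flappy  # noqa: F401 -- assumed to register the flapping-wing environments on import

env = gym.make("fwmav_hover-v0")  # assumed environment ID for the hovering task
obs = env.reset()
total_reward = 0.0
for _ in range(1000):
    action = env.action_space.sample()          # placeholder random policy
    obs, reward, done, info = env.step(action)  # classic Gym step signature
    total_reward += reward
    if done:
        break
print("episode return:", total_reward)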