To Fly Solo, Racing Drones Have a Need for AI Speed Training

10 Jun 2019
Drone racing’s ultimate vision of quadcopters weaving nimbly through obstacle courses has attracted far less excitement and investment than the self-driving cars aimed at reshaping ground transportation. But the U.S. military and defense industry are betting on autonomous drone racing as the next frontier for developing AI that can handle high-speed navigation through tight spaces without human involvement.
 
The autonomous drone challenge demands split-second decision-making across six degrees of freedom, compared with a car’s mere two degrees of freedom on the road. One research team developing the AI needed to control autonomous racing drones is the Robotics and Perception Group at the University of Zurich in Switzerland. In late May, the Swiss researchers were announced as one of nine teams contending in the two-year AlphaPilot open innovation challenge sponsored by U.S. aerospace company Lockheed Martin. The winning team will walk away with up to $2.25 million for defeating both the other autonomous racing drones and a professional human drone pilot in head-to-head events.
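
To make that contrast concrete, here is a minimal Python sketch (purely illustrative, not code from any AlphaPilot team) of the state a quadrotor controller must reason about versus a road vehicle’s:

```python
from dataclasses import dataclass

@dataclass
class CarState:
    # A road vehicle is largely confined to a plane: it translates
    # along the road and changes heading.
    x: float          # position (m)
    y: float          # position (m)
    heading: float    # yaw angle (rad)

@dataclass
class QuadrotorState:
    # A racing drone controls all six rigid-body degrees of freedom:
    # three translations plus roll, pitch, and yaw.
    x: float
    y: float
    z: float          # altitude (m)
    roll: float       # rad
    pitch: float      # rad
    yaw: float        # rad

# The onboard AI must re-estimate this full state from sensors many
# times per second while flying through a cluttered course.
state = QuadrotorState(x=0.0, y=0.0, z=2.0, roll=0.0, pitch=0.1, yaw=1.57)
print(state)
```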
 
“I think it is crucial to first point out that having an autonomous drone finish a racing track at high speeds, or even beat a human pilot, does not imply that we can have autonomous drones [capable of] navigating in real-world, complex, unstructured, unknown environments such as disaster zones, collapsed buildings, caves, tunnels or narrow pipes, forests, military scenarios, and so on,” says Davide Scaramuzza, a professor of robotics and perception at the University of Zurich and ETH Zurich. “However, the robust and computationally efficient state estimation, control, and planning algorithms developed for autonomous drone racing would represent a starting point.”
 
The nine teams that made the cut—from a pool of 424 AlphaPilot applicants—will compete in four 2019 racing events organized under the Drone Racing League’s Artificial Intelligence Robotic Racing Circuit, says Keith Lynn, program manager for AlphaPilot at Lockheed Martin. To guarantee an apples-to-apples comparison of each team’s AI secret sauce, each AlphaPilot team will upload its AI code to identical, specially built drones with an NVIDIA Xavier GPU at the core of the onboard computing hardware.
 
“Lockheed Martin is providing mentorship to the nine AlphaPilot teams to assist their AI tech development and innovations,” says Lynn. The company “will be hosting a week-long Developers Summit at MIT in July, devoted to workshopping and improving AlphaPilot teams’ code,” he adds. He notes that each team will keep the intellectual property rights to its AI code.
 
The AlphaPilot challenge takes inspiration from earlier autonomous drone racing events organized by academic researchers, Scaramuzza says. He credits Hyungpil Moon, a professor of robotics and mechanical engineering at Sungkyunkwan University in South Korea, with organizing the annual autonomous drone racing competition at the International Conference on Intelligent Robots and Systems since 2016.
 
It’s no easy task to build and train AI that can fly at high speed through intricate environments using visual navigation alone. One big problem comes from how drones move: they can accelerate sharply, take tight turns, fly sideways, zig-zag, and even perform back flips. That means camera images can suddenly appear tilted or even upside down during flight. Motion blur can occur when a drone flies very close to structures at high speeds, because each camera pixel collects light from multiple directions during a single exposure. Both cameras and vision software can also struggle to compensate for sudden changes between light and dark parts of an environment.
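
The motion-blur issue can be illustrated with a toy NumPy simulation (not the group’s actual pipeline): blur is roughly the average of many slightly shifted copies of the scene collected during one exposure, which smears out the sharp edges that visual trackers rely on.

```python
import numpy as np

def simulate_motion_blur(image: np.ndarray, shift_px: int) -> np.ndarray:
    # Toy model: during one exposure a fast-moving camera sees the scene
    # at several slightly shifted positions; the sensor averages them,
    # smearing edges along the direction of motion.
    frames = [np.roll(image, s, axis=1) for s in range(shift_px)]
    return np.mean(frames, axis=0)

# A sharp vertical edge (0 -> 1 at column 6)...
img = np.zeros((4, 12))
img[:, 6:] = 1.0
blurred = simulate_motion_blur(img, shift_px=4)
print(blurred[0, 5:10])  # [0. 0.25 0.5 0.75 1.] -- the hard edge now ramps
```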
 
To lend AI a helping hand, Scaramuzza’s group recently released a drone racing dataset that contains realistic training data captured from a drone flown by a professional pilot in both indoor and outdoor spaces. The data, which includes complicated aerial maneuvers such as back flips, flight sequences that cover hundreds of meters, and flight speeds of up to 83 kilometers per hour, was presented at the 2019 IEEE International Conference on Robotics and Automation.
 
The drone racing dataset also features data captured by the group’s special bioinspired event cameras, which detect changes in a scene on a per-pixel basis within microseconds. By comparison, conventional cameras need milliseconds (each millisecond being 1,000 microseconds) to capture motion changes across successive image frames. The event cameras have already proven capable of helping drones nimbly dodge soccer balls thrown at them by the Swiss lab’s researchers.
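
Event cameras report an asynchronous stream of per-pixel events rather than full frames; each event is essentially a tuple of (timestamp, x, y, polarity), with polarity indicating whether the pixel got brighter or darker. Here is a minimal sketch (an illustrative format, not a specific camera driver’s API) of accumulating a short window of events into a frame-like image for downstream vision code:

```python
import numpy as np

# Each event: (timestamp_us, x, y, polarity). Polarity is +1 when the
# pixel brightened and -1 when it darkened. (Illustrative format.)
events = [
    (100, 3, 1, +1),
    (105, 4, 1, +1),    # events arrive microseconds apart
    (112, 5, 2, -1),
    (9000, 6, 2, +1),   # a later event, outside the window below
]

def accumulate(events, width=8, height=4, t_start=0, t_end=1000):
    # Sum event polarities per pixel over a short time window, turning
    # the asynchronous stream into a frame-like image.
    img = np.zeros((height, width), dtype=np.int32)
    for t, x, y, polarity in events:
        if t_start <= t < t_end:    # a 1-millisecond window here
            img[y, x] += polarity
    return img

print(accumulate(events))
```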
 
The Swiss group’s work on the racing drone dataset was funded in part by the U.S. Defense Advanced Research Projects Agency (DARPA), which acts as the U.S. military’s special R&D arm for far-out projects. Specifically, the funding came from DARPA’s Fast Lightweight Autonomy program, which envisions small autonomous drones capable of flying at high speeds through cluttered environments without GPS guidance or communication with human pilots.
 
Such speedy drones could serve as military scouts checking out dangerous buildings or alleys. They could also someday help search-and-rescue teams find people trapped in semi-collapsed buildings or lost in the woods. Being able to fly fast without crashing into things also makes a drone more efficient at all kinds of tasks, because it makes the most of limited battery life, Scaramuzza says. After all, most of a drone’s battery is consumed simply staying airborne in hover; flying faster adds comparatively little to the power draw.
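
That battery argument can be made concrete with back-of-the-envelope arithmetic (the numbers below are assumed for illustration, not measurements): if power draw is dominated by staying airborne and rises only slightly with forward speed, range grows almost linearly with speed.

```python
BATTERY_WH = 50.0      # assumed battery capacity (watt-hours)
HOVER_POWER_W = 200.0  # assumed power just to stay airborne (watts)

def range_km(speed_mps: float, extra_w_per_mps: float = 2.0) -> float:
    # If forward flight adds only a little power on top of hovering,
    # flight time shrinks slowly while ground covered per second grows,
    # so total range rises almost linearly with speed.
    power_w = HOVER_POWER_W + extra_w_per_mps * speed_mps
    flight_time_h = BATTERY_WH / power_w
    return speed_mps * 3600 * flight_time_h / 1000

for v in (5, 10, 20):
    print(f"{v:2d} m/s -> {range_km(v):4.1f} km")
```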
 
Even if AI manages to conquer the drone racing obstacle courses, that would be just the end of the beginning of the technology’s development. What challenges remain? Scaramuzza singles out low-visibility conditions involving smoke, dust, fog, rain, snow, fire, and hail as some of the biggest hurdles for vision-based algorithms and AI in complicated real-life environments.
 
“I think we should develop and release datasets containing smoke, dust, fog, rain, fire, etc., if we want to allow autonomous robots to complement human rescuers in saving people’s lives after an earthquake or natural disaster in the future,” Scaramuzza says.



This article was originally posted on Tronserve.com.
