Artificial intelligence and machine learning for unmanned vehicles
Military experts are developing new enabling technologies to help unmanned aircraft, ground vehicles, submarines, and surface vessels swarm and make decisions without human intervention.

What was once the realm of science fiction writers is becoming reality as unmanned vehicles are given more capability for autonomous decision-making, thanks to improvements in artificial intelligence and machine learning.

First, unmanned aerial vehicles (UAVs) took to the skies. In fact, the British military developed the first radio-controlled unmanned aircraft during World War I, a scant 14 years after the Wright Brothers’ first flight in 1903. UAVs really came into their own during the Vietnam War and have become even more prevalent and essential since then.

As UAVs proved themselves ever more valuable over the years, the Department of Defense (DOD) asked industry experts to bring the unmanned revolution down to Earth in the form of unmanned ground vehicles (UGVs) and even to the seas with unmanned underwater vehicles (UUVs). Though each domain had unique problems to solve, the difficulty of communicating from the surface with UUVs meant a greater need for machine autonomy, the ability to make decisions without direct human input, made possible with artificial intelligence (AI) and machine learning (ML). Even though established communications methods allow for relatively easy contact with UGVs and UAVs, the prospect of “smart” unmanned systems on the ground and in the sky also is helping drive the autonomous revolution.

Operating unmanned systems underwater, on the ground, and in the skies with full autonomy requires a lot of processing power. Mike Southworth, senior product manager at the Curtiss-Wright Defense Solutions division in Salt Lake City, notes that tying together all the different technologies needed to make a vehicle autonomous is a major driver of the need for big computing power.

“Analyzing large amounts of data to complete complex deep-learning algorithms can require significant processing capabilities. Think of the number of devices, such as cameras and sensors, that are required for an autonomous vehicle to safely drive without human intervention, for example,” Southworth says. “The more devices there are collecting data, the higher the competition for bandwidth. That’s why so much of today’s data processing for AI takes place within traditional computer data centers.

“Relying on centralized processing exclusively at the data center has inherent limitations, however, including bandwidth, security, and availability,” Southworth continues. “Consider the paradigm where all of the sensor data is uploaded to a centralized data center for processing, and then edge computers wait for a response from the data center to execute a command. If relying on the cloud alone, there could be a tangible delay, or latency, between when an application issues an instruction and when it receives a response. Autonomous vehicles may need real-time vision and perception for safe navigation, path planning, or active protection. Imagine the consequences in a battlefield scenario where an incoming threat is detected, but there’s a measurable network delay before any countermeasures can be taken. Lives may be lost while threats are not eliminated. High-performance embedded computing integrated onto vehicle platforms can potentially overcome those obstacles and enable deep learning on the battlefield.”
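Southworth’s latency argument can be made concrete with a back-of-the-envelope comparison of a cloud round trip against on-vehicle inference. The sketch below is illustrative only: the millisecond figures, the 20 Hz frame budget, and the function name are assumptions for demonstration, not measurements from any system mentioned in this article.

```python
# Illustrative latency-budget comparison: cloud round trip vs. on-board edge inference.
# All timing figures are hypothetical placeholders, not measured values.

CLOUD_UPLINK_MS = 40.0      # assumed time to push one frame of sensor data to a data center
CLOUD_INFERENCE_MS = 15.0   # assumed model runtime in the data center
CLOUD_DOWNLINK_MS = 40.0    # assumed time for the resulting command to reach the vehicle

EDGE_INFERENCE_MS = 25.0    # assumed model runtime on a vehicle-mounted embedded GPU

FRAME_BUDGET_MS = 50.0      # a 20 Hz perception loop leaves 50 ms per frame


def meets_budget(total_latency_ms: float, budget_ms: float = FRAME_BUDGET_MS) -> bool:
    """Return True if end-to-end latency fits inside the per-frame budget."""
    return total_latency_ms <= budget_ms


cloud_total = CLOUD_UPLINK_MS + CLOUD_INFERENCE_MS + CLOUD_DOWNLINK_MS
edge_total = EDGE_INFERENCE_MS

print(f"Cloud round trip: {cloud_total:.0f} ms, fits 20 Hz loop: {meets_budget(cloud_total)}")
print(f"On-board edge:    {edge_total:.0f} ms, fits 20 Hz loop: {meets_budget(edge_total)}")
```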
When it comes to unmanned systems, no matter the domain, size matters. With the need for many integrated technologies to fit into something with a finite footprint, high-performance embedded computing (HPEC) helps make it all possible.

“As an industry, we are always trying to get more out of the systems we develop. HPEC, when coupled with AI capabilities, brings forth a powerful computing solution in a SWaP [size, weight, and power]-optimized form factor,” says Valerie Andrew, who manages strategic marketing and communication for embedded computing specialist Elma Electronic Inc. in Fremont, Calif. “With the large number of data points in use within any given embedded system, having the high-performance infrastructure to manage and process that information, which in turn fuels AI and deep learning, gives unmanned systems access to on-demand intelligence that can be used for operational activities.”

Andrew, who also is the Sensor Open Systems Architecture Consortium Business Working Group outreach lead, notes that the DOD’s move toward interoperability through a common modular open systems architecture (MOSA) can remove barriers to development, even in unmanned systems built around AI.

“Advanced learning technologies will become even more critical, not just, ‘do these systems work together,’” Andrew says. “This will be exponential in terms of system functionality, where more rapid development and deployment of unmanned systems will start to take place. For the DOD, this means having access to best-in-class technologies quickly and affordably. For the warfighter, it means increased knowledge and decision-making abilities during a mission.”

The benefits of using HPEC in independent or partially independent embedded supercomputers are twofold, says Dan Mor, director of the video and general-purpose graphics processing unit (GPGPU) product line at Aitech Defense Systems Inc. in Chatsworth, Calif.

“HPEC systems enable far more functionality to be housed in a smaller framework, so that processing of larger datasets can happen closer to the sensor, where it is needed most,” Mor says. “And ruggedizing these HPEC systems means computing power can be used in a wider number of remote and mobile locations. As these processing demands increase, SFF [small-form-factor] systems using GPGPU technology and AI-based solutions are providing a path to next-generation embedded systems, poised to tackle the growing field of mobile, unmanned, and autonomous vehicle technologies, bringing computing power to areas never before conceivable.”

Small form factor

Aneesh Kothari, vice president of marketing at Systel Inc. in Sugar Land, Texas, says that unmanned systems can take advantage of on-board AI and machine learning to reduce liabilities brought on by the realities of operating in a contested environment.

“High-performance embedded edge computing is critical to deployed AI mission success. Operating in a contested environment with restricted bandwidth and degraded communications makes the tactical use of cloud-based computing and AI a liability. Computational processing capability must reside on-premise to ensure the low latency and near real-time speed demanded of AI-based applications. Advances in COTS (commercial off-the-shelf) technologies in recent years have allowed for practical embedded edge computing use in unmanned vehicles,” Kothari says. “Systel’s rugged embedded systems such as Kite-Strike and Raven-Strike integrate commercial hardware components such as video capture cards and encoders, and the latest NVIDIA Ampere-based and Jetson Xavier GPUs, in small-form-factor rugged embedded computers, making them ideal for use in unmanned vehicles.”
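The on-board processing that Mor and Kothari describe generally means running a trained deep-learning model directly on an embedded GPU instead of in a remote data center. Below is a minimal sketch of that idea, assuming PyTorch and torchvision are installed; the model, input size, and names are placeholders and do not come from Systel, Aitech, or any other vendor’s software.

```python
# Minimal sketch of on-board ("edge") deep-learning inference on an embedded GPU.
# Model choice and input shape are placeholders for illustration only.
import torch
import torchvision.models as models

# Use the on-board CUDA GPU if one is present; otherwise fall back to the CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Load a small pretrained classifier and prepare it for inference.
model = models.mobilenet_v3_small(weights="DEFAULT").to(device).eval()

# Stand-in for one camera frame, already resized and normalized to the model's input.
frame = torch.rand(1, 3, 224, 224, device=device)

with torch.no_grad():
    scores = model(frame)          # inference happens entirely on the vehicle
    top_class = scores.argmax(1)   # a decision is available without any network hop

print(f"Predicted class index: {top_class.item()} (computed on {device})")
```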
With high-powered embedded computers providing the processing power needed to run the “smarts” of the AI and machine learning technology keeping UAVs, UUVs, and UGVs on the move, there is a need to keep everything in the vehicle cool.

Mercury Systems in Andover, Mass., provides several processing solutions for the military-aerospace sector. Karen Haigh, the company’s chief technologist, says that cooling is key for unmanned systems.

“Higher, bigger processing power is needed for these unmanned systems, so when you compute, you generate heat. So the better you can control the heat through a cooling system, the more you can get done in a smaller space. As things get smaller, it gets hotter in a more compact space, so you need to be able to cool it more effectively. And if you’ve got a system where you can’t stick a fan on the back of the computer to keep it from getting hot, you need to be able to do these things in these tiny spaces under high difficulty.

“So, at the bottom of the sea bed, you’re not going to be wanting to have anywhere near the same ki…
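Haigh’s point about heat in compact enclosures is one reason deployed embedded computers are typically monitored for thermal throttling. As a rough illustration, assuming a Linux-based embedded module that exposes the standard sysfs thermal interface, a simple temperature poller might look like the sketch below; the 80 C warning threshold is an arbitrary assumption, not a vendor specification.

```python
# Rough sketch: poll Linux sysfs thermal zones on an embedded module and flag
# zones approaching an assumed throttle threshold. Paths vary by platform.
from pathlib import Path

THROTTLE_WARN_C = 80.0  # assumed warning threshold, not a vendor-specified limit


def read_thermal_zones():
    """Yield (zone_name, temperature_in_celsius) for each sysfs thermal zone."""
    for zone in sorted(Path("/sys/class/thermal").glob("thermal_zone*")):
        try:
            name = (zone / "type").read_text().strip()
            temp_c = int((zone / "temp").read_text()) / 1000.0  # reported in millidegrees
            yield name, temp_c
        except (OSError, ValueError):
            continue  # skip zones that cannot be read


if __name__ == "__main__":
    for name, temp_c in read_thermal_zones():
        flag = "  <-- near throttle limit" if temp_c >= THROTTLE_WARN_C else ""
        print(f"{name:20s} {temp_c:5.1f} C{flag}")
```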
Apr 26th, 2021