AI Enters the Front Lines of National Defense and Security

July 29, 2019

Artificial intelligence is now an essential component of defense and security strategies — from combat systems to operational processes.

When it comes to protecting the nation and its allies, the United States is armed with some of the world’s most sophisticated defense and security systems. Increasingly, artificial intelligence is at the heart of these systems.

For decades, the United States Department of Defense (DoD) has pioneered innovative uses for AI in defense and security, from assessing the readiness of military vehicles to identifying insurgent targets. Today, these efforts have shifted into high gear under a U.S. strategic initiative focused on harnessing AI to advance the security and prosperity of the nation.

As for that initiative, a 2018 DoD summary of the nation’s AI strategy gets right to the point: “We will harness the potential of AI to transform all functions of the Department positively, thereby supporting and protecting U.S. servicemembers, safeguarding U.S. citizens, defending allies and partners, and improving the affordability, effectiveness, and speed of our operations.”[1]

With those goals in mind, the use cases for AI in defense and national security are virtually unlimited. AI can be embedded into weapons and surveillance systems to enhance performance. It can be used to improve target recognition, combat simulation and training, and threat monitoring. It can be used in logistics and transportation systems, to help the military get the right equipment and people to the right places at the right time.

Let’s look at some of the specific use cases for AI in defense and security.

Combat systems

Military systems equipped with AI can handle larger volumes of data, and do so more efficiently, than conventional systems. AI also enables advanced computing and decision‑making capabilities that help military commanders improve the control, regulation and actuation of combat systems.

In an example of the importance of AI on the battlefield, the consulting firm KPMG notes that a defense agency could have just 8–10 minutes to determine whether a missile launch represents a threat, share the findings with allies, and decide how to respond. It takes AI to rapidly integrate real-time data from satellites and sensors and present findings immediately, helping commanders decide what actions to take.[2]
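The time-critical decision loop described above can be illustrated with a toy sensor-fusion sketch. Everything here is invented for illustration: the sensor names, the velocity heuristic (objects near ballistic speeds score high), and the 0.7 alert threshold are assumptions, not drawn from any real system.

```python
from dataclasses import dataclass

@dataclass
class SensorReport:
    source: str          # e.g. "satellite" or "ground_radar"
    velocity_kms: float  # observed object speed, km/s
    confidence: float    # the sensor's own confidence, 0..1

def fuse_threat_score(reports):
    """Confidence-weighted threat score in [0, 1].

    Objects approaching ballistic speeds (~7 km/s) score near 1;
    slower objects (aircraft, debris) score lower. The heuristic
    and all numbers are purely illustrative.
    """
    total_weight = sum(r.confidence for r in reports)
    weighted = sum(min(r.velocity_kms / 7.0, 1.0) * r.confidence
                   for r in reports)
    return weighted / total_weight

# Two independent sensors report on the same object.
reports = [
    SensorReport("satellite", 6.5, 0.9),
    SensorReport("ground_radar", 7.2, 0.8),
]
score = fuse_threat_score(reports)
action = "alert allies" if score > 0.7 else "continue monitoring"
```

The point is not the arithmetic but the shape of the problem: several noisy inputs must be fused into one actionable recommendation within minutes, which is exactly where automated pipelines outpace manual analysis.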

Video surveillance and image analysis

When it comes to analyzing video and images captured by surveillance systems and aerial vehicles, AI is a huge advantage. For example, algorithms can be trained to recognize terrorist activity evident in streams of video, just as they can be trained to recognize cats in datasets filled with all kinds of images.[3]
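As a rough illustration of that idea, training a classifier on labeled examples and then applying it to new data, here is a deliberately tiny nearest-centroid classifier in plain Python. The "images" are four-pixel brightness vectors and every label and value is synthetic; real imagery analysis uses deep neural networks on far richer data.

```python
# Toy nearest-centroid classifier: the same supervised-learning recipe
# used for "cat vs. not-cat" applies to any labeled imagery.

def centroid(vectors):
    """Average a list of equal-length pixel vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def classify(pixels, centroids):
    """Assign a pixel vector to the class with the nearest centroid."""
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: sq_dist(pixels, centroids[label]))

# "Training": average the labeled examples into one centroid per class.
training = {
    "vehicle":    [[0.9, 0.8, 0.9, 0.7], [0.8, 0.9, 0.8, 0.8]],
    "background": [[0.1, 0.2, 0.1, 0.2], [0.2, 0.1, 0.2, 0.1]],
}
centroids = {label: centroid(vecs) for label, vecs in training.items()}

# "Inference": label a new frame by its nearest class centroid.
label = classify([0.85, 0.8, 0.9, 0.75], centroids)  # -> "vehicle"
```

Scaled up from four pixels to megapixel frames and from centroids to convolutional networks, this train-then-infer pattern is the core of automated video analysis.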

“AI applied to perception tasks such as imagery analysis can extract useful information from raw data and equip leaders with increased situational awareness,” the DoD notes in its AI strategy report. “AI can generate and help commanders explore new options so that they can select courses of action that best achieve mission outcomes, minimizing risks to both deployed forces and civilians.”[4]

Cybersecurity

Cyber-warfare will clearly be one of the battlefields of the future. AI can help military organizations combat the threat of cyber-attacks, which can now be launched from virtually anywhere in the world.

A few examples of how AI is being deployed in this new digital battlefield:

  • The U.S. Department of Homeland Security has piloted AI tools for detecting cyber-network intrusions and malicious activities.[5]
  • The DoD has a project under way to develop an algorithm that detects and deters cyber-attackers targeting DoD information systems.[6]
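A minimal sketch of the statistical idea behind such intrusion detectors is flagging traffic that deviates sharply from a learned baseline. The z-score rule, the two-sigma threshold, and the traffic numbers below are illustrative stand-ins, not descriptions of the actual DHS or DoD models.

```python
import statistics

def flag_anomalies(request_counts, threshold=2.0):
    """Return indexes of time windows whose request volume deviates
    more than `threshold` standard deviations from the mean -- a
    simple statistical stand-in for ML-based intrusion detection."""
    mean = statistics.mean(request_counts)
    stdev = statistics.stdev(request_counts)
    return [i for i, count in enumerate(request_counts)
            if abs(count - mean) / stdev > threshold]

# Hourly request counts to a server; hour 5 is an unusual burst.
traffic = [102, 98, 110, 95, 105, 900, 101, 99]
suspicious = flag_anomalies(traffic)  # -> [5]
```

Production systems replace the z-score with learned models of normal behavior, but the principle is the same: establish a baseline, then surface whatever falls outside it.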

Predictive maintenance

Keeping equipment battlefield-ready is a huge challenge for national defense agencies. AI can help with this labor-intensive work.

A case in point: The DoD plans to use AI to predict the failure of critical parts, automate diagnostics, and plan maintenance based on data and equipment condition. Similar technology will be used to guide the provisioning of spare parts and optimize inventory levels. The department says these advances will ensure appropriate inventory levels, assist in troubleshooting, and enable more rapidly deployable and adaptable forces at reduced cost.[7]
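The core idea, extrapolating a degradation trend from sensor data so maintenance can be scheduled before a part fails, can be sketched in a few lines. The vibration readings and the 9.0 mm/s failure threshold below are hypothetical, not taken from any DoD program.

```python
def fit_trend(readings):
    """Least-squares slope and intercept for equally spaced readings."""
    n = len(readings)
    x_mean = (n - 1) / 2
    y_mean = sum(readings) / n
    num = sum((x - x_mean) * (y - y_mean) for x, y in enumerate(readings))
    den = sum((x - x_mean) ** 2 for x in range(n))
    slope = num / den
    return slope, y_mean - slope * x_mean

def hours_until_failure(history, failure_level):
    """Extrapolate the wear trend forward to the failure threshold."""
    slope, intercept = fit_trend(history)
    if slope <= 0:
        return None  # no degradation trend; nothing to schedule
    crossing = (failure_level - intercept) / slope  # x where trend hits limit
    return crossing - (len(history) - 1)            # hours past the last sample

# Hypothetical hourly vibration readings (mm/s) from an engine bearing;
# assume the (invented) maintenance manual calls 9.0 mm/s imminent failure.
history = [2.0, 2.5, 3.1, 3.4, 4.1, 4.4, 5.0]
eta = hours_until_failure(history, failure_level=9.0)  # roughly 8 hours out
```

Real predictive-maintenance systems use richer models and many sensor channels, but the payoff is the same: an estimated time-to-failure that lets maintainers act before the breakdown, not after.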

Service automation

Defense and security agencies increasingly rely on AI-driven applications to automate services, streamline business processes, cut the time spent on repetitive tasks, and reduce the chances for human errors.

For example:

  • In the recruiting process, the U.S. Army uses an AI-driven interactive virtual assistant to answer questions from prospective recruits, check users’ qualifications, and refer prospects to human recruiters. In a report on AI-augmented government, Deloitte says that the interactive virtual assistant does the work of 55 recruiters, with an accuracy rate of more than 94 percent.[8]
  • The Defense Advanced Research Projects Agency developed a digital tutor for computer skills training. After 16 weeks of training with the tutor, Navy students who had no prior IT experience scored higher in tests of IT knowledge and job-sample troubleshooting than others who received 35 weeks of classroom instruction.[9]
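A bare-bones sketch of the question-answering pattern behind such assistants: match a user's question against a keyword-indexed FAQ and return the best-matching answer. The questions, keywords, and canned answers here are all invented for illustration; the Army's actual assistant uses far more sophisticated natural-language models.

```python
import re

def faq_bot(question, faq):
    """Return the FAQ answer whose keywords overlap most with the
    question -- a bare-bones stand-in for a chatbot's NLP layer."""
    words = set(re.findall(r"[a-z]+", question.lower()))
    best = max(faq, key=lambda keywords: len(words & set(keywords)))
    return faq[best]

# Hypothetical knowledge base: keyword tuples mapped to canned answers.
faq = {
    ("age", "old", "enlist"): "Ask a recruiter about age requirements.",
    ("basic", "training", "long"): "Basic training lasts about 10 weeks.",
    ("human", "recruiter", "talk"): "Connecting you with a recruiter...",
}
answer = faq_bot("How long is basic training?", faq)
```

Even this crude keyword overlap captures the division of labor the Army example describes: the bot fields routine questions instantly and hands off to a human recruiter only when needed.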

Infrastructure considerations

As AI makes ever-deeper inroads in defense and security applications, machine learning algorithms and technologies continue to develop and increase in scope. All of this puts new demands on the computing infrastructure that powers the AI-driven solutions. At the same time, it creates the need for infrastructure that can be adapted to the unique and changing requirements of defense and security use cases.

In this rapidly growing field, reprogrammable FPGAs give defense and security agencies a new class of IT armaments. With their flexible programming model, FPGAs allow for the continual implementation of the newest algorithms and neural network topologies.

Intel offers these examples of defense and security applications that are ideally suited for FPGAs.[10]

  • Radar and sensors — Radar has been a foundational technology area in which the semiconductor industry has played a large role for the last two decades. FPGAs can help system designers meet requirements for high-performance data processing, ultra-wide bandwidth, high dynamic range, and adaptive systems.
  • Electronic warfare — In electronic warfare systems, key drivers for continuous enhancements are electronic counter-counter-measures, stealth technologies, closely interlinked smart sensor networks, and intelligent guided weapons. Systems must be able to rapidly analyze and respond to multiple threats in extremely short timeframes. FPGAs can help meet these challenges.
  • Security — Strong encryption and authentication are keys to ensuring communications and data security at ever increasing data throughput rates. Strong cryptographic algorithms implemented on FPGAs that are secure by design provide the foundation for trusted information assurance systems.

Key takeaways

AI is now an essential technology for military and defense agencies across the wide range of their operations. It helps protect the safety and security of our Armed Forces, our citizens and our allies, and it increases the efficiency and effectiveness of military operations, both at home and abroad.

Ultimately, to be the best at what they do, our defense and security agencies need the real-time insights and intelligent automation that are possible only with AI-driven systems.

To learn more

For a look at leading-edge systems and solutions to power AI applications, including new Ready Solutions for AI – Deep Learning with Intel, visit dellemc.com/ai. Learn more about Dell EMC High Performance Computing and AI solutions at dellemc.com/hpc.

 


[1] U.S. Department of Defense, “Summary of the 2018 Department of Defense Artificial Intelligence Strategy.”

[2] KPMG, “Artificial intelligence in Defense,” accessed July 15, 2019.

[3] Congressional Research Service, “Artificial Intelligence and National Security,” January 30, 2019.

[4] U.S. Department of Defense, “Summary of the 2018 Department of Defense Artificial Intelligence Strategy.”

[5] Washington Technology, “More signs pointing to AI’s growth in the federal market,” May 15, 2018.

[6] Dell Technologies, “Pentagon Projects Move Military into AI Arena,” April 25, 2019.

[7] U.S. Department of Defense, “Summary of the 2018 Department of Defense Artificial Intelligence Strategy.”

[8] Deloitte, “AI-augmented government,” 2017.

[9] Defense Technical Information Center, “Accelerating Development of Expertise: A Digital Tutor for Navy Technical Training,” November 1, 2014.

[10] Intel, Military, Aerospace, and Government website, accessed July 17, 2019.

 

 
