Big Data Use in the Department of Defense

Published: October 14, 2015


Big data adoption is on the rise across the Department of Defense, say speakers at a recent conference sponsored by the Technology Training Corporation.

The release of a new survey by Unisys on the use of big data technologies in the federal government focused a spotlight on the challenges agencies face when trying to grapple with the tsunami of data they possess. The obstacles survey participants identified as hampering adoption of big data technology, namely insufficient computing, storage, and, above all, network transport bandwidth, are valid challenges that we here at Federal Industry Analysis have long listed as problems in need of tackling. However, to see the challenges without also understanding where big data technology is being used is to see only half the picture. The reality is that big data technology is already in use across the federal government, and while some deployments may be isolated, they are becoming more common.

Nowhere is the expanding use of big data technology clearer than in the Department of Defense. The Technology Training Corporation, or TTC, organizes a small but focused conference every year on the use of big data by defense, intelligence, and law enforcement agencies, attended mostly by government personnel seeking to understand how they can leverage big data technology for their agencies. The conference tends to fly under the radar, which is really too bad, because the insights delivered by the speakers can be useful for understanding exactly where the DoD stands when it comes to big data.

So, where is the big data-related activity in DoD? All over the place, actually.

Mark Krzysko, the Deputy Director for Enterprise Information in the Office of the Under Secretary of Defense for Acquisition, Technology, and Logistics (OUSD/AT&L), outlined a program called the Defense Acquisition Visibility Environment (DAVE), which compiles acquisition data from across the DoD. DAVE is in need of upgrading, and AT&L is willing to carve out the $10 million in FY 2016-2017 that it needs to accomplish the task. DAVE is part of a larger initiative at AT&L to implement a framework and platform for big data that enables new acquisition capabilities, coordinates a data-centric shift in policy, procedures, and operations, and supports analysts and components to enable better decision-making.

The Navy is also implementing big data technology on an enterprise level for maritime data exploitation. Chris Page from the Office of Naval Intelligence explained that incoming multi-intelligence (Multi-INT) data about U.S. adversaries is too large, moves too fast, and does not fit into traditional database structures. This situation means the Navy runs the risk of losing its ability to maintain tactical, operational, and strategic advantages over seagoing adversaries. The Navy's answer to this challenge is to implement a big data approach (see Navy's Expeditionary Warfare Data Focused Naval Tactical Cloud) using predictive analytics that are, in turn, forcing cultural changes on the service. As Mr. Page noted, there are architectural challenges to be addressed, but without learning how to share data across the community the Navy will lose the information dominance edge it currently possesses.

Similar concerns occupy the Air Force, which has multiple big data efforts underway. Kenny O'Neal, National Intelligence Coordination Cell Reorganization Lead at the 25th Air Force, focused primarily on work being done to expand the analytical capabilities of the Distributed Common Ground System (DCGS-AF). Mr. O'Neal noted two important developments occurring in addition to the processing of data. The first is the dissemination of a new data handling methodology in the Air Force called C-PAD, which stands for Collection, Processing, Analysis, & Dissemination. C-PAD represents a transition away from the military's traditional PED (Processing, Exploitation, & Dissemination) approach; because the service cannot analyze every piece of information it collects, it must learn to automate its analytical capabilities to "attack the data that needs to be attacked so analysts can focus on what's unknown." The second, and the way ahead, Mr. O'Neal concluded, is to link tactical-level data collection to national databases owned by the Intelligence Community to glean insight. To be truly effective, the tactical and national levels of data collection must be integrated, use common formatting, and be accessible to everyone from national leadership to shooters at the edge.

The examples above illustrate that on the tactical level, as well as in certain business operations like acquisition, big data technology is being used, or at least explored, across the DoD. The fact that certain organizations are already working the problem of linking tactical-level data collection to national-level analysis capabilities suggests the adoption of big data by the DoD (and IC) is already far enough along to begin looking at creating a national big data intelligence picture.