Big Data in the Defense Intelligence Community: More Questions than Answers

Published: March 20, 2013

Big Data, Cloud Computing, Defense, Intelligence

Speakers at the recent AFCEA DC Big Data Intel Panel revealed that the Defense Intelligence Community (IC) faces tremendous challenges in developing requirements for its advanced analytics (big data) needs. Could this complicate life for vendors seeking to do business with the IC?

Agencies across government are struggling these days to leverage rapidly evolving technologies and approaches like advanced data analytics and cloud computing. Introducing these technologies and approaches into an IT enterprise that is not ready for them can be disruptive. This much is known. Less well understood is that even planning for big data and cloud investments can be disruptive, because of the requirements development those investments demand.
 
Take, for example, the efforts of the Defense Intelligence Community to stand up the Intelligence Community IT Enterprise (ICITE), a platform for sharing information across agency clouds and organizational boundaries. The challenges facing the IC in this area were the subject of a recent panel on Big Data Analytics hosted by the DC chapter of the Armed Forces Communications & Electronics Association (AFCEA). The panel brought together three speakers to offer insight into what is happening at their respective agencies: Keith Barber, Director of the NSG Expeditionary Architecture Program Office at the National Geospatial-Intelligence Agency (NGA); Agustin “Gus” Taveras, Jr., CTO in the Directorate for IT Management at the Defense Intelligence Agency (DIA); and John Marshall, CTO in the Intelligence Directorate of the Joint Chiefs of Staff.
 
The panel’s discussion swirled loosely around the challenges the IC faces in sharing data and employing new technologies. It is worth remembering that these agencies are out in front in adopting big data tools and cloud computing, so their experience can prove valuable for understanding where other federal agencies are likely to encounter roadblocks. Mr. Barber began the discussion by noting that even the IC struggles to keep up with rapid technological evolution. Citing a recent article in the Harvard Business Review, Barber said that government must get a better handle on where the “big data economy” is headed so that it can leverage private sector developments. Most important is simply knowing where to start. As Barber sees it, the IC needs to ask the right questions to get the right answers: Which data sets do we go after? What tools do we need? How do we best share data? Data sharing is an issue Mr. Barber believes will preoccupy the IC for years to come.
 
DIA CTO Gus Taveras agreed that data sharing is a critical piece of the evolving IC big data environment, and he suggested that the ICITE program is the answer. In essence, ICITE is the IC’s version of “ruthless standardization,” forcing the intel agencies onto a common enterprise framework. Taveras noted, with some irony, that sequestration has helped accelerate the push toward ICITE. The biggest challenges Taveras sees are in procurement and requirements development. Here he returned to the concept of asking the right questions: How do we pay industry for services? Is a metering model best, or is there some other way to do it? As an aside, it was surreal to hear that even now, well into the public sector’s adoption of cloud computing, there is confusion about the best payment and contracting model.
 
Then there is the issue of requirements development. Taveras explained that, as CTO, the hardest part of big data analytics is understanding what analysts and other customers need. Determining requirements is complicated by the fact that no “one tool fits all” solution exists. In some cases, analysts may be happy with the capabilities they currently have but want enhancements. Enhancing an existing tool would be less expensive than buying an entirely new solution, yet understanding how such enhancements are acquired is itself a challenge. Underlying Mr. Taveras’ comments was a sense that analytics tools are evolving so rapidly that his personnel do not know what they can use.
 
Then there is the question of contracting. How does one contract for new capabilities when the requirements development process does not function effectively? How, indeed? This admission by Mr. Taveras raised the twin red flags of scope creep and shifting requirements that have plagued government contracts for decades. And if previous generations of contracted efforts struggled with these challenges, imagine how much more daunting they could become as big data and cloud computing solutions grow in complexity and variety. Caveat venditor!