Government Challenges Presented at TTC’s Big Data and Predictive Analytics Symposium

Published: September 27, 2017


Numerous government and industry speakers presented at Technology Training Corporation’s annual big data event, all echoing similar challenges in implementing big data and analytics initiatives across the defense and intelligence sectors.

Each year, the Technology Training Corporation (TTC) holds a conference on public sector big data featuring a mixed bag of government and industry speakers who offer various perspectives on the event’s topic. This year’s theme was “Big Data and Predictive Analytics – For Defense and Government.” The speakers outlined the expanding use of big data within the defense and intelligence communities, particularly with regard to the advanced analytics used to achieve mission objectives. Repeatedly, however, they zeroed in on the obstacles hampering their big data and analytics initiatives. For the purposes of this writing, I’ve categorized them into four main buckets: data management, people, security and technologies.

Data Management

Big data cannot survive on poor data management practices, says Dr. Daniel Gerstein, a policy researcher at the RAND Corporation and former DHS official. Though this may seem simple enough, Gerstein conducted a big data survey within DHS in 2012 and found that a majority of respondents had inconsistent definitions of big data, could not identify a single governance body for big data, and reported that the interface between operators and developers was missing. Mr. Kirk Brustman, Director of the Army Intelligence Information Services, confirmed this, stating that his office must organize various data, platforms and people in order to deliver information to a top secret network. Brustman’s division handles top secret systems and incoming intelligence data from coalition partners, as well as secret-level data from the tactical Army; he gave the example that time is represented in 47 different ways across the data feeding their visualization tools. He is currently running pilots of cloud and data visualization tools but has a long way to go before the Army’s 10 different networks that pass and receive information can do so smoothly using cross-domain solutions.
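Brustman’s time-format example captures the kind of low-level normalization work that sits beneath these data management challenges. As a rough, hypothetical illustration (not drawn from the Army’s actual systems), the Python sketch below maps a handful of timestamp conventions onto a single UTC representation before the data ever reaches a visualization tool; the formats and function names are assumptions for illustration only.

```python
from datetime import datetime, timezone

# A few of the many timestamp conventions that can show up across feeds.
# These formats are illustrative, not the 47 variants Brustman described.
KNOWN_FORMATS = [
    "%Y-%m-%dT%H:%M:%SZ",   # ISO 8601 with a literal trailing Z
    "%d %b %Y %H:%M",       # e.g. "27 Sep 2017 14:30"
    "%m/%d/%Y %H:%M:%S",    # US-style date
    "%Y%m%d%H%M%S",         # compact, DTG-like string
]

def normalize_timestamp(raw: str) -> datetime:
    """Try each known format and return a single timezone-aware UTC datetime."""
    for fmt in KNOWN_FORMATS:
        try:
            return datetime.strptime(raw, fmt).replace(tzinfo=timezone.utc)
        except ValueError:
            continue
    raise ValueError(f"Unrecognized timestamp format: {raw!r}")

if __name__ == "__main__":
    samples = ["2017-09-27T14:30:00Z", "27 Sep 2017 14:30", "09/27/2017 14:30:00"]
    for s in samples:
        print(s, "->", normalize_timestamp(s).isoformat())
```

A real pipeline would also have to reconcile time zones, precision and source labeling, which is exactly the kind of governance work Gerstein’s survey found missing.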

People

Finding training to build up data science skill sets was an urgent need raised by practically every government presenter. Ms. Lisa Shaler, Deputy Chief, Program Budget Data Management Division, Army G8, Program Analysis and Evaluation (PA&E), stated that the skill sets the Army wants for big data work are a mix of expertise in operations/intelligence, data processing, computer science, statistical analysis, mathematical statistics and machine learning, plus some people with a little of each combined. To that end, Shaler described a Functional Area 49 training group within her team that has been receiving help from industry partners such as Digital Globe, AWS and IBM. PA&E has been running a data science pilot with AWS since March 2016, using data science capabilities to marry big data with “spreadsheet data” in ways that do not scare off cyber capabilities, and is ramping up to a full transition from on-premise infrastructure to AWS.

Security

As with anything critical, big data initiatives must have an Authority to Operate (ATO), explains Mr. Kevin Garrison, Chief of Analytics from the DOD CIO. The newer the technology (e.g., artificial intelligence), the higher the risk and the greater the ATO challenge. On average, it takes his office two years to get a project running. His solution? Baby steps. Allow a project to “crawl, crawl, crawl, walk and then run.” He has found success by first running a short round to test the security waters before committing to a longer-term route, then working simultaneously with multiple clients. He also cautioned that when sensitive and non-sensitive data are mixed together, the result all too quickly becomes sensitive data. Moreover, a risk assessment team must be in place to surface any “red flags” in the project.

Technologies

One of the general problems those in defense and intelligence face is the sheer number of big data technologies on offer. Speakers noted that every vendor claims its tools do everything. They emphasized the need to first understand which technology actually fits the business models within government divisions. Additionally, the technologies on offer often carry a training tail needed to grow the pool of people who know how to use them. Finally, the technologies must be able to integrate not only with different systems, but likely with legacy ones as well. Shaler offered an example of such difficulty from the Army: her office needed to migrate to a new data center and had a choice between a DISA center, a commercial cloud, or one of four enduring centers. Her office ultimately chose the AWS cloud. The move disrupted support for their analyst customers, who had to be forced onto Windows 10, making the process more disruptive than originally anticipated.

Contractor Help

All in all, while industry help can only go so far with some of these difficulties, Shaler offered a list of points for industry to consider when moving forward with the federal government on big data initiatives:

  • Simplify operations – for instance, “serverless” operations (see the sketch after this list)
  • Minimize vendor lock-in
  • Disaggregate services to leverage government security models
  • Offer training
  • Solve new problems for them
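On the “serverless” point, a minimal sketch of what simplified operations can look like is shown below: an AWS Lambda-style Python handler that processes incoming records on demand, so the program office runs no servers of its own. The event shape, field names and filtering logic are assumptions made for illustration, not anything described at the symposium.

```python
import json

def handler(event, context):
    """Hypothetical Lambda-style entry point: the cloud provider invokes this
    function per event, so there are no servers for the program to patch or scale."""
    records = event.get("records", [])                 # illustrative event shape
    kept = [r for r in records if r.get("timestamp")]  # drop records missing a timestamp
    # A real function would hand the kept records to downstream storage or analytics here.
    return {
        "statusCode": 200,
        "body": json.dumps({"received": len(records), "kept": len(kept)}),
    }

if __name__ == "__main__":
    # Local smoke test with a fake event payload.
    print(handler({"records": [{"timestamp": "2017-09-27T14:30:00Z"}, {}]}, None))
```

The operational appeal is that provisioning, patching and scaling shift to the cloud provider, which is what “simplify operations” tends to mean in practice.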