A Sound Future for Big Data in Government

Published: April 05, 2017

Big Data | Information Technology

Based on insights from government and industry experts, successful data initiatives, and upcoming legislation, the future looks bright for big data in the federal sector.

The realm of big data is as expansive as its name suggests. It spans areas such as advanced analytics, distributed computing, data warehouse support, and high-performance computing, and it can be applied to sectors ranging from healthcare and human resources to fraud prevention and national security.

A precise definition of the term “big data” has been difficult to pin down; however, I’ll take a stab at one: massive volumes of data applied to problem-solving situations in ways that lead to qualitative answers and solutions.

The collection of massive amounts of data is not new to the government space. From decennial census information under Commerce to satellite data at NASA and tax information under Treasury, there is no argument that large data repositories exist across federal agencies. Nonetheless, applying that data toward agency missions and functions, as well as toward use by the American public, is now becoming a larger phenomenon within the public sector.

And why shouldn’t it?

According to a panel of government and industry analysts gathered for a Federal Executive Forum (FEF) hosted by Federal News Radio, the government has benefited from a number of successful big data initiatives.

Big Data Projects

Alan Ford, Director of Government Systems at Teradata, described work being done within the Centers for Medicare &amp; Medicaid Services (CMS) and its high-volume enterprise data warehouse, the Integrated Data Repository (IDR). The warehouse contains 600 terabytes of data, including provider, risk score, beneficiary, claims, Medicare, and contractor information, all integrated to detect fraud. According to Ford, CMS has identified $3.3 billion in fraud every year for the past three years, marking an $8 savings for every $1 spent on the big data program.
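
To make the integration step concrete, here is a minimal sketch of the kind of join-and-flag analysis such a warehouse enables. It is not CMS’s actual IDR pipeline; the tables, column names, and flagging rule below are hypothetical.

    import pandas as pd

    # Hypothetical tables and column names, for illustration only -- not the IDR schema.
    claims = pd.DataFrame({
        "claim_id": [1, 2, 3, 4],
        "provider_id": ["P1", "P1", "P2", "P3"],
        "amount": [120.0, 95.0, 18000.0, 240.0],
    })
    providers = pd.DataFrame({
        "provider_id": ["P1", "P2", "P3"],
        "risk_score": [0.2, 0.9, 0.4],
    })

    # Integrate claims with provider risk scores, then flag unusually large claims
    # from high-risk providers for analyst review.
    merged = claims.merge(providers, on="provider_id")

    # Simple illustrative rule: a claim from a high-risk provider that exceeds
    # ten times the median claim amount gets flagged.
    threshold = 10 * merged["amount"].median()
    flagged = merged[(merged["risk_score"] > 0.8) & (merged["amount"] > threshold)]
    print(flagged[["claim_id", "provider_id", "amount", "risk_score"]])

In a real program, the flagged claims would feed an investigator queue rather than a print statement; the point is simply that value comes from joining the sources, not from any one of them alone.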

NASA CIO Renee Wynn stated that big data in aeronautics is used within the agency to save lives. For instance, sensor data is synthesized through application programming interfaces (APIs) to engage automated systems when pilots lose consciousness under high G-forces, allowing the jet to be taken over automatically.
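
As a rough illustration of that idea, the sketch below engages an automated recovery system when sustained high G-forces coincide with an unresponsive pilot. It is not NASA’s system; the sensor fields, thresholds, and frame counts are invented for this example.

    from dataclasses import dataclass

    @dataclass
    class SensorFrame:
        g_force: float       # measured acceleration, in g (hypothetical field)
        pilot_input: bool    # any stick or throttle activity this frame (hypothetical field)

    G_LOC_THRESHOLD = 8.0     # assumed g level associated with loss of consciousness
    UNRESPONSIVE_FRAMES = 30  # assumed number of consecutive frames with no pilot input

    def should_engage_autopilot(frames):
        """Engage automated recovery when sustained high g coincides with no pilot input."""
        recent = frames[-UNRESPONSIVE_FRAMES:]
        if len(recent) < UNRESPONSIVE_FRAMES:
            return False
        high_g = all(f.g_force >= G_LOC_THRESHOLD for f in recent)
        no_input = all(not f.pilot_input for f in recent)
        return high_g and no_input

    # Example: 30 consecutive frames at 9 g with no pilot input trigger the handoff.
    frames = [SensorFrame(g_force=9.0, pilot_input=False) for _ in range(30)]
    print(should_engage_autopilot(frames))  # True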

The Future

The panelists also provided their thoughts on where the future of big data may lead within the government space, including:

  • Prescriptive Analytics – telling users what the best choices are
  • A move towards services models over stove-piped projects – allowing programs to take advantage of building blocks versus starting from scratch each time
  • A larger role in precision and personalized medicine
  • Autonomous decision platforms – allowing machines to sift through mounds of data and alert analysts only to the data that needs the most attention
  • The integration of government data into the fabric of daily life – for example, providing instant and detailed weather information
  • Unlocking the true value of data to generate new hypotheses and ideas and to explore previously unknown relationships

Legislation

The growing presence of big data in government is reflected in legislation as well. At a full House Committee on Oversight and Government Reform hearing, representatives reintroduced the Open, Public, Electronic, and Necessary Government Data Act (OPEN Government Data Act). The bill seeks to have all available government data published in a machine-readable format and made available to the public, a move toward transparency and accountability for federal agencies. If passed, the legislation would result in the emergence of big data projects across federal agencies to extract information from various sources and convert it into a common, machine-readable format.
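
To make the machine-readable requirement concrete, here is a minimal, hypothetical sketch of converting a legacy pipe-delimited export into JSON. The dataset contents and field names are invented for illustration and are not tied to any particular agency system.

    import csv
    import io
    import json

    # Hypothetical legacy export, for illustration only.
    legacy_export = io.StringIO(
        "agency|dataset|records\n"
        "NOAA|daily_weather|152000\n"
        "CMS|claims_summary|890000\n"
    )

    # Parse the delimited text and republish it as machine-readable JSON.
    rows = list(csv.DictReader(legacy_export, delimiter="|"))
    print(json.dumps(rows, indent=2))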

Challenges

During the FEF, government and industry panelists cautioned that various challenges remain in the big data space, among them governance, culture, and the security of data within the public sector. Because big data initiatives require releasing some “control” of the data, governance is needed to keep information authoritative and to determine which data tools should be used to serve the customer. Likewise, government culture remains stagnant, and the mindset persists that data exists only to be protected and preserved rather than used. On the security front, many big data tools were not designed with security in mind, particularly government security constraints, which complicates integrating them into federal environments.

Implications

The growing presence of big data will call for increased involvement from industry. In fact, at the FEF, NOAA Chief Data Officer Ed Kearns stated that demand for the agency’s data is so high that NOAA cannot build access systems fast enough to keep up. Moreover, appropriated dollars cannot keep up with the pace either. Thus, NOAA has developed partnerships with industry through Cooperative Research and Development Agreements (CRADAs) to assist the agency with tasks such as data analytics to meet the demand.

According to an article in Federal Times, Secretary of Commerce Wilbur Ross announced in an April 4th meeting that the Joint Venture Program under NTIS, which also includes various company partners, has been paramount to capitalizing on data throughout government with industry’s help. “And I believe that to the extent that the private sector can be better engaged up front in helping federal agencies formulate the problem statements, I think we will do a much better job of delivering services to our citizens,” said Ross.