Agencies Struggle to Handle Big Data Challenges

Published: March 06, 2013

Tags: Big Data, Cloud Computing, DHS, Innovation, NASA

Major themes discussed at this year's Government Big Data Symposium, sponsored by Technology Training Corporation (TTC), included the growing challenge of data accumulation and analysis, the changing face of available commercial solutions, and the relationship between big data and cloud computing. Business opportunities exist in all three areas as agencies grapple with the challenge of big data.

I recently had the opportunity to attend an excellent conference hosted by the Technology Training Corporation. This conference, called the “Government Big Data Symposium,” is held every year at the Holiday Inn in Arlington, Virginia. Organizer Marcus Min and his team do a fantastic job assembling a roster of government officials and industry experts to discuss big data challenges, solutions, and applications. This year’s symposium was solid as always, yielding a number of insights that put attendees’ fingers on the pulse of big data projects and initiatives at federal agencies. Here are a few of the major themes from the conference that I found most interesting.
The Data Tsunami Continues to Grow
Anyone involved in analyzing big data or selling big data solutions feels this problem on a daily basis. Several of this year’s speakers emphasized that federal agencies with scientific missions are already at or past the point of petascale computing. The challenge of handling this data has become acute even at relatively small agencies like the National Oceanic and Atmospheric Administration (NOAA). Dr. Mark Luker of the Networking and Information Technology Research and Development (NITRD) Program pointed out that NOAA’s data demands are compelling it to add 30 petabytes of storage per year to archive its data. This massive inflow of data is only expected to increase.
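To put that figure in perspective, here is a quick back-of-envelope calculation. This is my own illustration: only the 30-petabyte annual figure comes from the talk, and the 4 TB drive size is an assumption about typical high-capacity disks of the day.

    # What 30 petabytes of new archive storage per year means day to day.
    pb_per_year = 30
    tb_per_day = pb_per_year * 1_000 / 365       # decimal units: 1 PB = 1,000 TB
    drives_per_day = tb_per_day / 4              # assuming 4 TB per drive
    print(f"~{tb_per_day:.0f} TB of new data per day")        # ~82 TB/day
    print(f"~{drives_per_day:.0f} four-terabyte drives/day")  # ~21 drives/day

In other words, that is roughly twenty high-capacity drives’ worth of new archive capacity arriving every single day, at just one relatively small agency.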
Take the example of NOAA, apply it to larger agencies like the Department of Energy and the National Aeronautics and Space Administration (NASA), and you will quickly see that the challenge of big data is not going away. This challenge presents a real business opportunity for vendors. Agencies are so reliant on data to accomplish their missions that storage vendors are in the enviable position of providing capacity that is not merely desired but mission-critical. Similarly, those providing analytics are seeing an uptick in interest as agency personnel grapple with the problem of too much data. Finally, lest services vendors feel left out, agencies need consulting and data analysis services like never before as they try to work out how to incorporate the next generation of analytical tools into their IT environments.
The Changing Complexion of Solution Sets
Since discussion of big data arose a few years back, it has become common to hear about the need for data scientists. Ideally, these specialists would belong to an integrated team of professionals who parse and analyze data to enable valuable business decisions. This approach remains a best practice, but it presents federal agencies with two significant challenges: a shortage of trained personnel and increased costs. Not only is the data scientist a rare breed in demand in both the public and private sectors, he or she also commands a good salary. In the current environment of fiscal austerity, finding and employing data scientists raises the bar for agencies seeking to invest in big data solutions.
Advancing technology is addressing this challenge, however, by providing alternatives that do not require specialized personnel to operate. Tableau is one example. As Sean Brophy of Tableau explained to me at the symposium, his company’s solution provides visualization capabilities for non-IT specialists, making it easy to use and reducing the need for agencies to spend on specialized personnel. I do not endorse one commercial solution over another, but it struck me that gearing solutions to non-specialists is the smart way to go for analytics vendors seeking to increase their market share in a fiscally constrained environment.
Cloud Computing and Big Data Come Together
Another common theme at the symposium was the growing nexus of cloud computing and big data solutions. Representatives from multiple agencies expressed interest in employing big data solutions in the cloud. NASA Chief Technology Officer Dr. Sasi Pillay emphasized that the agency is poised to significantly increase its investment in commercial cloud computing solutions. Michael Simcock, Chief Data Architect at the Department of Homeland Security (DHS), also said that his department is interested in making greater use of the cloud for big data solutions, with one caveat: any such solution will be hosted in a private cloud. DHS will not use a public cloud for big data.
My impression from speaker comments is that the importance of the cloud for growth in federal big data investments cannot be overstated. Cloud computing offers a relatively simple way to acquire the required solutions, and it can scale computing power up on demand. For example, Dr. Nancy Grady of SAIC described a proprietary solution that automatically senses the processing load in the data queue and spins up (or down) the required number of machines to get the job done. Given federal agencies’ interest in acquiring greater computing power on demand, this looks like an area of continued agency investment for years to come.
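For readers curious what that kind of elasticity looks like under the hood, here is a minimal sketch of the general queue-depth-based autoscaling pattern. It is purely illustrative and is not SAIC’s proprietary implementation; the per-worker throughput and the machine-count limits are assumptions.

    # Minimal sketch of queue-depth-based autoscaling (illustrative only;
    # not the proprietary SAIC solution described above).
    JOBS_PER_WORKER = 10   # assumed jobs one machine can clear per interval
    MIN_WORKERS = 1        # assumed floor: always keep one machine warm
    MAX_WORKERS = 100      # assumed ceiling on cluster size

    def target_workers(queue_depth: int) -> int:
        """Return how many machines to run for the current queue depth."""
        needed = -(-queue_depth // JOBS_PER_WORKER)  # ceiling division
        return max(MIN_WORKERS, min(MAX_WORKERS, needed))

    # A scheduler would poll the queue and resize the pool accordingly:
    for depth in (0, 45, 500, 2000, 120):
        print(f"queue depth {depth:5d} -> run {target_workers(depth):3d} machines")

The key design point is that the scaling decision is driven entirely by observed demand, so capacity and cost track the actual workload rather than a fixed provisioning guess.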