Signs of the Data Apocalypse: Finite Network Bandwidth Meets Infinite Data

Published: September 11, 2013

Big Data, Cloud Computing, Cybersecurity, Defense

As the volume of data agencies must handle rises, the funding available to handle it is declining, leaving agencies in the difficult position of implementing new technologies on a shoestring budget.  They are adapting, but as they adopt new approaches, including mobile computing, cloud computing, and big data, agencies are finding that these new environments demand even greater investment in transport infrastructure to accommodate the increased flow of data.

Eleven months ago, Deltek published a report on the Federal Big Data Outlook, FY 2012–2017.  In that report we discussed something we termed “little data,” which we defined as the average, or typical, volume of data flowing through agency networks on any given day.  This is in contrast to the massive volumes of unstructured data associated with what is commonly thought of as “big data.”  Little data, we argued, was about to be transformed into a big data challenge by the increased use of network automation, the breaking down of stovepiped systems, and the opening of networks to a larger number of end-point devices, mobile devices in particular.  In effect, we envisioned that the flattening of agency networks to enable the flow of data in all directions would quickly overwhelm the capacity of those networks, creating latency problems and bottlenecks that would diminish performance and reduce overall effectiveness.
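To see why flattening leads to latency, a back-of-envelope sketch helps.  The one below uses the textbook M/M/1 queueing approximation, and every figure in it (link capacity, device count, per-device load) is a hypothetical illustration of mine, not a number from any report:

```python
# Back-of-envelope sketch: why a "flattened" network hits a latency wall.
# All figures are hypothetical, chosen only to illustrate the trend.

LINK_CAPACITY_MBPS = 1_000   # a 1 Gbps shared backbone link (assumed)
PER_DEVICE_MBPS = 0.5        # average offered load per end-point device (assumed)
AVG_PACKET_BITS = 12_000     # roughly a 1500-byte packet

service_rate = LINK_CAPACITY_MBPS * 1e6 / AVG_PACKET_BITS  # packets/sec the link drains

for devices in (500, 1_000, 1_500, 1_900, 1_990):
    offered_mbps = devices * PER_DEVICE_MBPS
    arrival_rate = offered_mbps * 1e6 / AVG_PACKET_BITS    # packets/sec arriving
    utilization = offered_mbps / LINK_CAPACITY_MBPS
    # M/M/1 approximation: mean time in system = 1 / (service rate - arrival rate)
    delay_ms = 1_000 / (service_rate - arrival_rate)
    print(f"{devices:>5} devices: {utilization:5.0%} utilized, ~{delay_ms:.2f} ms mean delay")
```

The point is that delay grows nonlinearly: the same link that feels instantaneous at 25 percent utilization becomes a bottleneck at 99 percent, which is exactly what happens when formerly stovepiped traffic is all routed onto shared transport.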
 
If any of this sounds familiar, then you probably read a new report published on September 9th showing that agency network managers are struggling with precisely the “little data” challenge we predicted.  That report focuses on the “Big Five” technology initiatives currently underway across the federal government: cyber security, data center consolidation, cloud computing, mobile communications/computing, and big data.  Of these five, big data by itself poses a major challenge to agency networks.  Its bandwidth requirements make the solution something much larger and significantly more complicated than installing a data crunching appliance or buying analytics software.  Adopting big data tools requires foresight and careful planning: agency IT managers need to estimate how much data is, and will be, traveling across their networks, how much storage capacity they will need, and what types of analytical tools their analysts will want to use.
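The arithmetic behind those estimates is not complicated, but it compounds quickly.  Here is a minimal capacity-planning sketch; every input value is a placeholder that an IT manager would replace with measured figures from their own environment:

```python
# Minimal capacity-planning sketch; all input values are placeholder assumptions.

def estimate(daily_ingest_gb: float, annual_growth: float,
             retention_years: int, replication_factor: float):
    """Return (peak network load in Mbps, total storage in TB)."""
    # Network: spread the daily ingest over 24 hours, then assume the
    # peak hour runs at 3x the average rate (a common rule of thumb).
    avg_mbps = daily_ingest_gb * 8_000 / 86_400
    peak_mbps = avg_mbps * 3

    # Storage: accumulate each retained year's ingest, compounding growth,
    # then multiply by a replication/backup factor.
    total_gb, yearly_gb = 0.0, daily_ingest_gb * 365
    for _ in range(retention_years):
        total_gb += yearly_gb
        yearly_gb *= 1 + annual_growth
    return peak_mbps, total_gb * replication_factor / 1_000

peak, storage = estimate(daily_ingest_gb=500, annual_growth=0.4,
                         retention_years=5, replication_factor=2.0)
print(f"Plan for ~{peak:,.0f} Mbps of peak transport and ~{storage:,.0f} TB of storage")
```

At those assumed inputs, a seemingly modest 500 GB per day of new data still implies roughly 140 Mbps of peak transport and about 4 petabytes of storage over five years, which is why the network and storage questions have to be answered before the analytics question.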
 
As an example, take the Department of Defense’s ongoing effort to adapt to the new everything-over-the-network (EOTN) reality.  On the surface, EOTN may seem like the common-sense solution to the problems agencies face in their current IT environments.  Yet, even as the DoD’s budget is cut deeply year after year, the department finds itself in need of a new 100+ Gbps fiber optic network to push the data it requires.  Consider also the underlying lesson of the Large Data Object Storage episode: a single data type among the many on DoD networks, ISR video, is driving the department’s need for massive data storage capacity.  Add all the other types of data flowing over those networks and the magnitude of the problem becomes clear.
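A rough calculation shows how a single data type can dominate.  The feed count and bitrate below are illustrative assumptions of mine, not DoD figures:

```python
# Illustrative arithmetic only; feed count and bitrate are assumptions, not DoD figures.

FEEDS = 200                  # concurrent full-motion video feeds (assumed)
MBPS_PER_FEED = 5            # one compressed HD video stream (assumed)
SECONDS_PER_DAY = 86_400

aggregate_gbps = FEEDS * MBPS_PER_FEED / 1_000
daily_tb = FEEDS * MBPS_PER_FEED * SECONDS_PER_DAY / 8 / 1e6   # megabits -> terabytes

print(f"{aggregate_gbps:.1f} Gbps of sustained transport, ~{daily_tb:.0f} TB of new video per day")
```

At those assumed figures, one video mission alone consumes a full gigabit per second around the clock and generates roughly 11 TB a day, about 4 PB a year, before a single byte of email, logistics, or sensor telemetry is counted.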
 
In a world of unlimited funding these problems go away.  But that is not the world we live in.  Exactly where are agencies supposed to get the money they need to modernize and upgrade their networks?  My hunch is that more agencies will follow the trail being blazed by the DoD and invest in Multi-Protocol Label Switching (MPLS) technology.  This will meet some of their needs, but agencies with geographically dispersed missions will still need to rely on scarce satellite bandwidth, or perhaps containerized data center solutions, to put capability where terrestrial networks do not reach.
 
In sum, dear readers, federal agencies are in a quandary.  They are under pressure from policy mandates and technology trends to modernize their networks, move to the cloud, utilize big data tools, consolidate data centers, and open access to a greater variety of end-point devices.  But acting on these initiatives is creating another expensive problem, one they may have trouble funding.  That funding gap is the central issue moving forward, which makes helping agencies innovate to save a key value proposition for industry.