The Three C’s of Big Data in Federal IT

Published: October 09, 2013

Big Data | Innovation

The outlook for federal spending on big data goods and services is positive, but tempered by the overall deteriorating budgetary situation. In this post I outline what I see as the current state of the federal big data market and offer some recommendations for maximizing business opportunities.

Big data is all the rage these days. Gripped in the upward slope of the hype cycle, articles about big data in the trade press and a never-ending parade of big data events describe the benefits of advanced analytics to potential federal customers. The hype is all well and good, serving an important purpose by generating excitement and interest, but what does it really tell us about the state of the big data “market”? Not much, I’d argue. For example, at a conference on big data in the military and intelligence community that I recently attended, the first few speakers spent time defining big data! After years of hype, and after Gartner’s introduction of the three “V’s” of big data (volume, variety, and velocity), are we still stuck on defining what big data is?
The confusion surrounding the definition of big data is a good place to dive into the purpose of this post, which is to describe where we are when it comes to big data and the federal government. Here I will shamelessly take a page from Gartner’s playbook and break my characterization of the federal big data “market” into three “C’s”: Confusion, Changeability, and Competitiveness. By describing the market in these terms, I hope to help readers navigate what have become strange and unpredictable times in federal IT.
Let’s start with the confusion. Forget for a moment the lack of clarity about how big data is defined. The definition changes depending on who you ask, and in the end the most important definition you’ll need to know is the one used by your customers. A more important concern, in my mind, is the lack of a clear path forward. Agencies are bogged down in multiple ongoing efforts to modernize and reform their IT environments, and these efforts are consuming the lion’s share of budget and personnel resources. As a result, although many agency IT leaders see the benefits of using big data solutions, most are unable to focus on developing a strategic approach to using them.
Further complicating the situation is the fact that using “big data” solutions is actually secondary to developing an enterprise approach for managing data. Without an enterprise data management strategy, large-scale agency big data use is probably dead in the water. Don’t get me wrong, I am not arguing that big data tools cannot be applied to specific data sets or problems. This can be done, and is being done, at all agencies as far as I can tell. What I am saying is that these efforts are taking place in isolation, because most agencies have not yet developed data management strategies that offer a clear vision of how use of the technology should progress. In short, confusion reigns, both in the definition of big data and in understanding how to move forward with using solutions.
This brings me to the second characteristic: the changeable nature of agency IT environments and of the evolving big data solutions industry is throwing at them. With no established path forward, agencies are approaching big data haphazardly and in the context of IT environments that are changing rapidly. Data centers are being closed and virtualized. Agencies are still pivoting toward cloud computing. Vast legacy systems remain in place. How all of this fits together is still a question mark in many agency IT shops. Big data adds a complicating factor that most are ill-equipped to deal with in as dynamic an environment as this. Many agencies will continue to be interested in big data solutions, but expect their efforts to be halting until IT environments achieve some stability and coherence.
Lastly, there is the competitive environment to consider. The declining overall number of addressable contract dollars is forcing more vendors into smaller competitive areas. This scenario further reduces the revenue stream for everyone involved until a clearly dominant set of vendors appears on the scene. Think big fish eating smaller fish until balance is achieved. The silver lining here is that big data encompasses hardware, software, and services. Because it reflects the IT market as a whole, “big data” is thus able to accommodate a large number of vendors.
So what do these three C’s mean for vendors? First, confusion and dynamic IT environments are making it difficult for vendors to establish a single strategy for selling big data solutions to federal customers. The requirements of the IC and DoD will differ from those at the EPA and Interior, if only because their IT environments and levels of understanding about big data are in flux. It is therefore important for vendors to help federal customers identify needs while simultaneously explaining the benefits of big data solutions. Most importantly, emphasize ROI!
Second, in this increasingly competitive environment I recommend that primes create a stable of trusted partners, especially small business partners, to provide pieces of big data solutions as customers require them. Flexibility and the capacity to meet a variety of customer needs and demands are the keys to increasing competitive opportunities.