GSA Progresses with Implementation of Category Management Method of Acquisition

GSA is piloting a Common Acquisition Portal (CAP) as part of its Category Management Initiative launched in April. 

Category Management is the concept of grouping products and services into categories or “hallways” for access by contracting officers and program managers.  Category management is how the most successful Fortune 500 companies approach acquisition.   It focuses on five key areas:  

  • Optimizing contract vehicles and managing the landscape  
  • Managing data collection and analysis  
  • Leveraging supplier relationships  
  • Maximizing customer relationships  
  • Growing and sharing expertise

For GSA’s Federal Acquisition Service, this will mean identifying core categories of business in which it will develop a higher level of expertise and which it will manage like strategic business units.  That expertise will be leveraged to direct buyers to the best solutions for them while streamlining the procurement process. 

GSA officials gave a briefing on the CAP project at the recent ACT-IAC Executive Leadership Conference.  “Right now there are tens of thousands of contracts across our government,” said Tom Sharpe, commissioner of GSA’s Federal Acquisition Service. “One company alone can have hundreds of contracts with the federal government; there is a way to radically change federal procurement, and it’s as simple as working and acting as one.”

Online access to CAP will occur through an “acquisitions gateway,” which will guide users through their category and procurement options.  The gateway is currently under development.   A beta version of the gateway was launched in early October with three hallways:  IT hardware, IT software and office supplies.   

Over time, the hallways will be developed with information and services to continuously improve acquisition outcomes.  Ultimately, the solution will provide the following capabilities:  

  • Procurement Optimizer:  A comprehensive contract-comparison search engine that enhances competition for government acquisition
  • Market Intelligence Center:  Category-centric market research materials that guide purchase decisions based on category managers’ government-wide expertise
  • Clear View:  Real-time data on pricing and purchasing, as well as assessment tools that help provide a big-picture view of government and individual agency spending behavior
  • Collaborative Contracting Library:  A central repository of exemplary contract work for complex buys, compiled by community experts, to jump-start procurements
  • eMarketplace:  An eCommerce transaction platform for simple purchases

GSA’s goal is to streamline and simplify the federal acquisition process, and to help agencies be more efficient and make smarter buying choices.


NASA Seeks Industry Guidance on Data Center Solutions

In mid-July 2014, the National Aeronautics and Space Administration's Goddard Space Flight Center released a request for information on near-term, interim, and long-term data center consolidation strategies to address federal mandates for reducing the IT footprint and improving energy efficiency.

Although federal agencies have continued monitoring progress, they have not publicly released any recent updates to their data center consolidation plans. As part of the Federal Data Center Consolidation Initiative (FDCCI), NASA initially reported 79 data centers, closed 14, and revised the count to 58 after a physical inventory. The target is to retain 22, and at the end of 2014, NASA is expected to have 16 data centers left to close before reaching its goal.

Along with the request for information around the Data Efficiency and Containerization effort, NASA GSFC released details regarding the several approaches under consideration. The first approach targets retrofitting technologies and solutions with a high return on investment (ROI) as short-term and interim measures. A number of technologies to address cooling, power, and software management are under consideration. Approaches for cooling include rear-door heat exchangers, direct liquid cooling, and application of water-side economizers. For power, NASA is looking at transformer-free uninterruptible power supplies (UPS), power distribution units (PDU) that convert power from AC to DC, and fuel cells. Management software technologies being explored include remote power monitoring, power management based on the impact on energy consumption, and server utilization management. The potential for heat-reuse applications is also on the table for deliberation.

The second approach to data center consolidation targets standalone, containerized solutions for two possible use cases. One involves a management information systems computing scenario with power requirements up to 10 kW per rack; the other involves high-power computing with power requirements up to 30 kW per rack. The third approach aims to establish a long-term strategic data center plan leveraging containerized data center solutions with a high ROI. All of the approaches must meet compliance requirements for Federal Data Center Consolidation and Green mandates.

Beyond the efforts at Goddard, NASA's consolidation of Agency and Center-specific data centers will continue through efforts to simplify IT architecture by reducing duplication within the IT footprint. Some of the savings expected to result from these efficiencies will be reinvested to support mission programs and projects. NASA intends to reinvest a portion of these savings to fund critical IT innovations in order to drive further efficiencies and cost savings. Candidate investments to drive efficiencies include standardization of mobile and collaboration capabilities, continued consolidation of IT security tools and computing services, and shifting web services to a cloud platform. Analytics will be one significant area that will benefit from data center improvements. NASA's vision for big data includes improving the capability to extract value and insight from the data it already has. To this end, NASA may explore the potential for creating a new, virtual mission to examine the data it possesses. With the vast volume and variety of data on its systems, NASA will need to overcome storage and accessibility challenges to ensure timely availability of information.


Originally published in the GovWin FIA Analysts Perspectives Blog. Follow me on Twitter @FIAGovWin.

GAO Recommends Further Implementation of Best Practices to Improve Federal IT

Earlier this month, David A. Powner, Director of Information Technology Management Issues at GAO, testified before the Senate Committee on Homeland Security and Governmental Affairs regarding critical factors necessary for successful IT projects and acquisitions.

Powner used information from previous GAO reports and studies to offer insight into best practices and reform initiatives that can help improve IT investment management.  Expanded use of critical success factors in IT acquisition, such as active stakeholder engagement and support from agency executives, along with further implementation of government and industry best practices, will better position agencies to more effectively deliver mission-critical systems, according to GAO.

One key reform initiative has been the IT Dashboard, launched by OMB.  The dashboard provides information on 760 major IT investments at 27 federal agencies.  As of April 2014, 74% of the investments were low or moderately low risk, 21% were medium risk, and 6% were moderately high or high risk.  GAO issued a report in 2011 that voiced concerns about the accuracy and reliability of dashboard data, but also pointed out that the data was improving over time.  Recently, GAO reported that agencies had removed major investments from the dashboard, which raises concerns about transparency.  Additionally, GAO noted that the timeliness of updates to the dashboard was lacking.  As of December 2013, the public version of the dashboard had not been updated for 15 of the previous 24 months.

Powner also cited OMB recommendations for increased incremental development, but GAO’s recent findings indicate that almost 75% of the investments reviewed did not plan to deliver capabilities every six months, and less than half planned to deliver capabilities in 12-month cycles.

Additionally, Powner referred to data center consolidation efforts and the continued need for oversight and tracking.  He also praised PortfolioStat efforts, but recommended more consistent implementation across agencies.

With over $80 billion in federal IT spending per year, it’s incumbent upon agencies and the administration to learn from successful IT implementations, as well as from failed projects.  While the use of best practices, legislation, and OMB efforts at transparency and oversight have improved IT execution and spending, continued leadership and attention are necessary to build on current progress.



New DOD Acquisition Instructions Meant to Simplify Process

On December 2, Deputy Defense Secretary Ashton Carter distributed a memo to the acquisition workforce implementing the new DOD Instruction 5000.02, directed at streamlining the acquisition process and tailoring it to the product or service being acquired.

DOD Instruction 5000.02, entitled “Operation of the Defense Acquisition System,” was signed November 26 to serve as interim guidance while the DOD Office of Acquisition, Technology and Logistics (AT&L) develops a legislative proposal to simplify the current body of law into a more user-friendly set of requirements.  The new 5000.02 is meant to “achieve greater efficiency and productivity in defense spending and effectively implement the department’s Better Buying Power initiatives.”

Carter stated in his memo that he tried to make the interim instructions more helpful both to experienced defense acquisition professionals and to those new to the defense contracting process.  However, at first glance, streamlining the process seems to take more explanation: the new instructions total 150 pages, while the 2008 instructions totaled only 80.

The instructions are organized into a main document with thirteen enclosures, each describing policies and procedures for a specific aspect of acquisition or a specialized type of product.  The instructions also incorporate a number of statutes and regulations that have been adopted since the original publication in 2008, which may explain the added document length.

Another key addition to the interim instruction is a pair of new decision points: the requirements decision point, and a decision point for a development request for proposal release.  The new requirements decision point is the starting point for the requirements analysis and allocation systems engineering process, according to Carter’s memo.  It also informs the RFP for the development phase.  Frank Kendall, undersecretary of defense for AT&L, stated that he regards the development request for proposal release decision point “as the most important single decision point in the entire life cycle because the release of the engineering and manufacturing development RFP sets in motion everything that will follow in the product’s life cycle.”

A team led by Andrew Hunter, the director of the Joint Rapid Acquisition Office, will be developing the new formal body of legislation and will work closely with Congress over the next few months to do so.


Recapping the National Association of State Technology Directors (NASTD) Conference

As the 2013 National Association of State Technology Directors (NASTD) Conference wrapped up, both vendors and state IT officials may have left Charleston, S.C., with one message pounding in their heads: Watch out for storm clouds on the horizon.
Concerns over cybersecurity, employee retention and the pending roll out of FirstNet – the national public safety broadband initiative – dominated this year’s conversation as NASTD officials packed sessions with multiple speakers on each topic. Each subject has been more or less driven by a combination of current events and long-term trends.
The long-awaited wave of retiring baby boomers is finally underway and is wreaking havoc on the ability of federal and state agencies to replace experienced personnel and retain institutional memory. After four years of planning and design, federal officials are getting ready to tally the number of states that will opt in to the federal FirstNet broadband plan and those that will build their own networks. States received a wakeup call in October 2012 when nearly 4 million Social Security numbers, along with credit card data, were hacked from South Carolina’s state government. The cyberattack brought to life the warnings that cybersecurity officials in the public and private sectors have been quietly raising for years.
Most of the speakers opted to take an awareness approach and attempted to lay out the dire problems and statistics as plainly as possible; not because they were dodging the issues, but often because there are no obvious solutions to these problems. Besides, that wasn’t necessarily their job. Ultimately, these challenges are going to have to be addressed by the people who were sitting in the audience.
The dominant themes among these kinds of conferences for the past few years have been the recession, budget cuts, and figuring out how to maintain service levels with fewer resources. The conversation has begun to shift, but the major themes of NASTD 2013 demonstrated that the end of one crisis often provides state IT officials with just enough breathing room to prepare for the next.
Cybersecurity in the age of cloud adoption and the mobile workforce will be one of the preeminent issues state and local governments deal with over the next 3-5 years. The volume and sophistication of attacks directed at state governments is rising at an alarming pace every year, which means that more state CIOs are going to be expected to pursue aggressive security strategies over the next few budget cycles. More attacks similar to the South Carolina hack will ensure that funding and budgets for these areas are robust. Dedicated network penetration testing and staff training to help identify common phishing techniques were two of the security measures officials stressed most at the conference.
In the public safety realm, vendors should be on the lookout for another handful of RFIs dealing with FirstNet development and implementation. Whether a state opts in or out of the federal plan, the NTIA foresees a considerable amount of private sector involvement for this project over the next few years, which is good news for vendors nationwide.
For the full version of the National Association of State Technology Director's Conference Recap, click here (subscription required)

Oversight Subcommittee and GAO Urge OMB and Agencies to Better Manage IT Investments

A late July House Oversight and Government Reform Subcommittee hearing on Federal Data Centers and Cloud garnered much press recently, especially in light of GAO’s proclamation that there are actually over 7,000 data centers, more than double original inventory counts.

Although the title of the hearing emphasized federal data center consolidation and cloud usage, testimonies and questions from subcommittee members aimed at addressing overall management and oversight of federal IT initiatives.  The subcommittee heard testimony from David Powner, GAO’s Director of Information Technology Management Issues; Steve VanRoekel, Federal CIO and OMB’s Acting Deputy Director for Management; and David McClure, GSA’s Associate Administrator in the Office of Citizen Services and Innovative Technologies.

Powner’s testimony highlighted a number of areas for improvement in IT investment management, but also gave credit to several OMB initiatives that have produced positive results over the last three years.

Positive Results

  • Creation of the IT Dashboard
  • Implementation of TechStat Reviews
  • Launch of the Federal Data Center Consolidation Initiative (FDCCI)
  • Implementation of PortfolioStat Reviews    

Needs Improvement

  • Accuracy and reliability of cost and schedule data in the IT Dashboard
  • Validation of cost savings from TechStat to date
  • Implementation of TechStat for all investments with moderately high or high-risk ratings (currently only 33% are evaluated)
  • Tracking and reporting on key performance measures for the FDCCI, along with improved oversight mechanisms
  • Determination of whether agencies are completing key actions related to PortfolioStat and the incorporation of FDCCI into the process

Powner stated, “Information technology should enable government to better serve the American people. However, according to OMB, despite spending more than $600 billion on IT over the past decade, the federal government has achieved little of the productivity improvements that private industry has realized from IT.”  GAO’s investigation of the IT Dashboard showed 154 investments at risk, totaling $10.4 billion.   

Oversight Subcommittee members agreed and asked pointed questions of those testifying about why the current tools and oversight programs are not producing better results.  Subcommittee Chair Rep. Mica asked Powner where things currently stand on server utilization; Powner stated that server utilization is not being monitored.  Rep. Mica also stated that GSA was setting a poor example for data center consolidation, having closed only one data center to date, with over 100 non-core data centers in existence.  Rep. Meadows asked McClure how GSA planned to close 37 data centers in the next two months when it had managed to close only one in the three-year existence of the FDCCI program.

Rep. Mica closed the hearing on a more positive note by telling the panelists that the subcommittee was going to figure out a way to give them all the tools they need to help get the job done.  He also encouraged GAO to keep up the good work of monitoring federal agencies and programs, and bringing issues to light that need attention. 





Proliferation of data collection drives business intelligence contracting market

Big data has been in the news a lot lately, and I’m not just referring to the massive surveillance apparatus run by the NSA, which sweeps up nationwide phone metadata and may (or may not) do the same for Internet data. The private sector has been getting in on the act as well.
Streaming giant Netflix has been at the forefront of this trend, collecting and analyzing a staggering amount of data from its base of 29 million customers, then using that information to determine, with near-scientific accuracy, what new shows to green-light and even whom to cast.
While some have questioned the value of such a data-driven approach, one thing is becoming abundantly clear: there is no going back. Big data and the tools to make sense of it are frequently used to sort “winning” ideas from “losing” ones in business and government.
A multitude of factors – the digitization of records, smartphone and social media revolutions, and widespread Internet availability – have coalesced into a tipping point, and we have suddenly found ourselves in a world rich with digital data and incentives to figure out what it all means.
There has never been more demand for business intelligence and data analytics tools, and the applications for state and local governments are endless. The move by governments toward a more data-driven approach to policymaking has been happening for some time now, and reminds me of the “CitiStat” revolution that swept the state and local landscape in the late '90s and early 2000s. Obviously, the technology around data analytics has advanced since then; however, I believe the more important driver of business intelligence tools has been the expansion of information databases over the past decade.
Whether you are registering for a driver’s license, applying for food stamps, or donating money to a political candidate, your personal details are now more likely to wind up stored in a database somewhere. This trend holds true for both the private and nonprofit sectors as well, where finding out what makes your customers tick can be crucial to remaining a sustainable enterprise.
Of course, not everyone thinks this is a positive development. The potential for abuse and the erosion of civil liberties is both real and frightening, but big data and business intelligence are not inherently good or evil. They can be used for Big Brother-like surveillance or as a means for making government more open and transparent. It all depends on use and the legal barriers we place around that use. Figuring out the appropriate balance and legal oversight will be important, because as Deltek Senior Manager Chris Dixon detailed in a report released in February, governments have “significantly increased” the number of solicitations with business intelligence components since 2010.
Opportunities with BI components by status, 2001-2014

Source: Chris Dixon’s “Business Intelligence Market 2013,” Industry Analysis Report, GovWin IQ
These opportunities span every vertical market and can be delivered through either software or professional services contracts. Public safety agencies like the Chicago Police Department are considering purchasing predictive policing and crime analytics software in order to leverage their existing data warehouses for traffic, crime and accidents, police contact reports, warrant and arrest records, and other sets of data.
The Massachusetts Executive Office of Administration and Finance wants to use analytics software to root out identity theft and personal income tax fraud. The Indiana Family and Social Services Administration awarded a contract to Optum Health in 2012 for decision support and business intelligence to analyze its existing Temporary Assistance to Needy Families (TANF) data warehouse.
In 2011, the Minnesota Department of Administration awarded a consulting contract to 22 vendors in the hopes of improving operational efficiencies and budget savings across departments through the use of data analytics. There is almost no limit on the use of these tools in state and local government, and virtually every agency has a need to (as statistical guru Nate Silver phrases it) locate the signal amid the noise.
Analyst’s Take
We are right in the sweet spot for business intelligence and data analytics requirements. Many governments have already done the legwork of setting up the necessary database infrastructure for the information they wish to store and measure. As GovWin IQ’s numbers show, they are just now beginning to realize the endless potential of having sophisticated tools to guide better policymaking and resource allocation.
A limiting factor for this technology does exist: Governments tend to purchase it as a component or add-on to larger, more comprehensive software and professional service contracts. Finding a way to market your product to both governments and prospective software vendors for subcontracting opportunities will be critical for companies in the business intelligence market. As the amount of data and the proliferation of large-scale data warehouses continue to develop, the prospect of add-on BI components may no longer be enough to achieve the desired results of many governments, and the number of standalone business intelligence/data analytics bids may increase substantially over the next five years.
For more on the current state of the business intelligence/data analytics contracting market, see Chris Dixon’s Industry Analysis Report “Business Intelligence Market 2013” (Subscription Required)


North Carolina’s bleak performance audit of 84 IT projects

Vendors who have ever worked on a government IT contract know that there are often discrepancies between initial estimates and the actual hours worked and dollars spent on a project. In a way, overages in time and budget have become the unspoken status quo for many projects. The problem is that state and local governments looking to cut waste and find efficiencies are realizing this new norm is counterproductive to their bottom line.
The North Carolina State Auditor’s Office released a performance audit on April 12 that called out 84 IT projects that cost the state a total of $356.3 million in overages, and took a total of 389 days longer than initial project estimates suggested. Essentially, these projects have cost twice as much and taken 65 percent longer than expected.
The audit highlighted some of the more glaring examples, such as the state’s Medicaid management information system (MMIS), which was initially estimated to cost $92.7 million with the project completed in November 2011. However, the MMIS project ballooned to $229 million and now has a completion date of October 31, 2013. Another example is the N.C. toll collection management system (TCMS) project, which was expected to cost $19.8 million, but now has a revised budget of $41 million.
The most egregious project on the audit is the state’s tax information management system (TIMS), which had early estimates of costing only $525,000, with a completion date of December 31, 2011. Nonetheless, the project exploded to $97.3 million and now has a due date of January 31, 2014.
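Cost growth like this is easy to quantify. Here is a quick, illustrative sketch using the three project figures cited above (amounts in millions of dollars; the helper function is our own, not the auditor's methodology):

```python
# Original vs. revised budgets, in millions of dollars, as cited
# in the North Carolina performance audit.
projects = {
    "MMIS": (92.7, 229.0),
    "TCMS": (19.8, 41.0),
    "TIMS": (0.525, 97.3),
}

def overrun_pct(original: float, revised: float) -> float:
    """Percent cost growth over the original estimate."""
    return (revised - original) / original * 100

for name, (original, revised) in projects.items():
    print(f"{name}: {overrun_pct(original, revised):,.0f}% over the original estimate")
```

Even the "modest" TCMS overrun more than doubled the original budget, which lines up with the audit's overall finding that the 84 projects cost roughly twice their estimates.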
The audit highlighted the seriousness with which the state of North Carolina is viewing this problem and pinpointed two main issues that have heavily contributed to the overages among state IT projects:
1. Actual costs and schedules differ significantly from original estimates, which can result in unplanned spending and resource use.
   a. No standard practice for creating IT project estimates
   b. No independent validation of agency estimates
   c. No accountability for unreliable estimates
2. Procedures do not ensure complete, accurate, and timely data.
   a. No method to identify IT projects that circumvent the SCIO approval process
   b. No assurance that historical IT project data is preserved
   c. No oversight/review of self-reported IT project data from state agencies
   d. No consequences/incentives to compel state agencies to submit IT project status reports in a timely manner
The auditor’s office made six recommendations to mitigate these issues:
  • North Carolina Information Technology Services (ITS) should develop and publish written guidance for developing state agency IT project cost and schedule estimates. The guidance should also describe the education, experience, and credentials needed by the personnel who develop the estimates.
  • ITS should require state agencies to obtain independent validation of the accuracy and reasonableness of IT project estimates. Alternatively, ITS should require agencies to submit appropriate and adequate documentation so that ITS can evaluate and determine the accuracy and reasonableness of agency estimates.
  • ITS should request that the General Assembly consider enacting state law to hold state agency managers accountable and require them to meet IT project cost and schedule estimates.
  • ITS should develop and document a method to identify state agency IT projects that require the SCIO’s approval. ITS should also ensure that the EPMO Tool retains both historical and current information to allow for trending and analysis.
  • ITS should develop and document procedures to verify state agency data in the EPMO Project Portfolio Management Tool.
  • ITS should consider asking the General Assembly for the authority to ensure that ITS receives project status reports on schedule.
North Carolina’s new chief information officer (CIO), Chris Estes, agreed with all six recommendations produced from the audit and said ITS will address the issues found in the audit in the upcoming Statewide IT Plan, which is expected to be released on October 1, 2013.
Analyst’s Take
The audit reviewed IT projects from December 2011 to October 2012, and selected only projects whose original cost and schedule estimate data was available. North Carolina has a total of 1,034 state IT projects contained in its Project Portfolio Management database, with 128 active IT projects valued at $1.7 billion. Besides the cost of the overages identified, the fact that only 84 out of 1,034 IT projects had enough information to be included in the audit to begin with more than justifies the audit’s findings and recommendations.
If taken seriously, the North Carolina 2014 IT plan will include hard and fast solutions to improve oversight and management of state IT projects moving forward. These solutions will likely come in the form of policy, procedure and personnel restructures that will affect existing contracts and future procurements. Vendors looking to do business with North Carolina should count on an extra layer of scrutiny during the procurement process, especially when it deals with pricing, scheduling, and the management and success rate of past IT projects.
Not a Deltek subscriber? Click here to learn more about Deltek’s GovWin IQ database and take advantage of a free trial.



Sunshine Week: Maryland continues transparent CATS IT contract program

Day four of Deltek’s recognition of Sunshine Week continues promoting transparency in the name of efficient, effective and ethical government. The state of Maryland has long prioritized efficiency, transparency, and IT investment for the greater good of its constituency, and in doing so, has fostered an engaged citizenry and vendor community. The Maryland Consulting and Technical Services (CATS) contracts embody those priorities and are now entering their third generation as a streamlined procurement strategy for supporting information technology projects.  

The Maryland Department of Information Technology established the CATS program to allow state agencies to quickly and efficiently obtain IT consulting and technical services from a pre-qualified pool of vendors with services in 17 functional areas ranging from Web and Internet services to information system security and software engineering. 

The CATS methodology is often referred to as a two-step procurement: the first step qualifies a group of suppliers under one or more sets of requirements, or functional areas. The second step allows using agencies to solicit responses from the qualified vendors for a business need defined in a request for resumes (RFR) or a task order request for proposals (TORFP), depending on the job’s anticipated value. 

Most two-step statewide term contracts close their curtains after the first, or qualification, phase. This is often in contrast to other contracts in those same states whose award totals and contract documents are available. Since the first-generation CATS contract was initiated in 2005, Maryland has not only made the labor rates quoted in the qualification phase available, but also made the status and results of the second, or TORFP, phase transparent via the CATS website. Information provided includes the requesting agency, the TORFP document, the number of proposals submitted, the awarded vendor, the award amount, and, in the case of CATS II, the MBE fulfillment. The Maryland Department of Information Technology also maintains a dashboard detailing CATS contract activity:


With a bit of extra lifting on the CATS data, we can see that 15 vendors have grossed more than $10 million in TORFP awards since 2005.
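Totals like these are straightforward to reproduce once the TORFP award data is exported. A minimal sketch using Python's standard library follows, with made-up vendor names and a hypothetical CSV layout; the actual export fields on the CATS site may differ:

```python
import csv
from collections import defaultdict
from io import StringIO

# Hypothetical excerpt of a CATS TORFP award export.
award_data = """vendor,award_amount
Vendor A,6000000
Vendor B,4500000
Vendor A,7500000
Vendor C,2000000
"""

# Sum award amounts per vendor across all task orders.
totals = defaultdict(float)
for row in csv.DictReader(StringIO(award_data)):
    totals[row["vendor"]] += float(row["award_amount"])

# Vendors grossing more than $10 million in TORFP awards.
top_vendors = {v: amt for v, amt in totals.items() if amt > 10_000_000}
```

The same aggregation, grouped on functional area instead of vendor, would surface which categories of work the state buys most often.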



Of course, these are just a fraction of the 435 vendors qualified under CATS II. Additionally, there are more than 150,000 labor rates submitted under CATS II that can be viewed individually by vendor on the CATS II site, or queried, sorted, and downloaded by GovWin IQ members at the S&L Term Contract resource.

Service offerings under CATS II are classified under 17 functional areas, so by reviewing TORFP awards we can see what types of assistance state agencies are most often seeking.


After two successful generations of contracts with the original CATS and then CATS II, the Maryland Department of Information Technology decided to forge ahead with CATS Plus, and on July 2, 2012, released an RFP tracked under GovWin IQ Opportunity ID 51373. Proposals were accepted on or before August 8, 2012, and awards are expected in April 2013. The department expects the third-generation contract vehicle to engage more than 300 vendors and to exceed $400 million in spending volume over its five-year term, taking the total volume of the CATS contracts to nearly $1 billion.

Analyst’s Take 

Maryland’s CATS contracts provide a streamlined opportunity for IT professional services vendors to do business with the state directly or through partner opportunities. The transparency of the program also allows for insight into competitors’ pricing and successful bids. 

Look for more states to use statewide term contracts similar to CATS in the future and to additionally offer pricing and spending data around those contracts. Rest assured that GovWin IQ will be looking for those opportunities and including pricing in our State and Local Term Contract resource and the spending in our analysis activities.

Deltek will publish a full-length report, “State Government Transparency Report 2013,” providing detailed itemized IT expenditures for the state of Maryland and many other states in the coming weeks.

GovWin IQ subscribers can learn more about these statewide contracts in the provided links. Non-subscribers can gain access with a GovWin IQ free trial.

Navy’s IT Consolidation Efforts Reveal the Truth About IT Spending – It’s Everywhere!

In the summer of 2011, the Navy announced that it was cutting IT expenses on business-related systems by roughly 25 percent. As one might expect, this pursuit set off an investigation into where such cuts could be made. Nearly two years later, we are learning that the Navy has been spending far more on IT systems, hardware, software, and services than it knew about: roughly $2 billion to $3 billion more! 
In a recent news article, the Department of the Navy's deputy CIO, Janice Haith, gives a frank account of how the Navy's internal assessments have uncovered a huge amount of IT infrastructure and applications previously unknown to the CIO’s office, even after years of consolidation efforts under the Navy Marine Corps Intranet (NMCI) program and others. The revelations include:
  • Networks – 302 legacy and excepted networks identified that must be migrated to NGEN or retired by March 2014; otherwise, they will be rolled into NGEN. 
  • Servers – 32,000 servers currently identified, compared to the original estimate of 6,000.
  • Data Centers – 210 existing data centers, compared to the original estimate of 100. Navy’s goal was to get down to 25.
  • Applications – 42,000 applications in the Research, Development, Test and Evaluation (RDT&E) area and 27,000 residing outside programs of record, compared to the original estimate of 15,000 distinct software applications. The Navy is still assessing applications used for warfighting.
  • Software Licenses – At the headquarters level, it is unclear exactly which and how many software licenses the Navy owns. License consolidation began in earnest last year when the Navy signed a large enterprise-wide licensing agreement for Microsoft products. The Navy expects to sign a similar agreement with Oracle in the future and reports that 17 other major providers are in line.
On-Budget vs. Off-Budget IT Spending
It is no secret that a significant portion of what the federal government spends on IT is not accounted for in the official IT budgets submitted by agencies and reported by OMB. For years, CIOs have described the challenge of managing and supporting IT infrastructure that is purchased and set up outside their purview; they do not always hold sway over every office and program within an agency (until something breaks and they are asked to help fix it). The Navy’s ongoing quantification of this fact simply shows the magnitude.
According to Haith, the Navy had estimated its overall FY 2011 IT budget at around $7 billion, but after further investigation it put the figure closer to $9 billion. For fiscal 2013, it is estimating a spend of $11 billion. Both re-evaluated estimates differ significantly from the Navy’s originally reported OMB IT budget submission, the Exhibit 53. (See table below.)
The bottom line is that there is a huge amount of IT purchasing that takes place outside of the CIO’s office and beyond traditional IT programs. My colleagues and I make this point whenever we comment on the federal IT spend and where potential opportunities exist. As technology permeates more operational areas and systems – including weapons, communications, energy, industrial controls, buildings, transportation, etc. – we see spending on IT products and services that fall outside official IT budgets. Therefore, any realistic assessment of an agency’s IT spending must reach beyond its official OMB Exhibit 53 submission.
In our latest annual Federal Information Technology Market forecast for FY 2012-2017, completed in June before the current sequestration and other budget-planning exercises were known, we estimate that the Navy’s FY 2013 spending on all vendor-supplied IT products and services across all areas will be approximately $16 billion.

Haith’s disclosure shows the ongoing challenge of changing organizational culture around reporting, the use of cloud technologies, and the need to invest in the right technologies to achieve long-term savings. There are opportunities for the savvy solutions provider to find, even in an uncertain environment.
