The 2015 NDAA Mandates Open Architecture for Defense IT Systems

Provisions in the annual National Defense Authorization Act (NDAA) affect the Defense sector of the federal information technology market over many years. Consider, for example, the mandate in the FY 2012 NDAA calling for the Department of Defense to utilize cloud services provided by commercial partners. The DoD has been working ever since to find a viable way of implementing this mandate. The far-reaching impact of NDAA provisions thus makes it imperative that federal contractors understand how the legislation will affect their business at the DoD in the future.
 
The FY 2015 NDAA promises to have a significant impact as it features an important provision calling for the DoD to adopt open architecture for all of its IT systems. Specifically, Section 801 calls for the Under Secretary of Defense for Acquisition, Technology, and Logistics to create a plan that “develops standards and defines architectures necessary to enable open systems approaches in the key mission areas.”  The discussion about using modular approaches to acquisitions has been evolving at the DoD for several years, resulting in a shift in the length and complexity of contracted efforts.  Rather than procuring a single end-to-end solution, Defense customers tend increasingly to initiate program procurements in increments.  These increments have shorter time spans and defined objectives that set parameters for the acquisition of the next increment. In Section 801, Congress gives this “modular” approach the weight of law, meaning vendors should expect to see even more short-duration, lower dollar value, limited objective procurements.
 
Equally important is the call for DoD to develop a strategy for using open architecture. The department is currently in the process of creating a unified transport network based on Internet Protocol. This may work well for newer systems, but thousands of legacy systems across the DoD remain locked in proprietary configurations. A clause in Section 801 mandates that the USD AT&L submit a report which “outlines a process for the potential conversion [of legacy systems] to an open systems approach.” Engineering those systems to operate on an open architecture will unlock data, make the systems interoperable, and enable Defense customers to transition more easily from one IT support vendor to another.
 
If this sounds like the next, deeper level of the Joint Information Environment, you are right on target.  IT vendors should take heed and get ahead of the curve because in all probability open architecture is going to be a requirement for every unclassified (classified too?) solution that the DoD procures in the future.  If your solution isn’t open, it won’t be purchased.  End of story.
 
The open architecture requirement will also compel Defense customers to take a hard look at commercial cloud as an alternative.  Why spend money engineering an antiquated legacy system to operate on an open architecture when you can hire a vendor to host the data and implement a comparable, new interoperable system? 
 
In short, the 2015 NDAA should stimulate business opportunity at the DoD as money locked into Operations and Maintenance funding for legacy systems moves into new efforts to re-engineer and/or cloud-enable those systems for use in an open architecture.

 

Emerging Federal Technology Markets – Areas to Watch

Can technological innovation drive federal IT investments, even in the midst of budget pressures? Absolutely. This is what we explore in our latest report on Emerging Federal Technology Markets.

Under long-term pressure to “do more with less,” federal agencies are leveraging current trends in federal IT – cloud, wireless networks, IPv6, and virtualization – to gradually adopt new technologies that enable cost savings and the more efficient use of IT resources. Some of my colleagues and I took a look at how these and other technologies are shaping federal IT investments today and in the future.

Federal Investments in Foundation Technologies will Drive Emerging Markets

Technological change and proliferation run the gamut in their impact on federal agencies. Sensor technologies are being introduced to track facility energy consumption and enhance physical security, while software-defined infrastructure is being explored to eliminate bottlenecks that result from stovepiped systems and the growing volume of data. Machine learning technology is being tested to create “smart” networks that rely less on person-based administration. Tying it all together are predictive analytics, which agencies are using for a growing number of purposes, from forecasting network performance and enhancing cyber security to ferreting out waste, fraud, and abuse. The result is that today’s investments set the stage for tomorrow’s capabilities. (See graphic below.)


Key market factors shaping the federal IT landscape

Some of the major drivers and key findings from our research include:

  • The push to leverage sensor technologies and the data analytics they enable is a driving force behind agency network modernization efforts like the DoD’s Joint Information Environment. The pace of sensor-based innovation is tied to the success of these efforts.
  • Software-Defined Infrastructure (SDI) is more pervasive than generally believed, particularly at agencies with highly evolved Infrastructure-as-a-Service offerings.
  • Federal interest in SDI is not hype; it is a genuine trend with a growing number of current and planned use examples across federal agencies.
  • The use of predictive analytics programs has expanded significantly across the federal government since FY 2010, making it a maturing, though niche, technology that is expected to have continued strong growth.
  • The inclusion of predictive analytics as an offering on GSA’s Alliant 2 and, potentially, NS2020 government-wide contracts should help it become regarded less as an exotic technology and more as a standardized commercial-off-the-shelf solution.

The modernization of agency IT environments is opening the doors to future investment in emerging technologies.  The convergence of agencies’ work on expanding wireless networks, deploying standardized, commodity hardware, and engineering Internet Protocol-based transport networks is enabling the introduction of new sensor technologies and software-based capabilities. The impact of emerging technology adoption will be to introduce greater efficiency and security to agency IT environments. 

To get our full perspective on Emerging Federal Technology Markets, read the full report.

---
Originally published for Federal Industry Analysis: Analysts Perspectives Blog. Stay ahead of the competition by discovering more about GovWin FIA. Follow me on Twitter @GovWinSlye.

Medicaid eligibility and enrollment systems: Which states still need to modernize?

Modernized and fully integrated Medicaid eligibility systems have proven to be a catalyst for successful enrollment in state and federally facilitated health insurance exchanges. Kentucky, New York, and Washington state have stood out for their top-performing exchanges and high enrollment numbers. All three states rely on integrated Medicaid eligibility systems that facilitate the consumer application process, eligibility determination, and enrollment in Medicaid/CHIP or private health insurance plans.

On the other hand, states with outdated and isolated technologies struggled to enroll new customers, which led to significant Medicaid backlogs, most notably in California, New Jersey, and Tennessee. Now that the feds have finalized 90/10 funding and extended the OMB A-87 cost allocation exception, more states will invest in upgrading their Medicaid eligibility systems and building integrated eligibility systems that incorporate human services programs, including Supplemental Nutrition Assistance Programs (SNAP) and Temporary Assistance for Needy Families (TANF). This analyst perspective will help vendors identify which states have already completed upgrades, which states are currently modernizing, what contracts may be rebid, and where to find potential business opportunities.

Current Landscape


While states have been working to integrate and modernize eligibility systems for more than a decade now, the vast majority of states took steps in recent years to upgrade their Medicaid eligibility systems in preparation for ACA enrollment. In fact, 19 states have issued contracts for upgrades to Medicaid eligibility and enrollment systems since 2012. Some states combined contracts for health insurance exchanges with eligibility upgrades (HIX/IES), including Connecticut, Maryland, Oregon, Rhode Island and Washington, D.C. Other states are still in the early planning stages for eligibility system modernization efforts, and a few states have indicated their intent to release a solicitation in the coming year. Below is a preview of a few of these upcoming opportunities.

Upcoming Solicitations

Louisiana – The Louisiana Department of Health and Hospitals anticipates releasing a Medicaid Eligibility Determination System (MEDS) request for proposals (RFP) this month. The department is designing new enterprise architecture to modernize the state’s Medicaid technologies. The previous contract with Deloitte was worth approximately $29 million (Opportunity ID 99187).

Massachusetts – The Massachusetts Executive Office of Health and Human Services plans to move forward with Phase II of the state health insurance exchange and integrated eligibility system (HIX/IES). The state expects to complete planning by the end of June 2015, and an RFP could be released sometime this fall, at the earliest. A $66 million contract with CGI was terminated in March 2014, and Optum and hCentive have worked to rebuild the system (Opportunity ID 89076).

New York – The New York Office of General Services is seeking a systems integrator for its integrated eligibility system to replace the statewide welfare management system (WMS) – a legacy system first implemented in 1977. An RFP was issued in May 2014 for a business advisory services contractor that will work during the first phase of the IES project; the systems integrator will conduct phase two. Deltek anticipates this legacy system modernization could approach $100 million (Opportunity ID 49905).

Possible Rebids

Tennessee – The Tennessee Department of Finance and Administration may have a requirement for the development and/or maintenance of the TennCare Eligibility Determination System (TEDS). The current contract with Northrop Grumman is behind schedule and the system remains unfinished, which has created months-long delays for Tennesseans who want to apply for Medicaid. Subsequently, three advocacy groups have filed a lawsuit against TennCare. The incumbent contract is valued at $35.7 million (Opportunity ID 117922).

New Jersey – The $83.5 million contract with Hewlett-Packard for maintenance of the Consolidated Assistance Support System (CASS) has been terminated, and a spokeswoman for the New Jersey Department of Human Services said the state and the vendor are still in talks regarding the contract termination (Opportunity ID 105816).

Early Planning Stages

California – The 2014-2015 Governor's Budget Highlights for the Department of Health Care Services requested expenditure authority for a multi-year IT project to modernize the Medi-Cal Eligibility Data System (MEDS). In 2012, a contract was awarded to PCG for IT project planning consulting services, including a feasibility study and advanced planning document (APD) for the MEDS Modernization Project. An RFP for Medi-Cal program integrity data analytics is currently in development (Opportunity ID 69871).

South Dakota – The state issued an invitation to discuss and demonstrate (IDD) to review and research existing medical assistance eligibility systems that comply with the Affordable Care Act (ACA) and preferably have existing or planned capability to support other programs such as SNAP, TANF, Child Care, Low Income Energy Assistance (LIEAP), and Child Support. The Department of Social Services is now planning an RFP for an integrated eligibility system (Opportunity ID 83922).

Washington – In 2013, the Washington State Legislature passed Senate Bill 5034, directing a study of the state’s medical and public assistance eligibility systems and infrastructure with the goal of simplifying procedures and reducing state expenditures. PCG was awarded the contract to conduct the Medical and Public Assistance Eligibility Study, which was published in September 2014. The state may continue to make efforts to modernize the medical and public assistance eligibility systems (Opportunity ID 104365).

Analyst’s Take

Now that 90/10 funding has been made permanent and the A-87 waiver is extended until December 2018, states will continue to make upgrades to eligibility systems, which could yield significant business opportunities for vendors. Deloitte is the dominant vendor in this space, currently holding contracts in more than 15 states. Other vendors holding contracts in multiple states include Accenture, IBM, Northrop Grumman, KPMG, HP, and Maximus. Contract values for eligibility modernization projects vary significantly based on the size of the state and scope of the project. Contracts for Medicaid eligibility modernization average between $20 million and $50 million, while IES projects that include major system overhauls can exceed $100 million.

Many states that have recently integrated health insurance program eligibility systems may now look to incorporate human services programs, starting the next wave of procurement activity. As Deltek continues to track upcoming eligibility projects, we encourage vendors to keep an eye on the above-mentioned projects and expect to see more eligibility-related opportunities thanks to this funding extension.

 

Will the Defense Inspector General Further Delay the DoD’s Migration to the Cloud?

Recently, the Department of Defense’s Office of the Inspector General published an audit report critical of the department’s efforts to implement its 2012 cloud computing strategy. The OIG cited material weaknesses in the execution of the strategy: acquisition personnel who procure cloud services were not adequately trained, cloud service broker management capabilities were not fully developed, DoD components failed to obtain proper waivers from review authorities to use non-DoD approved clouds, and the DoD CIO never developed a detailed written process for obtaining a cloud computing waiver. The OIG concluded that the department had put data at risk while also not reaping the cost savings benefits that cloud computing offers.

The DoDIG report is the latest in a series of similar reports from other government agencies that also revealed systematic flaws in efforts to leverage cloud computing.  Taken together, these audit reports point to the disruption that cloud computing is causing in federal agency information technology environments.  This disruption is not necessarily related to technology difficulties, although these are a concern.  Rather, it is related to weaknesses in the policy and governance processes that guide agency IT investments.  Cloud computing is creating challenges that agencies simply aren’t equipped to handle, a problem made worse by policies like Cloud First, which has forced agencies to dive into a technology for which they aren’t prepared.

In the DoD’s case, the lack of policy and governance oversight is particularly perplexing considering the glacial pace at which the Defense Department has moved toward using commercial cloud solutions. The DoDIG’s audit now threatens to bring that progress to a halt as the DoD CIO and defense components consider how to respond to the OIG’s recommendations. A real question at this point is whether they should bother to respond at all. Several elements of the OIG report are based on assumptions and policies that have changed considerably since the DoD Cloud Computing Strategy was released in June 2012. Take, for example, cloud brokering, which acting DoD CIO Terry Halvorsen has de-centralized. In addition to the Cloud Brokerage Project Management Office at the Defense Information Systems Agency, there are now cloud brokerages at the Army’s Program Executive Office Enterprise Information Systems and the Navy’s Space and Naval Warfare Systems Command.

Similarly, the DoDIG castigates the department for failing to implement an enterprise contract for commercial cloud services.  The goal of implementing an enterprise contract, however, no longer resembles the reality of the situation at the DoD.  When acting CIO Halvorsen gave the Services the ability to procure commercial cloud services he effectively eliminated the need for an enterprise level cloud contract.  In other words, the DoDIG calls the DoD CIO to account for not implementing an irrelevant procurement strategy.

None of this is to say that the DoD has somehow miraculously solved its challenges. There is indeed a pressing need for acquisition training and contract clauses that will ensure a proper level of cloud service and data security. Similarly, the waiver process, if it is to be retained, needs to be improved. In fact, it is unclear whether waivers will be necessary at all given DISA’s enhanced role as certifier of cyber security requirements for commercial providers. DISA’s imprimatur is effectively a waiver, if the commercial solution meets security requirements.

Improvements are needed, but one wonders whether it is counter-productive for the DoD CIO, DISA, and the Services to spend time addressing a critique that does not fit current conditions. The implementation of the Joint Information Environment addresses the use of commercial cloud solutions in a way that should assuage the cyber security concerns of the DoDIG. Furthermore, the de-centralization of cloud procurement is intended to eliminate the acquisition bottleneck at the DISA Cloud PMO while also reducing costs. These are solid steps toward removing barriers to commercial cloud use at the DoD. Will they be allowed to bear fruit or will they be suffocated by the weight of adherence to outdated policy demands?

 

Predictive Analytics Use at the Department of Defense

 

Back in September, an organization at the National Defense University called the Center for Technology and National Security Policy published a research paper entitled Policy Challenges of Accelerating Technological Change: Security Policy and Strategy Implications of Parallel Scientific Revolutions. Looking past the long title, one finds an in-depth consideration of the implications of emerging technologies for U.S. national security and the DoD. Considering the CTNSP is part of the defense establishment, I believe it is worth taking a few minutes to examine what the authors say, particularly since their comments fit seamlessly with the recently announced Defense Innovation Initiative (DII). Papers like this can point to areas of investment and in a time of falling budgets, any insight is welcome.

The report discusses more subjects than I can cover here, so in today’s post I’ll zero in on its comments about big data analytics. Use of big data analytics in the DoD is nothing new. In fact, based on recent contract spending data (see chart below), we can see that defense customers spent nearly $138 million on big data analytics over the five years from fiscal 2010 to fiscal 2014.

 

Big data analytics in this context are defined as advanced analytics programs offering visualization and modeling capabilities that enable statistics-based prediction/forecasting. Think Mathematica, MATLAB, Splunk, Statistica, Tableau, etc. and you have an idea of the programs included in this data.
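As a rough illustration of the statistics-based prediction/forecasting these programs support, the sketch below fits a linear trend to a yearly spending series and projects it one fiscal year forward. The dollar figures are hypothetical placeholders chosen only to sum to the $138 million five-year total cited above; they are not Deltek's actual per-year contract data.

```python
# Hypothetical annual obligations (in $ millions) for FY 2010-2014; the real
# per-year figures come from Deltek's contract-spending data and will differ.
spend = {2010: 20.0, 2011: 30.0, 2012: 42.0, 2013: 25.0, 2014: 21.0}

# Ordinary least-squares fit of a linear trend: spend ~ slope * year + intercept.
n = len(spend)
mean_x = sum(spend) / n               # mean of the fiscal years
mean_y = sum(spend.values()) / n      # mean of the spending figures
cov = sum((x - mean_x) * (y - mean_y) for x, y in spend.items())
var = sum((x - mean_x) ** 2 for x in spend)
slope = cov / var
intercept = mean_y - slope * mean_x

# A statistics-based forecast for the next fiscal year.
forecast_2015 = slope * 2015 + intercept
print(f"trend: {slope:+.2f} $M/yr, FY 2015 forecast: {forecast_2015:.1f} $M")
# → trend: -0.30 $M/yr, FY 2015 forecast: 26.7 $M
```

A real analysis would of course use richer models and the actual obligation data, but the mechanics – fit a model to historical spending, then extrapolate – are the same.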

According to the CTNSP report, employing these kinds of analytics on a vastly greater scale will be the key to controlling and exploiting the data that defense organizations will be gathering from the expansion of unmanned systems, robotics, and the Internet Protocol-enabled “Internet of Things.” The uses for such analytics include the analysis of intelligence data, cyber security, and the transition to a “health maintenance-based, rather than a disease-based medical model,” that will enhance the operational readiness of U.S. military personnel. The report’s recommendations have a clear implication – that the DoD should greatly ramp up its spending on predictive analytics and the training of its personnel to use them.

In recent years, however, just the opposite has been taking place. Examining the data presented above from the perspective of spending per fiscal year (see chart below), we see that defense spending on PA peaked at $42 million in FY 2012 and has declined since.

Undoubtedly the recent pressure placed on DoD’s budget by sequestration is the primary reason for reduced spending on PA. The question is whether this trend will continue. My guess would be no, for the simple reason that the DoD cannot afford to neglect developing its PA capabilities. To do so at a time when more data is coming at defense analysts than ever before would be folly. Add the increasing use of automated systems to the mix and the answer is obvious – the DoD must spend more on PA. Currently the department is in a period of retrenchment as it struggles with new budget realities. Once this retrenchment has run its course, defense customers are likely to turn their attention back to acquiring PA capabilities. The DII points the way forward in this respect and for industry partners it’s a welcome signpost of spending ahead.

 

Observations from TTC’s Internet of Things for Defense Symposium

The Department of Defense and U.S. federal law enforcement community are increasingly interested in what has come to be called the “Internet of Things.” Labeled the “IoT” for short, the Internet of Things consists of a growing network of small, low-power, low-bandwidth, low-cost sensors and devices that are connected to networks and which send and receive data. Think of the sensors that automatically turn on room lights or flush toilets and you have an idea of some of the uses for IoT technology. Additional uses for IoT technology, however, are about as varied as one can imagine. For example, the General Services Administration recently awarded a contract to IBM to outfit its facilities with sensor technology that will allow more efficient monitoring of energy use. Similarly, tiny sensors can be used to monitor jet engine performance, or just about any other structure in the world.

As many of the speakers at the Technology Training Corporation’s IoT symposium discussed, the DoD is eyeing sensor technology to determine how it might best be used.  There are even several use cases already in progress.  Rear Admiral Scott Jerabek, Director of Command, Control, Communications and Computer Systems, at U.S. Southern Command kicked off the symposium by listing a few of these uses in his area of responsibility.  Noting that USSOUTHCOM employs IoT technology in its GeoShare program for humanitarian assistance, Jerabek also explained that the Navy is investigating a “nano-satellite network,” in addition to developing a Deep Sea Web of low observable, wide area capabilities to track dark targets at sea.

Subsequent speakers, like Air Force CTO Frank Konieczny, detailed multiple other uses for IoT technology that the defense establishment is considering. These include:

  • Base Facilities Maintenance – trash pickup, light replacement, food replenishment
  • Vehicle management – maintenance prediction, location tracking
  • Secured, smart workplace – presence for workers integrated with facilities management
  • Logistics and transportation – inventory/tracking, automated assembly/packing, geo-location in supply chain
  • Robotics – autonomous drones and vehicles, sensor based maneuvering

Needless to say, the expansion of networks to everyday items carries with it tremendous risks as well as benefits.  Multiple speakers mentioned the need to build security protocols into IoT devices so that they could be resistant to hacking.  Enhanced network security will be necessary as well given the vast expansion of data that networks will be handling.  Advanced analytics for continuous monitoring will be required, but not only that, analytics will need to be deployed to make sense of all the data and make decisions based on it.  In short, IoT will render the already big data world in which we live even bigger.

Herein lie other challenges. Chief Warrant Officer 5 Ricardo Pina, Chief Technology Officer and Senior Technical Advisor to the Army CIO/G-6, pointed out that an organization like the Army currently does not have the network infrastructure required to handle the flow of data that an Army IoT would create. This is one of the primary factors driving the Army’s modernization of its networks using multi-protocol label switching (MPLS) technology. A standardized protocol will be required to enable seamless integration and use of IoT, and the DoD is betting that this standard will be Internet Protocol. Effectively, the new IP-based Joint Information Environment will enable the DoD to vastly expand its use of IoT technologies. This expansion will in turn drive investment in the analytics and any attendant services required for IoT implementation. Vendors, take note: the business opportunity in the area of IoT is growing, particularly among informed defense customers.

For more information on upcoming symposia, visit the Technology Training Corporation. I’ll see you at the one on Software-Defined Networking scheduled for December 9-10, 2014.

 

Federal Cybersecurity Market Forecast – Sustained Growth Continues

The federal cybersecurity market continues to grow, and we have just completed analysis that shows how much. Increasing threats, the rapid pace of technological change, and an increasing reliance on mobility, cloud computing, big data, and information sharing make information security critical for federal agencies. To address these challenges, agencies continue to invest in industry tools, technologies, and personnel services, and this will drive growth in the market segment over the next several years.

Taking a comprehensive perspective on the federal cybersecurity market, we see four major driving areas that continue to create demand for government-wide and agency budget investments:

  • Threat Drivers - Rapid rise in complex, diverse, persistent and morphing threats to networks, devices, data and other infrastructure.
  • Policy Drivers - Executive branch policies address wide areas of cyber across government and beyond. Stagnant legislation reflects diversity of opinion. Compliance policy bolsters spending on existing frameworks. RFP language is both driving and requiring security.
  • People Drivers - Challenge to find enough qualified cybersecurity professionals. Initiatives to cultivate internal government talent and “inherently governmental” roles will limit contractor addressability, but agencies that supplement by contracting will drive spending.
  • Technology Drivers - Threats and vulnerabilities drive direct technical remedies while new, disruptive technologies require security for full adoption.

Given these drivers, Deltek forecasts the demand for vendor-furnished information security products and services by the U.S. federal government will increase from $7.8 billion in FY 2014 to $10.0 billion in FY 2019 at a compound annual growth rate (CAGR) of 5.2%. (See chart below.)
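The CAGR figure can be sanity-checked with the standard compound-growth formula. Note that the rounded endpoints quoted here ($7.8 billion and $10.0 billion over five years) yield roughly 5.1%; the published 5.2% presumably reflects unrounded underlying figures.

```python
# Compound annual growth rate from the rounded endpoints in the text:
# $7.8B in FY 2014 growing to $10.0B in FY 2019, i.e. five compounding years.
start, end, years = 7.8, 10.0, 5

cagr = (end / start) ** (1 / years) - 1
print(f"CAGR: {cagr:.1%}")  # → CAGR: 5.1%
```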

Key Findings

There are several conclusions that we came to when reflecting on what we are observing across the federal information security environment and how the drivers above are impacting the market both now and going forward. Here are some of our key findings:

  • The continued rise in cyber incidents underscores what is at stake.
    • Threats span all areas of cyber – from within and from without.
    • Threat concerns impact all levels of the federal IT environment.
    • Persistent and diverse threats are driving risk-based approaches.
  • Policies and priorities are slow to evolve into effective security approaches.
    • The drive for security permeates multiple layers of federal policy, but there is a disconnect between compliance policies like FISMA and actual security, as revealed by the volume and type of security incidents.
    • Security considerations impact the broader tech and acquisition landscape.
  • Security efforts and posture are currently dependent on the availability and proficiency of skilled personnel.
    • Staffing levels and skill sets vary across government, driving sustained demand for industry support.
  • Technologies are seen as both security “gap-filler” and “gap-creator.”
    • One year into the CDM tools BPA, only marginal improvements have been seen.
  • Strong processes are needed to link technologies, approaches and personnel skill sets to maximize security posture.

Efforts among agencies to increase effectiveness, efficiency and economy like the joint DHS-GSA Continuous Diagnostics and Monitoring (CDM) program BPA are having some impact on how agencies are approaching cybersecurity and setting their spending priorities within their security budgets. Although the process of arriving at accurate and complete IT asset inventories that need to be secured and monitored is taking time, somewhat elongating the journey, we remain bullish that the priority of securing and protecting federal data and infrastructure will continue to drive significant market opportunity over the next five years.

Get more of our perspective in our latest report: Federal Information Security Market, FY 2014-2019.


DHS Cybersecurity Spending Trends Align with Personnel Challenges

Attracting and retaining skilled cybersecurity personnel is key for federal agencies in meeting their cybersecurity challenges, and this is especially true at the Department of Homeland Security. Yet DHS continues to make the news for its difficulty in retaining top staff and in hiring highly qualified people, especially for cybersecurity. A look at its cybersecurity spending data reveals what has been happening.

I previously looked at the media reports of morale and personnel retention issues at DHS that impact its cybersecurity mission, as well as some legislation that Congress has moved forward that may make it easier for DHS to hire cybersecurity staff in the future. This week I want to look at some of the IT security budget data that underscores the situation at the department – especially how much of DHS’s IT security spending goes toward security personnel versus security software and hardware solutions.

Hard data on what agencies spend on cybersecurity is not usually easy to find, and it can vary in its completeness and granularity. However, over the last several years OMB has released varying amounts of IT security budget data as part of its annual Federal Information Security Management Act (FISMA) report to Congress, which updates lawmakers on the progress and challenges agencies are facing. On a few occasions OMB has provided a breakdown of spending by personnel, security tools, training, and other areas.

To be sure, the share of IT security spending that a federal department devotes to security personnel varies agency by agency, as does the relative mix of government personnel to contracted personnel. Still, observing an agency’s total IT security personnel spending vis-à-vis its overall security budget can give a sense of the security landscape at the department. Stability or movement in that mix is often tied to specific priorities at the department; even when it is not, the mix can hint at what opportunities may exist.

DHS IT Security Spending

Based on the last several FISMA reports released by OMB, DHS’s reported IT security spending was stable from FY 2010 to FY 2011 and then saw significant yearly increases in FY 2012 and FY 2013. Over the same period, however, the amount DHS spent on security personnel actually dropped. (See chart below.) The result is that the percentage of total spending devoted to security personnel decreased at an accelerating rate as the two categories moved in opposite directions – total spending increased while personnel spending decreased.


But the story gets even more stark. For FY 2012, DHS reported to OMB that it employed just under 400 government IT security personnel, compared to more than 600 IT security personnel contracted from industry. While this government-to-contractor ratio itself is not unheard of (Treasury, Energy, and NASA have even larger spreads), the fact remains that DHS holds the predominant role in government-wide IT security, consistently receives the largest IT security budget among the civilian agencies, and is one of the most dependent on a contracted workforce to achieve its cyber mission.

Over the last several years, various members of DHS leadership have made well-publicized comments about the challenges of attracting and retaining cybersecurity personnel – hence the legislative push in Congress to help them. Yet the spending data suggests growing opportunity at DHS in areas that are not personnel-centric: cybersecurity solutions that put tools in the hands of the skilled people it has now, making them more productive and effective. Case in point: DHS’s spending on IT security tools increased from about $30 million in FY 2010 to nearly $300 million in FY 2012.

DHS will probably continue to struggle to build its cyber workforce for some time – with or without help from Congress. In the meantime, it will continue to need skilled people from industry to fulfill the mission, but to reach long-term sustainability and ultimate success it will need to look to ever-advancing security tools that leverage its people to maximum effect.

---
Originally published in the GovWin FIA Analysts Perspectives Blog. Follow me on Twitter @GovWinSlye.

Forecasting Federal Cloud Computing, 2014-2019: The Market Keeps Growing

When assessing the state of federal cloud computing, Deltek’s Federal Industry Analysis team sees seven major trends shaping cloud investments across most government agencies:

  • Data Governance/Management – Agencies lack policies and oversight processes to govern the hosting and use of their data in commercial cloud environments.
  • Security Concerns – There is a lack of consistency in agencies ensuring that commercial clouds are FISMA-compliant. Furthermore, FedRAMP certification is taking an average of 9 months, while receiving an agency authority to operate (ATO) takes only 4.
  • Investment Management – Agencies lack the tools and oversight processes necessary to manage their cloud investments, determine costs, and calculate ROI.
  • Shared Services – As part of evolving common operating environments, agencies are taking a hard look at the cloud as a solution for reaching shared services goals on both inter- and intra-agency levels.
  • Mission Complexity – Agencies with complex and classified missions are finding that different types of missions require different types of cloud solutions and configurations.
  • DOD Loosens the Reins – In an act anticipated to greatly increase cloud spending, acting DoD CIO Terry Halvorsen has revised DoD policy to allow the military departments to procure cloud solutions on their own vs. exclusively through the Defense Information Systems Agency.
  • NIST Charts Roadmap Forward – In October 2014, the National Institute of Standards and Technology released its long-awaited US Government Cloud Computing Roadmap, recommending standards that will impact agency cloud procurement.

Taking these drivers into consideration, Deltek forecasts that federal agency demand for vendor-provided cloud computing services will increase from $2.45 billion in FY 2014 to $6.5 billion in FY 2019, a compound annual growth rate (CAGR) of 21%. (See chart below.)
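As a quick sanity check, the implied growth rate can be recomputed from the two endpoint figures cited above. This is a back-of-the-envelope sketch using only the rounded numbers in this post, not Deltek’s forecasting methodology:

```python
# Endpoints cited in the post: $2.45B in FY 2014 growing to $6.5B in FY 2019.
start, end, years = 2.45, 6.5, 5  # billions of dollars; FY 2014 -> FY 2019

# CAGR = (end / start) ** (1 / years) - 1
cagr = (end / start) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # ~21.5%, consistent with the cited 21%
```

The rounded endpoints reproduce the stated ~21% growth rate almost exactly.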

 

Key Findings

FIA’s research reveals a number of important conclusions concerning how agencies are handling the transition to cloud solutions:

  • Cloud spending in the Civilian buyer segment continues to accelerate, but recent Inspector General audit findings show that Civilian agencies have been moving to the cloud more aggressively than they have reported – and more aggressively than they are prepared for.
  • Defense spending on and contract awards for cloud computing peaked in fiscal 2012. This spending has declined since the DoD CIO’s enterprise Cloud Computing Strategy was released.
  • Generally speaking, DoD spending on cloud has focused primarily on making defense systems cloud ready.
  • Investment in Platform-as-a-Service continues to significantly lag behind IaaS and SaaS.
  • Multiple award contracts and contract vehicles like Alliant continue to be the most popular methods of procuring cloud services.

All in all, the purchase of commercial cloud services by federal agency customers continues to evolve slowly but surely. Still, several trends pointing to challenges with cloud adoption are expected to hamper rapid growth through fiscal 2016. Come fiscal 2017, however, agency spending on cloud is expected to rise considerably.

Learn more of our perspective in our latest report: Federal Update: Cloud, Data Center, Big Data, and Mobility, 2014-2019.

Follow me on Twitter @govwinfiaalex.

 

2020 Census Continues to Hit Planning Hurdles

The Department of Commerce’s Census Bureau has big plans for the upcoming decennial census, aiming to turn over a new leaf around costs and sustainability. Progress over the last year, however, shows the effort is encountering many of the same old problems.

Background

The constitutionally mandated decennial census conducted by the Census Bureau determines the allocation of billions of dollars in federal funds to states and realigns the boundaries of Congressional districts. Since 1970, the cost of enumerating each household has climbed from around $16 to around $98 in 2010. Over that same period, the mail response rate dropped from 78 percent to 63 percent. The 2010 census was the costliest in U.S. history, weighing in at $13 billion. Throughout its strategic plans and program documents, the Department of Commerce continues to reiterate its commitment to conducting the 2020 Census at a lower cost per housing unit than the 2010 Census. The Government Accountability Office (GAO) found issues with managing, planning, and implementing IT solutions for both the 2000 and 2010 enumerations – issues that contributed to acquisition problems and cost overruns.
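To put those per-household figures in annualized terms, a quick back-of-the-envelope calculation (using only the rounded numbers cited above) shows how steep the climb has been:

```python
# Per-household enumeration cost cited above: ~$16 in 1970 -> ~$98 in 2010.
cost_1970, cost_2010 = 16.0, 98.0
years = 2010 - 1970

multiple = cost_2010 / cost_1970
annualized = multiple ** (1 / years) - 1
print(f"{multiple:.1f}x over {years} years, ~{annualized:.1%} per year (nominal)")
```

That is roughly a sixfold nominal increase, about 4.6 percent per year. Some of that is inflation, but the trend is steep enough on a per-household basis to explain the Bureau’s cost-containment push.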

Planning Progress and Missteps

In September 2013, the GAO released a progress report on efforts to contain enumeration costs. The Census Bureau had launched a number of cost-saving modernization initiatives in advance of the 2020 Census, targeting changes like establishing enterprise standards and tools for IT management and reducing duplicative investments. The GAO found that the Census Bureau had made progress toward long-term planning, but the roadmap for the 2020 census still lacked milestones for key decisions and cost estimates. The study also determined that while the Census Bureau had identified some cost-saving approaches, implementing these practices for the first time in the 2020 census carries operational risk. The Census Bureau estimates that it could save up to $2 billion by using administrative records in 2020 to decrease the need for door-to-door visits. In particular, offering an internet self-response option stands to improve response rates and lower costs by reducing the number of households visited by field staff.

Later in 2013, the GAO reported that the Census Bureau was not producing reliable schedules for the 2020 Research and Testing program and the Geographic Support System Initiative – the two programs most relevant to constructing the master address file for the 2020 census. In both cases, activities were omitted from the schedules, which could lead to unforeseen delays. And while many activities were linked sequentially, others lacked their preceding and following activities, so the schedules were not producing accurate dates – which would interfere with determining whether the work was on schedule.

In the spring of 2014, a review of the 2020 Decennial Census program found that the Census Bureau had made progress in researching and testing IT options, but several of the supporting projects lacked schedules and plans. The absence of this scheduling data raises uncertainty about whether the projects will be completed in time to support the operational design decision slated for September 2015. Four of the six IT research and development projects did not have finalized schedules, and the two projects that did were not estimated to be completed until after the September 2015 design decision date. The same review found inconsistencies in the risk assessments and mitigation plans for the associated IT options.

By the summer of 2014, the Department of Commerce’s Inspector General had released an assessment of the Census Bureau’s mandatory budget cuts. The review determined that inaccuracies in cost information made it impossible to determine the impact of budget reductions. Further, the OIG found that internal practices increased the risk of incorrect or fraudulent charges. Four recommendations were issued for improvement, all centering on processes and procedures for the program’s oversight. In addition to agreeing to the recommendations, the Census Bureau acknowledged the “deficiency in the completeness” of its documentation around budget estimates and financial decisions.

This fall, the Census Bureau’s efforts to identify data sources for address and mapping requirements were examined. Once again, inconsistencies were found in the research and planning processes. These gaps included failing to document cost and quality information for decisions to use address and mapping data from state and local governments, other agencies, and a commercial vendor. Additionally, management approval for the data source decisions was not documented, which presents accountability and transparency issues for future sourcing decisions.

Going Forward

Census has just completed the research and testing phase of the program, which ran from FY 2012 to FY 2014. The next phase runs from FY 2015 to FY 2018 and will incorporate operational development and systems testing. FY 2015 activities are expected to focus on supporting research and testing infrastructure. Projects slated for this fiscal year include automation, workload management, Bring Your Own Device (BYOD) incremental development and research, enumeration system reuse, geographic program planning, IT security, and a virtual office computing environment.

In October, the Census Bureau released a request for information (RFI) as part of its market research into sources for the Integrated Communications Program. Responses were due by the end of the month, and as of mid-October no decision on an acquisition strategy had been made. The mandated execution of this program – and the expanding use of IT to execute the 2020 Decennial Census and establish a sustainable model for future enumerations – gives contractors a number of elements to watch. In addition to the potential business opportunities associated with the census, the convergence of technologies like mobile computing, cloud environments, and data analytics will be a practical test of government technology adoption.

----------------------------------

Originally published in the GovWin FIA Analysts Perspectives Blog. Follow me on Twitter @FIAGovWin.

 
