Emerging Federal Technology Markets – Areas to Watch

Can technological innovation drive federal IT investments, even in the midst of budget pressures? Absolutely. This is what we explore in our latest report on Emerging Federal Technology Markets.

Under long-term pressure to “do more with less,” federal agencies are leveraging current trends in federal IT – cloud, wireless networks, IPv6, and virtualization – to gradually adopt new technologies that enable cost savings and the more efficient use of IT resources. Some of my colleagues and I took a look at how these and other technologies are shaping federal IT investments today and in the future.

Federal Investments in Foundation Technologies Will Drive Emerging Markets

Technological change is affecting federal agencies across the board. Sensor technologies are being introduced to track facility energy consumption and enhance physical security, while software-defined infrastructure is being explored to eliminate bottlenecks that result from stovepiped systems and the growing volume of data. Machine learning technology is being tested to create “smart” networks that rely less on human administration. Tying it all together are predictive analytics, which agencies are using for a growing number of purposes, from forecasting network performance and enhancing cyber security to ferreting out waste, fraud, and abuse. The result is that today’s investments set the stage for tomorrow’s capabilities. (See graphic below.)


[Graphic: Key market factors shaping the federal IT landscape]

Some of the major drivers and key findings from our research include:

  • The drive to leverage sensor technologies, and the data analytics they enable, is a major force behind agency network modernization efforts like the DoD’s Joint Information Environment. The pace of sensor-based innovation is tied to the success of these efforts.
  • Software-Defined Infrastructure (SDI) is more pervasive than generally believed, particularly at agencies with highly evolved Infrastructure-as-a-Service offerings.
  • Federal interest in SDI is not hype; it is a genuine trend with a growing number of current and planned use examples across federal agencies.
  • The use of predictive analytics programs has expanded significantly across the federal government since FY 2010, making it a maturing, though niche, technology that is expected to have continued strong growth.
  • The inclusion of predictive analytics as an offering on GSA’s Alliant 2 and, potentially, NS2020 government-wide contracts should help it become regarded less as an exotic technology and more as a standardized commercial-off-the-shelf solution.

The modernization of agency IT environments is opening the door to future investment in emerging technologies. The convergence of agencies’ work on expanding wireless networks, deploying standardized commodity hardware, and engineering Internet Protocol-based transport networks is enabling the introduction of new sensor technologies and software-based capabilities. The net effect of emerging technology adoption will be greater efficiency and security in agency IT environments.

To get our full perspective on Emerging Federal Technology Markets, read the full report.

---
Originally published for Federal Industry Analysis: Analysts Perspectives Blog. Stay ahead of the competition by discovering more about GovWin FIA. Follow me on Twitter @GovWinSlye.

Agencies Continue to Trudge Toward Data Center Consolidation and Strive for Mobility

As part of agencies’ ongoing efforts to reduce costs and gain efficiencies, they are continuing to pursue data center consolidation and optimization.  At the same time, they are implementing mobile applications and environments for the convenience of constituents and employees.

Deltek’s new Federal Update:  Cloud, Data Center, Big Data and Mobility, 2014-2019 report delves into the status of federal data center consolidation and mobility efforts, as well as cloud computing and big data. 

The evolution of the Federal Data Center Consolidation Initiative (FDCCI) resulted in a shift in focus from consolidation to optimization of core data centers, which helps agencies focus resources on the investments and strategies with the highest return. However, measuring success and ROI is complex and challenging, and has resulted in mixed information regarding progress.

The number of identified data centers continues to grow and change with the shifting definition of a data center. At the start of the FDCCI in 2010, agencies identified approximately 2,000 data centers, but over time, with more detailed inventorying efforts and a change in the size threshold for what counts as a data center, agencies have now identified over 9,600 data centers. Of these, 242 are considered core data centers, and agencies plan to close an additional 3,127 non-core data centers.

A September GAO report highlighted agency cost savings and avoidance due to data center consolidation. Nineteen of the 24 agencies participating in the FDCCI reported achieving an estimated $1.1 billion in cost savings and avoidance between FY 2011 and FY 2013. Twenty-one agencies collectively reported an additional $2.1 billion in planned cost savings and avoidance by the end of FY 2015, for a total of $3.3 billion by FY 2015, $300 million above OMB’s original $3 billion goal. Between FY 2011 and FY 2017, agencies plan to achieve total cost savings or avoidance of $5.3 billion.
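
For clarity, the arithmetic behind those figures works out as in the quick sketch below, using the reported, rounded values (because the inputs are rounded, they may not reconcile exactly with the underlying GAO data).

```python
# Worked arithmetic for the GAO consolidation savings figures cited above.
# All values are in $ billions and use the rounded numbers as reported.
total_by_fy15 = 3.3   # achieved FY11-FY13 ($1.1B) plus planned through FY15 ($2.1B, rounded)
omb_goal = 3.0        # OMB's original FDCCI savings target
total_by_fy17 = 5.3   # planned cumulative savings/avoidance, FY11-FY17

print(f"Margin above OMB goal: ${(total_by_fy15 - omb_goal) * 1000:.0f}M")         # ~$300M
print(f"Additional savings planned for FY16-FY17: ${total_by_fy17 - total_by_fy15:.1f}B")  # $2.0B
```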

But signals are mixed regarding progress. GAO stated that savings and avoidance calculations were not consistent among agencies, and some agencies had difficulty determining or uncovering baseline spending. A MeriTalk survey of federal IT managers showed that 52% of respondents rated their agency’s FDCCI efforts at a “C” or below, while only 6% gave their agency an “A.” Seventy-two percent said that the number of data centers in their agency had stayed the same or increased since the FDCCI launched in 2010.

Mobility programs in the federal government continue to advance, driven by cloud platforms and Virtual Desktop Infrastructure (VDI), although securing mobile devices and data remain areas of focus.  Continued expansion in the use of mobile communications and computing solutions is expected to rely on accelerating network modernization and optimization, including the use of Internet Protocol and cloud-based technologies.

The purpose of the Federal Mobility Strategy is to accelerate the federal government’s adoption of mobile technologies to improve delivery of government information and services, engage citizens more fully, reduce the costs of government operations, and increase federal workforce productivity.  Much of federal agencies’ mobile computing direction is driven by the Federal Digital Strategy, released in May 2012.  Its goal was to move agencies to an enterprise-wide model for procurement of mobile devices and services.  The strategy outlined objectives aimed at taking a baseline of current mobile assets and usage, followed by the consolidation and streamlining of purchasing, invoicing, and asset management.

Top trends in federal mobile computing include resolving security issues, expanding mobile applications, and leveraging enterprise-wide contracts. Security concerns run the gamut from employee mobile device security training to authentication, encryption, and application vetting and management. Application rationalization and the need for more self-serve citizen services are driving agencies to expand the roster of applications made available via mobile devices. The flexibility of carrier reach, pooling, bundling, and similar arrangements has fueled demand for enterprise-wide contracts while driving cost efficiencies.

Contractors targeting data center consolidation should focus on optimization, look to existing customers for opportunities, and take advantage of the variety of contract vehicles at their disposal to fulfill requirements. Contractors looking to address the mobility market should work with agency customers to assess infrastructure readiness for mobile computing, promote key security features of their offerings, and develop mobile applications designed specifically for federal government challenges.


2020 Census Continues to Hit Planning Hurdles

The Department of Commerce’s Census Bureau has big plans for the upcoming decennial census, aiming to turn over a new leaf around costs and sustainability. Progress over the last year, however, shows the effort is encountering many of the same old problems.

Background

The constitutionally mandated decennial census conducted by the Census Bureau determines the allocation of billions of dollars in federal funds to states and realigns the boundaries for Congressional districts. Since 1970, the cost of enumerating each household has climbed from around $16 to around $98 in 2010. Over that same period, the mail response rate dropped from 78 percent to 63 percent. The 2010 census was the costliest in U.S. history, weighing in at $13 billion. Throughout its strategic plans and program documents, the Department of Commerce continues to reiterate its commitment to conducting the 2020 Census at a lower cost per housing unit than the 2010 Census. The Government Accountability Office (GAO) found issues with managing, planning, and implementing IT solutions for both the 2000 and 2010 enumerations. These issues contributed to acquisition problems and cost overruns.

Planning Progress and Missteps

In September 2013, the GAO released a progress report on efforts to contain enumeration costs. The Census Bureau launched a number of cost-saving, modernization initiatives in advance of the 2020 Census. These efforts targeted changes like establishing enterprise standards and tools for IT management and reducing duplicative investments. The GAO found that the Census Bureau had made progress toward long-term planning, but the roadmap for the 2020 census still lacked decision milestones and cost estimates. The study also determined that while the Census Bureau identified some cost-saving approaches, implementing these practices for the first time in the 2020 census carries operational risk. The Census Bureau estimates that it could save up to $2 billion by using administrative records in 2020 to decrease the need for door-to-door visits. In particular, offering an Internet self-response option stands to improve the response rate and lower costs by reducing the number of households visited by field staff.

Later in 2013, the GAO reported that the Census Bureau was not producing reliable schedules for the 2020 Research and Testing program and the Geographic Support System Initiative, the two programs most relevant to constructing the master address file for the 2020 census. In both cases, activities were omitted from the schedule, which could lead to unforeseen delays. And while many activities were linked sequentially, others lacked links to preceding and following activities. As a result, the schedules were not producing accurate dates, which interferes with determining whether the work is on track.

In the spring of 2014, a review of the 2020 Decennial Census program found that the Census Bureau had made progress in researching and testing IT options, but several of the supporting projects lacked schedules and plans. The absence of this scheduling data raises uncertainty about whether the projects will be completed in time to support operational design planning slated for September 2015. Four of the six IT research and development projects did not have finalized schedules. As for the two projects that did have completed schedules? They were not estimated to be completed until after the September 2015 design decision date. The same review found inconsistencies with the risk assessments and mitigation plans for the associated IT options.

By the summer of 2014, the Department of Commerce’s Inspector General released an assessment of the Census Bureau’s mandatory budget cuts. The review determined that cost information inaccuracies made it impossible to determine the impact of budget reductions. Further, the OIG found that internal practices increased the risk of incorrect or fraudulent charges. Four recommendations were issued for improvements, all centering on processes and procedures for the program’s oversight. In addition to agreeing to the recommendations, the Census Bureau acknowledged the “deficiency in the completeness” of its documentation around budget estimates and financial decisions.

This fall, the Census Bureau’s efforts to identify data sources for addressing and mapping requirements were examined. Once again, inconsistencies were found in the research and planning processes. These gaps included failing to document cost and quality information for decisions to use address and mapping data from state and local governments, other agencies, and a commercial vendor. Additionally, management approval for the data source decisions was not documented, which presents accountability and transparency issues for future sourcing decisions.

Going Forward

Census has just completed the research and testing phase of the program, which ran from FY 2012 to FY 2014. The next phase runs from FY 2015 to FY 2018 and will incorporate operational development and systems testing. FY 2015 activities are expected to focus on supporting research and testing infrastructure. Projects slated for this fiscal year include automation, workload management, Bring Your Own Device (BYOD) incremental development and research, enumeration system reuse, geographic program planning, IT security, and a virtual office computing environment. In October, the Census Bureau released a request for information (RFI) as part of its market research into sources for the Integrated Communications Program. Responses were due by the end of the month, and as of mid-October no decision on an acquisition strategy had been made. The mandated execution of this program, and its expanding use of IT to carry out the 2020 Decennial Census and to establish a sustainable model for future enumerations, gives contractors a number of elements to watch. In addition to the potential business opportunities associated with the census, the convergence of technologies like mobile computing, cloud environments, and data analytics will be a practical test of government technology adoption.

----------------------------------

Originally published in the GovWin FIA Analysts Perspectives Blog. Follow me on Twitter @FIAGovWin.


AppVet Honored for Expediting Mobile Apps to the Military

GCN honored NIST and its AppVet project this week at the 2014 GCN Awards Gala for speeding the deployment of mobile hardware and software to warfighters.

Ten government IT projects were selected for GCN’s IT Achievement Awards based on their impact on the agency or public, the quality of the project leadership team, and the level of innovation found in the technology plan. According to PV Puvvada, acting president of Unisys Federal Systems and one of eight judges, this year’s awardees rose to the top because they “focused their attention on achieving mission outcomes for stakeholders and not just on the milestones to get there.”

In the case of AppVet, NIST was approached by DARPA for help deploying commercial mobile devices and applications to the field more quickly. The challenge was to quickly vet the security and reliability of the hardware and software for military needs on a large scale.  Prior to AppVet, military personnel in the field had to use costly, heavy communications equipment with limited functionality, or acquire and test commercial devices that were outdated by the time they could be put to use.

NIST led a multi-organizational effort that developed innovative methods for security testing and evaluation of hardware and software, enabling the secure deployment of off-the-shelf smartphones and applications in military field operations. They created a framework of software assurance methodology, power and reliability analysis techniques, and standards-based cryptographic solutions. AppVet is available as a free open source download. The first users were in DoD, which has used it to securely deploy more than 3,000 commercial smartphones on the battlefield.

AppVet was developed under DARPA’s TransApps program by a team headed by NIST and including George Mason University.  The project began in early fiscal 2012, with the framework becoming operational that year. Funding ended in April 2014.

Because AppVet is a simple, open-source web service and framework for vetting mobile applications, it can be used across government and by commercial developers.
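
Because the framework is exposed as a web service, app submissions and status checks can be scripted into a provisioning pipeline. Below is a minimal sketch of what such a client workflow might look like; the base URL, command names, parameters, and status strings are illustrative assumptions for a hypothetical deployment, not AppVet’s documented API.

```python
# Hypothetical sketch of scripting an AppVet-style vetting workflow.
# The base URL, command names, parameters, and status strings below are
# illustrative assumptions, not AppVet's documented API.
import time

import requests

BASE_URL = "https://appvet.example.gov/appvet"  # assumed deployment URL


def submit_app(apk_path, username, password):
    """Upload an app binary for vetting; assume the service returns an app ID."""
    with open(apk_path, "rb") as f:
        resp = requests.post(
            BASE_URL,
            data={"command": "SUBMIT_APP", "username": username, "password": password},
            files={"file": f},
            timeout=60,
        )
    resp.raise_for_status()
    return resp.text.strip()


def wait_for_assessment(app_id, username, password, poll_secs=30):
    """Poll until the vetting tools report a final (assumed) risk status."""
    while True:
        resp = requests.get(
            BASE_URL,
            params={"command": "GET_APP_STATUS", "appid": app_id,
                    "username": username, "password": password},
            timeout=60,
        )
        resp.raise_for_status()
        status = resp.text.strip()
        if status in ("LOW_RISK", "MODERATE_RISK", "HIGH_RISK", "ERROR"):
            return status
        time.sleep(poll_secs)
```

A scriptable pipeline along these lines is what makes large-scale deployment practical: every candidate app can be run through a battery of analysis tools before it ever reaches a fielded device.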


Mobility a must for state social services programs

Over the last 2-3 years, state governments have seen an increase in mobile traffic to health and human services websites. A recent Government Technology article highlighted this trend, noting that half of all traffic to Georgia’s child support website in 2012 came from mobile phones. Similarly, one in three visits to the New Jersey child support website in 2013 came from a mobile device. In response, many state IT departments are adopting “mobile first” strategies to ensure that information and benefits are easily accessible via mobile technologies, which are often the sole source of Internet access for state residents.
 
Several states have pioneered smartphone apps for social services programs, which participants can download to quickly get information about their benefits or to access additional resources. New Jersey and California have developed child support phone apps to help recipients manage child support accounts on the go. Georgia has developed Quickwic, an app for WIC participants who want instant access to their benefits and information about eligible purchases. The Connecticut Health Insurance Marketplace, Access Health CT, developed a smartphone app that makes it easier for residents to browse healthcare plans and submit applications, even allowing residents to take photos of their verification documents and upload them to their account.
 
Deltek predicts an increase in user-friendly, mobile-enabled Web applications that make it easier for both caseworkers and constituents to access the information and resources they need. Vendors who emphasize mobile-first strategies or the importance of mobile-friendly software applications will stand out to state governments looking for innovative health and social services solutions. Pennsylvania announced that it is considering a mobile app for WIC payments, and many other states are looking for ways to make their health and social services programs more mobile friendly. To learn more about upcoming health and human services IT business opportunities, be sure to visit the State & Local Vertical Profiles for Health Care and Social Services. Not a Deltek subscriber? Click here to learn more about Deltek's GovWin IQ service and gain access to a free trial.

Justice to Streamline IT Buying through Service Broker

In the coming year, the Justice Department will join the ranks of agencies leveraging service broker arrangements for acquisition of IT infrastructure and services.

In recent years, the Department of Justice (DOJ) has advanced efforts to consolidate contracts, reducing redundant acquisition efforts and improving enterprise capabilities. Some of these initiatives began as informal strategic sourcing efforts. The department has actively leveraged Enterprise License Agreements (ELAs) and Blanket Purchase Agreements (BPAs) to achieve cost savings, and the majority of its mobile device and wireless services have been consolidated through several contract vehicles. By leveraging strategic sourcing and shared services for wireless and telecom needs, DOJ can lower equipment expenditures by moving to contracts with the best negotiated prices.

Now, it seems the Justice Department is taking the next step by pursuing a service broker model. Other federal agencies that have adopted this model include the Defense Department and the National Nuclear Security Administration (NNSA). These broker arrangements allow agencies to identify solutions for common requirements and simplify technology buying within their organizations.

According to recent reports, DOJ expects to target infrastructure and commodity IT services initially. These technologies would include wide area network (WAN), data centers, storage, email, telecommunications, security, and Trusted Internet Connection (TIC) services. The “next tier” of services to be addressed, according to Justice CIO Klimavicz, covers business enterprise services, such as voice and collaboration.

The decision to formally adopt service brokerage aligns with the department’s strategic plans and technology initiatives. In 2012, Justice established ten commodity-area working groups focused on IT functions like data centers, email, and mobility. These groups provide recommendations to the DOJ CIO Council to address commodity investment areas, identify potential for consolidation and cost savings, and manage milestones and performance metrics.

DOJ’s near-term information resource planning highlights five goals: institutionalizing IT portfolio management, streamlining operations, enhancing IT security, delivering innovative solutions, and expanding information sharing. The shift to centralized delivery of IT capabilities, such as multi-component (enterprise) IT services and enterprise platforms, is expected to drive greater value than siloed solutions. Ongoing assessments and continuous enhancement of existing IT assets and vendor relationships will improve the value of the IT portfolio by weighing the risks of adopting new technologies too soon against those of sustaining legacy technology for too long.

Brokerage would facilitate increased use of shared services, enable enterprise capabilities, and consolidate departmental purchasing power to improve pricing through strategic sourcing. The Department of Justice’s vision for strategic sourcing has led to the establishment of a Vendor Management Office (VMO) targeting improvement of buying practices for IT infrastructure. The VMO will lead efforts to analyze procurement data, to identify best practices, and to centralize enterprise procurement vehicles.

As with other federal markets being reshaped by strategic sourcing, vendors will need to be increasingly mindful of market positioning. IT spending will be increasingly directed through agencies’ strategic sourcing and preferred contract vehicles, and that shift will constrain spending as government organizations look to achieve economies of scale for commodity IT purchases. The establishment of Vendor Management Offices means contractors can expect increased oversight and a greater need to partner smartly.

----------------------------------

Originally published in the GovWin FIA Analysts Perspectives Blog. Follow me on Twitter @FIAGovWin.


Cloud Spending on VA’s T4 Contract Vehicle

Over the last month I’ve been posting a series of brief analyses of cloud spending on some of the federal government’s largest task order contract vehicles. So far this series has focused on Government-Wide Acquisition Contracts like Alliant and on Blanket Purchase Agreements like GSA’s cloud BPAs. This week I’ll narrow the focus to an agency-specific multiple award contract, the Department of Veterans Affairs’ Transformation Twenty-One Total Technology (T4) vehicle. Since its inception in 2011, T4 has assumed an increasingly important role in VA’s acquisition of information technology support services. In fact, with 23% of overall agency IT spending going through T4 annually, one could say that T4 has become the go-to procurement vehicle for VA customers.

Within this context, cloud computing has assumed an increasingly central role in the VA’s IT environment, a development reflected in cloud spending on T4. Since fiscal 2010 the VA has awarded cloud computing contracts with an overall value of $189 million. This figure comes from data I keep for Federal Industry Analysis clients, and it is not comprehensive; a lot of work is going on behind the scenes that I haven’t been able to capture due to a lack of reporting by the VA. Nevertheless, I believe the $189 million figure captures a good portion of the awards made so far.

Of this total, cloud contract awards made via T4 add up to $153 million, the lion’s share of the work. VA customers have awarded an additional $29 million in cloud contracts via GSA’s Schedule IT 70, and just over $1 million apiece via sole source and set-aside awards, so you can see just how central T4 is to cloud procurement at VA.
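
For a rough sense of just how lopsided that distribution is, here is a back-of-the-envelope calculation using the rounded figures above. (The categories fall slightly short of the $189 million total because of rounding and awards not itemized here.)

```python
# Back-of-the-envelope breakdown of VA cloud awards by contract vehicle,
# using the rounded figures cited above (values in $ millions).
awards = {
    "T4": 153,
    "GSA Schedule IT 70": 29,
    "Sole source": 1,
    "Set-aside": 1,
}
TOTAL = 189  # overall VA cloud awards since FY 2010

for vehicle, value in awards.items():
    print(f"{vehicle}: ${value}M ({value / TOTAL:.0%} of the total)")
# T4 alone comes out to roughly 81% -- the lion's share noted above.
```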

How does this break down by project?  Here are the top ten programs by total award value.

  1. Cloud Computing for Mobile Device Management and Mobile Application Environment - $49M
  2. Mobile Infrastructure-as-a-Service - $34M
  3. Cloud Computing Services - $28M
  4. Migration and Cloud Hosting Services for the My HealtheVet Application - $14.5M
  5. Veterans Relationship Management CRM Expansion Hosted Cloud Services - $13M
  6. Mobile Applications Collaborative Environment and Device Manager - $9M
  7. Cloud Hosting of Mobile Applications Collaborative Environment and Device Management - $9M
  8. VA for Vets Program Cloud Computing Support - $8M
  9. Voice-as-a-Service Project Support - $7M
  10. Turnkey Cloud Computing Environment to Support the MI 7 New Model of Care HRA - $5M

As this list shows, the VA has been investing in cloud computing in two areas in particular: mobile communications services/capabilities and health IT. These are of course related, as the infrastructure and capabilities for mobile communications will enable access to and use of health IT applications on mobile devices. In this sense, the VA has been using T4 for precisely the purpose for which it was developed: to speed the acquisition of core technologies central to fulfilling the agency’s mission. T4 is worth watching as it progresses through its lifecycle, because if the VA continues to demonstrate successful use of the vehicle to accomplish its technology goals, it will serve as an example for other agencies seeking to establish multiple award contracts (MACs) tailored to their own specific goals.

Labor Wants Platform for Mobile Computing and Open Data

The Department of Labor is requesting additional IT funds for FY 2015 to develop a platform for mobile computing and open data, and to build on its ongoing IT modernization initiative.

The Digital Government Integrated Platform (DGIP) is designed to complete another set of prerequisite and basic IT capabilities department-wide, including VoIP, VTC, and wireless access in Labor’s national, regional, and field office locations. The platform would be used by all agencies at Labor to build and deploy mobile computing and data sharing applications. The FY 2015 Exhibit 53 IT budget request shows Labor asking for $9 million in FY 2015 for the effort.

However, the fate of the initiative rests with the Labor, Health and Human Services, Education and Related Agencies spending bill which could be stalled for months.  No draft of the bill is available and no markup has been scheduled.


Labor strives to promote the welfare of job seekers, wage earners, and retirees in the US, but in recent years it has been somewhat hampered by a lack of new IT tools. Despite the challenges, Labor staff members have been able to launch several new initiatives. Earlier this year, the department successfully moved 17,000 employees to a cloud-based e-mail environment.


FCW recently spoke with Labor’s Deputy CIO Dawn Leaf about the DGIP effort. According to Leaf, ”…using the platform would save us about 50 percent, than if the agencies went out and built it and bought it individually." The majority of cost savings will come from the reduced need for systems integration. "Sometimes people think because you're using cloud that it's simpler, but it's not really. Your external cloud provider will have requirements, all of your suppliers will have requirements and you'll have to integrate them." Savings from DGIP will come from deploying the platform department-wide rather than individually at each agency, cutting down on integration costs.


Leaf stated that the initial design stage of the platform is complete. “The hardest part might be over – assuming funding is made available,” she said.


DoD Cloud Innovation: Research on Cloudlets

The Department of Defense’s efforts to utilize commercial cloud solutions over the last few years have received a decent amount of attention in the trade press and on the conference circuit.  The reporting tends to evaluate the DoD’s use (or non-use, as the case may be) of the cloud from the perspective of a standard commercial business use-case, meaning DoD customers are expected to either identify applications to migrate, solicit the work, and migrate the app to a commercial hosting solution, or to purchase a capability as a service from a commercial provider.  It is against these standard approaches to cloud computing that the DoD’s efforts have been judged.  Cloud innovation at the DoD, however, is often more diverse and exploratory than industry is led to believe.  This and next week’s posts will examine two examples of innovative cloud use in the DoD in an effort to show that there can be business opportunities for vendors beyond the threshold of “ordinary” use-case expectations.

Mobile Cloudlets

The first area of innovation is in mobile cloudlets. What’s a mobile cloudlet? Good question. Cloudlets are an approach to cloud computing in connection-challenged environments that is being pioneered by researchers at Carnegie Mellon University’s Software Engineering Institute. As explained by Grace Lewis, a Senior Member of the Technical Staff at the SEI, “cloudlets … are lightweight servers running one or more virtual machines [that] allow soldiers in the field to offload resource-consumptive and battery-draining computations from their handheld devices to nearby cloudlets. This architecture decreases latency by using a single-hop network and potentially lowers battery consumption by using WiFi instead of broadband wireless.” This approach, which takes advantage of both cloud computing and mobile technology, provides mission capabilities more effectively to military personnel, and, potentially, law enforcement and first responders, in difficult environments where connectivity may be lacking.
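
As a rough illustration of the pattern Lewis describes, the sketch below shows the offload decision a cloudlet-enabled handheld client might make. The discovery probe, server address, and wire protocol are hypothetical stand-ins, not the SEI’s actual implementation.

```python
# Minimal sketch of the offload decision in a cloudlet architecture:
# a handheld runs a computation on a nearby single-hop cloudlet when one
# is reachable, and falls back to local processing otherwise. The
# discovery mechanism and endpoint below are illustrative assumptions.
import socket

CLOUDLET_ADDR = ("cloudlet.local", 9000)  # assumed single-hop WiFi server


def cloudlet_reachable(timeout=0.5):
    """Cheap reachability probe; a real client might use mDNS discovery."""
    try:
        with socket.create_connection(CLOUDLET_ADDR, timeout=timeout):
            return True
    except OSError:
        return False


def process_locally(payload):
    """Battery-draining fallback computation on the handheld itself."""
    return payload[::-1]  # stand-in for real work (e.g., image analysis)


def offload(payload):
    """Ship the computation to the cloudlet VM over the one-hop network."""
    with socket.create_connection(CLOUDLET_ADDR, timeout=5) as conn:
        conn.sendall(payload)
        conn.shutdown(socket.SHUT_WR)
        chunks = []
        while chunk := conn.recv(4096):
            chunks.append(chunk)
    return b"".join(chunks)


def run(payload):
    # Offload when possible to cut latency and battery drain; degrade
    # gracefully when the field environment has no cloudlet in range.
    return offload(payload) if cloudlet_reachable() else process_locally(payload)
```

The design point is the graceful fallback: the handheld keeps working when no cloudlet is in range, and saves battery and latency when one is.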

Research on cloudlets in the DoD is currently focused in a couple of different areas.  The first of these is funding for work at the SEI, which I won’t go into here because of the limited addressability of these dollars.  The second area is research being performed at the Army Research Laboratory (ARL) related to Mobile Ad-Hoc Networks, or MANETs.  Specifically, in FY 2015, the ARL has requested $6.1 million for the Information Protection for Mobile Ad-Hoc Networks project.  The goal of this project as it relates to cloudlets is to “develop security protocols and processes for using tactical cloudlets as a shared resource among Warfighters and coalition forces.”  In addition, the ARL has also requested $1 million for the Mobile Network Modeling Institute to examine the “impact of clouds and local tactical cloudlets on network behaviors.”  The final effort worth noting is the Heterogeneous Computing and Computational Sciences project.  For this work, the ARL has requested $1.67 million to “create new models to describe offered load and computational capacity within cloudlet-based services in Army-centric mobile and ad hoc networked technologies.”

There is of course no guarantee that any of this money ever materializes into a contract.  What’s important to remember in this context is the direction of the DoD’s efforts and the potential impact this could have on future business opportunities.  As the DoD’s use of cloudlet-based approaches evolves, it can translate into benefits for those who have positioned themselves to offer solutions that can operate in a cloudlet.  This means potential opportunity down the road for software development and mobile application vendors.  The winds are blowing toward cloudlets in connectivity-challenged environments, suggesting that those who tack into this wind will find interested customers in the DoD.


NIST Forum Examines the Convergence of Cloud and Mobility

Of the more than 500 cloud computing efforts across the federal government being tracked by Deltek’s Federal Industry Analysis team, roughly 14% are related to communications and collaboration capabilities. Most of the efforts in this sub-group are for tools like email and SharePoint. A few, however, are for capabilities like unified communications and mobile device management. That a small number of early adopters are dipping their toes into the water of cloud-based communications/mobility solutions indicates growing interest among federal agency customers. This interest recently coalesced into a two-and-a-half-day forum and workshop on the intersection of cloud and mobility hosted by the National Institute of Standards and Technology. NIST’s purpose in bringing together members of industry, academia, and government was to get ahead of what it expects will be a rapid convergence of cloud and mobile technologies in the next few years. This convergence is already taking place in the commercial world, a fact that has alerted NIST to the need for a standards-based ‘roadmap’ that outlines a path forward for federal agencies.

To that end, NIST invited speakers from the defense, civilian, and intelligence sectors of the federal government to discuss with industry and academia both the current status of cloud and mobility, and what they see coming in the future.  Although speaker comments touched on a wide number of subjects, this post will focus on three specifically: interoperability, security, and the path ahead.

Interoperability

Multiple speakers expressed concern that agency plans to use cloud are not adequately taking into account the need for interoperability between systems, applications, and platforms. In her morning keynote, Pamela Wise-Martinez, Senior Strategic Enterprise Architect at the Office of the Director of National Intelligence, explained that current limitations to interoperability are inhibiting the agility of cloud solutions being employed by the federal government. Stovepiped data and legacy systems are part of the problem, but lack of training is also a challenge as too many federal personnel don’t understand why increased interoperability is needed. Discussion of interoperability inevitably led to comments on data standardization and open architecture. Speakers agreed that both of these things are required to allow data to move freely, thereby enhancing agency efforts to share data.

Security

Liberated, standardized data in the cloud enables greater movement of data throughout the enterprise, which in turn heightens concerns about security. This may sound ironic, especially given that departments like the DoD have justified their efforts to implement a Joint Information Environment by claiming that non-stovepiped data is more secure. Jacob West, Chief Technology Officer for HP Enterprise Security Solutions, noted in his post-lunch keynote, however, that the growth of mobile and cloud solutions has dramatically increased the attack surface of most networks. Coimbatore S. Chandersekaran, Distributed Systems Chief Engineer at the Institute for Defense Analyses, echoed West’s observation, stating that edge (i.e., mobile) devices are typically weaker than machines in the data center, making them preferred attack vectors for those who would seek to steal data or harm Defense systems.

If adding endpoint devices and putting data in the cloud actually increases agency vulnerability, what’s the solution? Several were proposed, including increasing the use of analytics. Automating the process of identifying attacks can mitigate the effect of those attacks much quicker than when a human being is in the loop, HP’s West argued. However, automation can only be achieved with more extensive use of open architecture and standards. Conveniently, using analytics for continuous monitoring is a process well-suited for the cloud. Just take a look at the Defense Information Systems Agency’s new NSA-inspired Acropolis analytics cloud, for instance.
Speakers also noted a need for shared threat notifications to alert the government community (something that in itself needs to evolve and coalesce), as well as a need for agencies to implement credentialing strategies that provide visibility into who is on government networks, and to engineer systems with security as a core competency.

The Path Ahead

Lastly, several speakers took a shot at explaining what they see is on the horizon for cloud and mobility. Here are a few highlights.

Pamela Wise-Martinez (ODNI) argued that ‘boundary-less’ cloud computing is coming, particularly as analytics evolve, sensor clouds develop, and the ‘Internet of Things’ expands. The result will be the transformation of government into more of a service provider than a ‘big brother.’

Jacob West stated that industry needs a better way of collaborating to defeat cyber threats. Therefore, work should progress toward creating a community of trusted partners linked via an automated platform that catches a large number of threats and shares relevant information.

Finally, Dawn Leaf, Deputy Chief Information Officer at the Department of Labor, expressed her opinion that the shift of IT to a cloud-based services model is a mini-revolution. It will be this continued shift that drives the use of new technologies, not the development of the technologies themselves.
