- Provide common user services and platform services through consolidation of infrastructure and existing software licenses.
- Provide two private clouds: an unclassified DoD cloud and a classified DoD cloud.
- Improve end-user device access by migrating end-user applications to the cloud and migrating end-users to a Virtual Desktop Infrastructure (VDI) environment.
- Develop methods, when using commercial cloud service providers, which protect data in transit and at rest, authenticate users, and apply appropriate access controls.
- Provide virtual container technologies supporting secure unclassified operating environments on a wider variety of approved end-user devices.
- Move to a commercial-government hybrid cloud computing environment with DoD retaining the identity provider role.
- Improve service interoperability across core, intermediate and tactical edge environments.
By now everyone has probably read about the recent $45 million sole-source award that the Defense Information Systems Agency (DISA) made to Alliance Technology Group for Large Data Object Storage (LDOS). The Justification and Approval (J&A) notice for the award states that ATG will provide DISA with a scalable storage solution for the development of an intelligence, surveillance, and reconnaissance (ISR) cloud. The resources ATG will provide can store hundreds of billions of objects for ISR uses across DoD networks, including “Wide-Area Motion Imagery (WAMI), Standard and High-Definition (HD) Full-Motion Video (FMV), HyperSpectral, Laser Imaging Detection and Ranging (LIDAR), Electro-Optical/Infra-Red (EO/IR) and Synthetic Aperture Radar (SAR) data formats.” The breadth of data objects to be stored is interesting, as is the fact that DISA is building an ISR cloud, but to me the real importance of this notice lies in what it says about the challenges the DoD faces in trying to handle big data. Many of these challenges are themes that have appeared in FIA’s blog posts and reports for the last year.
The Strain of Big Data
In a moment of candor, DISA admits in the notice that it “cannot provide the Storage Cloud in its Defense Enterprise Computing Centers (DECCs) due to the physical size of the necessary hardware.” Similarly, DISA states that “it does not have the funding … to purchase the required hardware or storage facility.” DISA also admits in the notice that the new ISR cloud requires increased bandwidth that the agency cannot provide: “Alliance Technology Group is the only contractor with the ability to provide the ISR Cloud Solution with bandwidth at a secure and accessible location.”
Here is the crux of the challenge in three short sentences. DISA lacks the physical space it needs for a large investment in hardware, it lacks the money to buy the hardware, and it lacks the bandwidth capacity required for ISR data analysis. In this blog post from October 2012, I made the case that big data is a game changer in the federal IT market, not because of the technologies that will be used to exploit it, but because it acknowledges that the exponentially growing demands of data management have outstripped the limited resources agencies have to handle it.
Visualize if you will all of the data that the DoD accumulates as a large sea. The level of the water is rising. Then picture the resources the DoD has to handle that data as a system of dikes used to hold back the sea. Occasionally the dikes are opened to relieve the pressure. Nevertheless, the sea level beyond continues to rise. This is the big data challenge facing the DoD and other federal agencies, and the timing could not be worse. The challenge is growing at precisely the moment when the fiscal resources required to meet it are not available. The challenge of big data is not an “efficiency” problem; it is an overwhelming volume, variety, and complexity problem that requires smart governance and, more importantly, increased investment in infrastructure (commercial or government), analytical capabilities, and trained personnel.
Turning to the Cloud
Having recognized the challenge, DISA is doing the only thing that it can – it is turning to commercial cloud providers to provide the capacity it requires. In this case the capacity is storage and bandwidth. The J&A makes clear that DISA anticipates the LDOS ISR Cloud will exceed 1 exabyte within one year and may exceed 3-4 exabytes in three to four years. If anything, these estimates may be conservative. Neither the DoD nor the Intelligence Community has any intention of limiting the amount of data taken in. Go to any DoD event on big data and you will hear speakers say that they want to keep every bit and byte because they never know what will be important in the future.
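To appreciate why storage alone is not the whole problem, a back-of-the-envelope calculation helps. The one-exabyte-in-one-year figure comes from the J&A; everything else below is simple arithmetic, a rough sketch rather than anything stated in the notice:

```python
# Rough scale of the LDOS ISR Cloud growth cited in the J&A:
# accumulating 1 exabyte of data within one year.
EXABYTE = 10**18                      # bytes (decimal definition)
SECONDS_PER_YEAR = 365 * 24 * 3600    # ~31.5 million seconds

# Sustained ingest rate needed to take in 1 EB over one year
bytes_per_second = EXABYTE / SECONDS_PER_YEAR
gigabits_per_second = bytes_per_second * 8 / 10**9

print(f"~{gigabits_per_second:,.0f} Gbit/s sustained ingest")
```

The result works out to roughly 250 Gbit/s of sustained, around-the-clock ingest just to reach the first exabyte, which illustrates why the J&A treats bandwidth, not just raw disk, as the scarce resource.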
All of this means the following. Vendors need to offer secure cloud storage solutions, big data analytics (preferably as a cloud service), and related cloud service solutions that meet the DoD’s security requirements. A recent memo issued by Navy CIO Terry Halvorsen makes this latter point explicitly. This J&A award to Alliance Technology Group is the tip of the iceberg. There is a tsunami of contract dollars building to address the DoD’s big data needs. These contract dollars will flow into modernized and optimized infrastructure – like the new DISN Optical Backbone that DISA intends to build – as well as new database software called out in the FY 2013 National Defense Authorization Act (NDAA), new processing capacity, new storage capacity, and the personnel services required to make all of this work. The only thing holding back the big data spending tsunami is the fiscal crisis, which is causing procurement to dribble out in small awards here and there. However, even with imposed fiscal restraint the path ahead is clear. The DoD and all federal agencies eventually will be forced by necessity to contract out the big data services they require to cloud providers. The call has gone out in this DISA J&A. Can you hear it?
Second, the FAA may choose to compete brand new contracts for NextGen requirements. The $64,000 question at this point is whether the FAA will leverage cloud computing for its needs. Publicly, the FAA’s progress toward the cloud has been slow. Behind the scenes, however, it is beginning to look like the agency is growing more comfortable with using cloud-based solutions. For example, Noblis has been providing cloud computing support for the FAA’s System-Wide Information Management (SWIM) program since June 2012. That order was awarded via Enterprise Communications Support Services (ECSS) contract # DTFAWA11D00051. More recently the FAA Office of Airports awarded a contract to L-3 Services (a subsidiary of L-3 Communications) for its System of Airports Reporting (SOAR) II requirement. Section 4.2.3 of the Statement of Work called specifically for the awardee to complete an assessment of a potential cloud computing solution for SOAR II. Strictly speaking, SOAR II is not a NextGen system, but because it interfaces with NextGen systems I am wondering how long it will be before a lot more Market Surveys calling for NextGen-related cloud solutions start appearing on FedBizOpps.gov.
- The threat vector has changed dramatically at the same time that new laws are imposing penalties for failing to secure data. More is changing in this environment than is staying the same.
- Some security practitioners have dropped the word “advanced” from the description of advanced persistent threat (APT) because they observe the vast majority of attackers using common attack approaches – the “open door” rather than “breaking a window.” The disparity in security capabilities is greater than the disparity in threat.
- Mobility – The number of new mobile vulnerabilities detected is growing almost exponentially each year, making mobility the fastest-growing threat vector.
- The cyber arms race is unlike any other arms race in history because it is frictionless. For example, it took only three days for Stuxnet to be reverse-engineered, reproduced, and propagated, and in the process it taught everybody how to attack a SCADA system. It has also given rise to the private cyber arms manufacturer – people who build cyber-attack capabilities and sell them on the black market.
- Personnel training to avoid risky behavior is the most important element of cybersecurity. NSA statistics show that 80 percent of exploitable vulnerabilities are the result of poor cyber hygiene; the remaining 20 percent is the APT.
- Social engineering is a growing threat because, among other things, it gives the attackers a deeper understanding of how users and organizations behave, respond and think.
- Growing cyber threats in the aviation sector target in-flight operations, ground support operations, air traffic management systems, etc.
- Some agencies are moving to cloud services because of financial constraints, aware of the security risks and hoping security will catch up soon afterward.
- Some key challenges in effectively implementing Cloud include:
- Contract structuring: How do you structure a contract offering when you don’t own the asset? How do agencies (GSA, etc.) effectively strengthen cloud acquisition policy and build security into SLAs?
- Clearance: what types of clearance levels are needed for people around the world who are supporting agencies or have access to their data, but are not necessarily part of a secure sector? Information sharing on threats, etc. is sensitive.
- Incident response: When there is an incident, who do I call? The Cloud Service Provider (CSP) or the agency?
- Information sharing is not an end; it is a means to an end. In this context, it is needed to gain effective shared situational awareness among stakeholders.
- One challenge to information sharing stems from a sense of self-preservation. We have a culture of not sharing information, while hackers have a culture of sharing widely.
- Electricity Sector Information Sharing and Analysis Center (ES-ISAC) – Allows electric providers to share information in a non-compliance framework and encourages the free flow of information without a compliance threat hanging over participants. Effective sharing requires freedom from the risks that sharing itself creates.
- Cyber Federated Model (CFM) – the warfighter has great command and control (C2) information and the CFM intends to enable C2 for cyber indicator information. For example, an infected site is sent into the CFM and within a few minutes all other sites within the CFM get the information. Some sites have automated updates and the information sharer gets to control with whom they share.
- One key to effective sharing is the ability to do it securely, i.e., to share with assurance. Also, data must be anonymized before it is shared, especially if it is classified, sensitive, or contains private information. Sensitive but unclassified information will need a cooperative agreement between government and industry to set the boundaries for what each can do with the information it receives.
- Automated information sharing should focus on machine-readable threat indicators to automate data flow and get people out of the loop where possible. Currently, high-priority threat-level information is XML-based, but going forward organizations will need more visual analytics.
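To make the "machine-readable indicator" idea concrete, here is a minimal sketch of what an XML-based indicator exchange might look like. The element names (`Indicator`, `Type`, `Value`) and field values are illustrative assumptions, not an actual standard schema; real XML-based formats of this era included STIX:

```python
# Sketch: serialize one threat indicator as XML so a receiving site
# can parse and act on it automatically, with no human in the loop.
# Element names here are illustrative, not a real standard schema.
import xml.etree.ElementTree as ET

def build_indicator(indicator_id, ioc_type, value, confidence):
    """Build a minimal machine-readable indicator document."""
    root = ET.Element("Indicator", id=indicator_id, confidence=confidence)
    ET.SubElement(root, "Type").text = ioc_type    # e.g. an IP address
    ET.SubElement(root, "Value").text = value      # the observable itself
    return ET.tostring(root, encoding="unicode")

# A sharing site publishes an indicator...
xml_doc = build_indicator("ind-001", "ipv4-addr", "203.0.113.7", "high")

# ...and a subscribing site parses it and feeds it to its defenses.
parsed = ET.fromstring(xml_doc)
print(parsed.find("Type").text, parsed.find("Value").text)
```

The point of the structure is that both ends agree on the schema, so propagation to every participating site can be automated in minutes rather than relayed by analysts.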
- SCADA (supervisory control and data acquisition) systems and other industrial control systems (ICS) were never designed for networking, but they have been networked extensively. So we are now building monitoring capabilities in an attempt to detect and defend against attacks on systems that were never designed to withstand such attacks.
- Attacks like Stuxnet and Shamoon targeted energy sector systems and disclosed SCADA system vulnerabilities.
- The patching treadmill – These control systems were never designed to be patched and/or shut down regularly. This patching can mean an entire plant must be shut down to complete the patch. This has the potential for unforeseen domino effects and implications for supply interruptions and other complexities.
- Different organizations and unrelated sectors currently have different architectures and protocols for collecting and sharing threat information. What is needed is a common open-standards XML schema to communicate attacks in industrial control and other systems.
- There is not currently a consensus on how to proceed with administering cyber- and critical infrastructure protections, with significant polarization existing between competing regulatory/compliance and collaboration/incentive approaches.
- Comprehensive legislation (Lieberman-Collins, and others) that failed in the Senate included new and expanded regulatory and compliance elements over the private infrastructure community.
- Some industries, like nuclear energy, have very mature regulatory environments and some assert that the success in this area is an example of positive regulation that should serve as a prototype for other infrastructure industries.
- Public-private partnerships are essential. The Critical Infrastructure Partnership Advisory Council (CIPAC) and HSPD-7 were the predecessors to the latest Executive Order (EO) and Presidential Policy Directive (PPD-21).
Originally published for Federal Industry Analysis: Analysts Perspectives Blog. Follow me on Twitter @GovWinSlye.
Following release of the SEWP V draft RFP, NASA hosted an industry event on March 11, 2013 to field questions from industry and to discuss changes from the current version of the contract. Among the changes noted are the number of competition groups, the performance period, and the ceiling value. The performance period for the contract has increased to 10 years, and the ceiling value has risen to $20 billion. At the same time, the number of competition groups is being reduced to four for SEWP V, a move that’s expected to reduce costs to both industry and government.
For more information on SEWP V, visit the GovWin Opportunity Report.
Agencies across government are struggling these days to leverage rapidly evolving new technologies and approaches like advanced data analytics and cloud computing. Introducing these technologies and approaches into an IT enterprise that is not ready for them can be disruptive. This much is known. Less well understood is the fact that even planning for big data and cloud investments can be disruptive because of the requirements development needed.
Take for example the efforts of the Defense Intelligence Community to stand up the Intelligence Community IT Enterprise (ICITE), a platform for sharing information across agency clouds and organizational boundaries. The challenges facing the IC in this area were the subject of a recent panel on Big Data Analytics hosted by the DC chapter of the Armed Forces Communications & Electronics Association (AFCEA). This panel brought together three speakers to offer insight into what is happening at their respective agencies: Keith Barber, Director of the NSG Expeditionary Architecture Program Office at the National Geospatial-Intelligence Agency (NGA); Agustin “Gus” Taveras, Jr., CTO in the Directorate for IT Management at the Defense Intelligence Agency (DIA); and John Marshall, CTO in the Intelligence Directorate of the Joint Chiefs of Staff.
The panel’s discussion swirled loosely around the challenges that the IC is facing when it comes to sharing data and employing new technologies. It is worth remembering that these agencies are out in front of adoption of big data tools and cloud computing, so their experience can prove valuable for understanding where other federal agencies are likely to encounter roadblocks. Mr. Barber began the discussion by noting that even the IC struggles to keep up with rapid technological evolution. Citing a recent article that appeared in the Harvard Business Review, Barber said that government must get a better handle on where the “big data economy” is headed so that it can leverage private sector developments. Most important in all of this is knowing simply where to start. As Barber sees it, the IC needs to begin asking the right questions to get the right answers; questions like which data sets do we go after, what tools do we need, and how do we best share data? Sharing the data is a key issue that Mr. Barber believes the IC will remain preoccupied with for years to come.
DIA CTO Gus Taveras agreed that data sharing is a critical piece of the evolving IC big data environment and he suggested the ICITE program is the answer. In general, ICITE is the IC’s version of “ruthless standardization” as it forces the Intel agencies to move to a common enterprise framework. Taveras noted ironically that Sequestration has helped accelerate the push toward ICITE. The biggest challenges Taveras sees are in the realms of procurement and requirements development. Here he referred back to the concept of asking the right questions. How do we pay industry for services, Taveras asked? Is a metering model the best or is there some other way to do it? As an aside, it was surreal to hear that even now, well into the adoption of cloud computing by the public sector, there is confusion about the best payment and contracting model.
Then there is the issue of requirements development. Taveras explained that as CTO the hardest thing about big data analytics is understanding what analysts and other customers need. Determining requirements is complicated by the fact that there is no “one tool fits all” solution available. In some cases, analysts may be happy with the capabilities they currently have but would like enhancements. This would be less expensive than buying an entirely new solution, but understanding how enhancements are acquired is a challenge. Underlying Mr. Taveras’ comments was a sense that analytics tools are evolving so rapidly that his personnel do not know what they can use.
Then there is the question of contracting. How does one contract for new capabilities when the requirements development process does not function effectively? How, indeed? This admission by Mr. Taveras raised the twin red flags of scope creep and shifting requirements that have plagued government contracts for decades. And if previous generations of contracted efforts faced these challenges imagine how much more daunting they could become as big data and cloud computing solutions grow in complexity and variety. Caveat venditor!