NASA Charts Course for Information Technology Development

The National Aeronautics and Space Administration (NASA) is seeking input on drafts of the organization’s technology roadmaps that will shape priorities for the next 20 years.

Five years ago, in 2010, NASA developed a set of technology roadmaps to guide space technology developments. This month, NASA released drafts updating and expanding on those fourteen roadmaps. These documents will serve as a foundational piece of the Strategic Technology Investment Plan (STIP), which will lay out a strategy prioritizing technology developments and establishing principles for investment. NASA’s web-based system, TechPort, tracks and analyzes technology investments. Together, the roadmaps, STIP, and TechPort enable portfolio management for NASA’s technology spending, aligning investments and reducing duplication.

The 2015 NASA Technology Roadmaps cover fifteen Technology Areas. Crosscutting technologies, like avionics, autonomy, information technology, and space weather, span multiple sections. Focusing on applied research and development activities, NASA’s Technology Roadmaps cover a broad set of needed technologies and development approaches for the next 20 years (2015 to 2035). The Modeling, Simulation, Information Technology, and Processing Technology Area highlights advances in flight and ground computing capabilities, physics-based and data-driven modeling, as well as information and data processing frameworks, systems, and standards. The capabilities addressed within this area impact other technologies across the NASA portfolio, enabling application-specific modeling, simulation, and information technologies called out in other roadmaps to support the ever-increasing challenges of NASA missions.

Despite the discipline-specific nature of parts of the technology area, many of the capabilities will enable advances in modeling and simulation for areas addressed in other roadmaps, which bring their own domain perspectives. The current roadmap continues the pattern of previous versions, listing Modeling and Simulation separately; however, given the high degree of their interrelation, the roadmap contents often reference them together.

Goals for the next twenty years include development of transformational flight and ground computing capabilities, increased modeling productivity and fidelity, simulations to enable risk management across the entire system lifecycle, and progress around leveraging NASA’s massive volumes of observational, sensor, simulation, and test data. Ultimately, these capabilities will help to empower decision makers and support NASA’s missions. 

NASA released a request for information (RFI) on May 11, 2015 associated with the roadmap drafts. The space agency is looking to determine whether the correct state of the art has been identified, to gauge interest in space applications from commercial industry and other government agencies, to explore use of the technology for non-space applications, and to identify partnerships for technology development. The comment period is open until June 10, 2015.


Originally published for Federal Industry Analysis: Analysts Perspectives Blog. Stay ahead of the competition by discovering more about GovWinIQ. Follow me on Twitter @FIAGovWin.


The Hunt for the New Duct Tape – New Defense Cyber Strategy Looks to Cyber R&D

The Secretary of Defense, Ashton Carter, announced last week the release of the Department of Defense’s (DoD) new Cyber Strategy, aimed at improving the department’s cyber capabilities. One theme focuses on leveraging cybersecurity research and development (R&D) to accelerate these capabilities. So how much money might DoD be directing toward cyber R&D?

New Defense Cyber Strategy Overview

The stated purpose of the new Department of Defense Cyber Strategy is to guide the development of DoD's cyber forces and strengthen its cyber defense and cyber deterrence posture. The strategy focuses on building cyber capabilities and organizations for DoD’s three cyber missions: defend DoD networks, systems, and information; defend the United States and its interests against cyberattacks of significant consequence; and provide integrated cyber capabilities to support military operations and contingency plans.

The strategy sets five strategic goals and establishes specific objectives for DoD to achieve over the next five years and beyond.

  1. Build and maintain ready forces and capabilities to conduct cyberspace operations
  2. Defend the DoD information network, secure DoD data, and mitigate risks to DoD missions
  3. Be prepared to defend the U.S. homeland and U.S. vital interests from disruptive or destructive cyberattacks of significant consequence
  4. Build and maintain viable cyber options and plan to use those options to control conflict escalation and to shape the conflict environment at all stages
  5. Build and maintain robust international alliances and partnerships to deter shared threats and increase international security and stability

Cybersecurity Research and Development

Under the first strategic goal, in the area of building technical capabilities for cyber operations, the DoD sets an objective to accelerate innovative cyber research and development (R&D). The department is looking both to the existing DoD R&D community and to established and emerging private sector partners for help in developing “leap-ahead technologies” that can aid U.S. cyber defenses. To that end, DoD plans to focus its basic and applied R&D on developing cyber capabilities and expanding the capacity of the overall cyber workforce.

What might cyber-focused R&D look like in budgetary terms across the DoD? Looking at the FY 2016 Defense Research, Development, Test and Evaluation (RDT&E) budget books gives a general sense of the magnitude and relative distribution of recent and proposed budget dollars. Reviewing the various RDT&E budget artifacts for the Army, Air Force, Navy, and the Defense Agencies, and searching for key terms like cybersecurity, information assurance, and information security, identifies dozens of programs that are primarily directed at cybersecurity (and several more that appear cybersecurity-related).

Looking at just the programs that appear directly cybersecurity-focused in the FY 2016 DoD RDT&E budget shows that the department budgeted nearly $780 million in FY 2014, with that level increasing to more than $1.1 billion in FY 2015 and FY 2016. Further, the Air Force and DARPA have been the major players in the cyber R&D area for DoD, accounting for $844 million (72%) of the total $1.17 billion in FY 2016 requested funding. (See chart below.)



The R&D dollars depicted above are just part of the story. There is other cyber-related R&D spending embedded in larger efforts that contain cybersecurity elements or impacts, but ferreting out those dollars gets tricky and can be even more imprecise. The point here is to get a sense of the size of the overall investment and where these dollars tend to be directed.
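As a rough check on the proportions cited above, the Air Force and DARPA share of the FY 2016 request can be recomputed from the post’s own figures (a quick sketch; dollar amounts in millions, as quoted):

```python
# Figures quoted in the post, in millions of dollars.
total_fy16_request = 1_170    # ~$1.17 billion total FY 2016 cyber R&D request
air_force_and_darpa = 844     # combined Air Force + DARPA portion

share = air_force_and_darpa / total_fy16_request
print(f"Air Force + DARPA share of FY 2016 request: {share:.0%}")  # → 72%
```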

While it is important to recognize that not all of these dollars will be spent on contracts with industry partners for R&D services and technologies, the fact remains that the sustained need by DoD for more advanced cyber technologies and tools is likely to grow in both real terms and in proportion to other R&D areas. In fact, the investment in this push for greater cyber tools may easily outpace the growth rate for other areas of contractor-addressable cybersecurity within DoD. This is especially true in the support services area as the DoD strives to develop thousands of uniformed cybersecurity personnel in the coming years.

One thing seems certain: the DoD recognizes its need to cover a lot of ground quickly when it comes to improving its cybersecurity capabilities and posture, and it is looking to harness creative energies to address that need. In many ways, it’s not unlike past challenges where the department has looked to partners in industry and elsewhere to come up with creative solutions. Who knows? Soon we could be looking at the cyber equivalent of duct tape.

HHS is a Major Player in the Move toward Agile Acquisition

HHS is helping to lead the way toward more agile acquisition of IT. Outgoing HHS CTO Bryan Sivak recruited Mark Naggar to start the HHS Buyers Club and blaze a trail for more innovative acquisition in the agency. The Buyers Club brings together HHS procurement professionals to discuss and test innovative purchasing ideas.


Naggar used the Digital Services Playbook for the first HHS Buyers Club procurement: a redesign of the Office of the Assistant Secretary for Planning and Evaluation’s (ASPE) public and intranet websites. Vendors were only required to submit an eight-page concept paper, after which five were chosen to proceed by creating prototypes. This procurement concept allows vendors to show the federal buyer “what they can do” rather than writing about it in a lengthy proposal. Naggar involved all of the stakeholders and significantly shortened the procurement cycle.


In an interview with FedScoop Naggar said, "So often we're focused on getting something awarded and there's not enough attention focused on implementation, which is why we're trying to switch from waterfall to agile."


The waterfall method traditionally used for procurement and development has proven time-consuming and costly, and has often failed to deliver what agencies truly need. So often, agencies don’t realize they’re headed down the wrong path until large sums of money have already been spent on a project. "It's basically, 'Congratulations, you won the award,' they drop the mic and walk out of the room. And in six months you get something and realize it's not what you wanted, not what you needed," Naggar told FedScoop. An agile approach dramatically mitigates risk and delivers results faster.


Naggar feels so strongly about government-wide acquisition innovation and reform that he organized a Conference for Innovative Acquisitions in February of this year.  The conference was sponsored by the Federal-wide Buyers Club with help from OFPP, the US Digital Service, and GSA and its 18F team.  The attendance of over a thousand government employees and contractors showed that Naggar is not alone in his quest to re-invent federal acquisition. 

One goal of the innovative acquisition movement is to change the federal government’s aversion to risk.  Although current procurement methods have been shown to be ineffective, federal procurement officials are trained to be good stewards of taxpayer funds.  Failure is not an option.  The private sector realized long ago that failure is part of innovation using the adage "fail fast, fail often."

Anne Rung, OFPP administrator and strong supporter of innovative acquisition, said at a recent conference, "We don't tolerate any kind of perceived failure. And people immediately walk away and resort to the old way of doing things."

The President’s FY 2016 Budget Request calls for more digital services teams and idea labs modeled after HHS across the federal government.  With such strong support from top officials and buy-in from the procurement ranks, we are likely to see increased use of innovative acquisition methods in the coming months and for years to come. 


Energy Department Continues Collaborative Supercomputing Investment


The Department of Energy’s (DOE) Office of Science and the National Nuclear Security Administration (NNSA) provide high-performance computing facilities that enable world-class scientific research. In mid-April 2015, DOE extended its exascale supercomputer development program under its Collaboration of Oak Ridge, Argonne, and Lawrence Livermore (CORAL) initiative with a third and final contract.

The $525 million CORAL project was established in early 2014 with the goal of improving supercomputing investments, streamlining procurement, and reducing development costs for high-performance systems that will be five to seven times more powerful than today’s top supercomputers. By collaborating across the department’s labs on the effort, DOE aims to help the nation accelerate to next-generation exascale computing. The three CORAL labs specified requirements in a single Request for Proposal (RFP) released in January 2014. The recent $200 million award will deliver the Aurora system to the Argonne Leadership Computing Facility, completing the final supercomputer investment of the CORAL initiative. DOE earlier announced a $325 million investment to build state-of-the-art supercomputers at its Oak Ridge and Lawrence Livermore laboratories.

The next-generation system, Aurora, will use a high-performance computing scalable system framework to provide a peak performance of 180 PetaFLOP/s. Key research goals for the system include materials science, biological science, transportation efficiency, and renewable energy, and the entire scientific community will have access once Aurora is commissioned in 2018. In the interim, Argonne and Intel will also provide the Theta system, to be delivered in 2016, which will help ALCF users transition their applications to the new technology.
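For context on the 180 PetaFLOP/s figure, here is a quick sketch comparing it against CORAL’s stated goal of systems five to seven times more powerful than today’s top supercomputers. The baseline is an assumption not stated in the post: Titan at Oak Ridge, at roughly 27 PetaFLOP/s peak, was the leading U.S. system when the CORAL RFP was released.

```python
# Peak performance figures in PetaFLOP/s.
aurora_peak = 180   # Aurora target, as stated in the post
titan_peak = 27     # assumed baseline (Titan at Oak Ridge), not from the post

ratio = aurora_peak / titan_peak
print(f"Aurora vs. assumed baseline: ~{ratio:.1f}x")  # → ~6.7x, within the 5-7x goal
```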

Additionally, DOE Under Secretary Orr announced $10 million for a high-performance computing R&D program, DesignForward, led by DOE’s Office of Science and NNSA to accelerate the development of next-generation supercomputers. The recent announcement complements the $25.4 million already invested in the first round of the program. Through this public-private partnership, technology firms will work with DOE researchers to study and develop software and hardware technologies aimed at maintaining a national lead in scientific computing.




Federal Acquisition Improvement Takes Aim at IT

With over 3,300 contracting units across the government, collaborating to share information and best practices can be challenging. Back in December 2014, the Office of Management and Budget’s Office of Federal Procurement Policy (OFPP) described near-term actions to transform federal procurement.

OFPP administrator Anne Rung’s memo to federal agencies outlined current priorities to transform government buying. These areas include category management, acquisition workforce and processes, and government-industry communication. Milestones for many of the efforts and actions associated with these areas will be approaching in the next few months. 

Category Management

OFPP aims to shift from managing government purchases and prices individually to establishing categories for common spending and costs. Unnecessary duplication of contracts across government for similar goods and services burdens vendors with proposal preparation costs and administrative expenses, which can have a significant impact on small businesses. This shift in government buying includes promotion of strategic sourcing, in particular looking to optimize the $25 billion the government spends annually on commodity IT. To support this push, the General Services Administration is cataloging prices paid for IT goods and providing access to contract details for related products to highlight best practices. 

Talent Management and Innovation

The Office of Science and Technology Policy (OSTP) and OFPP are taking steps to encourage adoption of best practices within government purchasing of digital services and fostering innovation. These steps have included releasing a draft of the TechFAR Handbook and exploring case studies of resourceful contracting practices. OSTP and OFPP are collaborating on a plan to increase the government’s digital acquisition capabilities. To further support these efforts, the U.S. Digital Service is expected to launch a pilot program for training agency personnel in digital acquisition. One of the areas targeted for these activities is agile development.

Strengthen Government-Industry Relationships

OFPP is developing an approach to improve communication between government and industry. Guidance is in the works to allow open feedback from industry on acquisition process improvement and to identify trends and issues. The guidance will shape Acquisition 360, an effort to formalize the agency collection of feedback related to acquisition processes and identify areas for improvement. The focus on strengthening relationships includes establishing enterprise-wide vendor managers, a step that will begin with recruiting vendor managers to support relationships with key IT commercial contractors. 

While these efforts will address all government buying, near-term efforts are zeroing in on agency IT. In particular, activities related to category management are expected to really dig into how agencies are buying technology products and services. It is worth noting, however, that the plans associated with this transformation initiative do not paint a picture of a sudden, new reality. Rather, they suggest ongoing activities to strategically reshape the landscape of government acquisition. As these current transformation efforts continue, pockets of advancement in different contracting organizations will contribute to gradual change across the government.




Does Chief Data Scientist Appointment Signal Boost for Analytics Spending?

In mid-February 2015, the White House announced the appointment of its first Chief Data Scientist in the Office of Science and Technology Policy (OSTP). In addition to his post as Chief Data Scientist, Dr. Dhanurjay ‘DJ’ Patil will also serve as the Deputy Chief Technology Officer for Data Policy. This announcement comes as federal agencies continue to juggle resource limitations and competing priorities.

According to the Chief Technology Officer’s post on the White House blog, the Chief Data Scientist will work closely with the Federal Chief Information Officer and U.S. Digital Service, but the particular objectives of the role remain unclear. Patil is expected to help shape policies for technology and innovation, to develop partnerships to get more from the nation’s data investments, and to recruit talented data scientists into public service. He is also expected to support the Administration’s Precision Medicine Initiative, which targets advances at the crossroads of health and data. 

Over the last several years, government agencies have been working to harness the data they generate and steward. In fact, a number of agencies – the Department of Transportation, the Federal Communications Commission, the Department of Energy, the Department of Agriculture, and the Department of Commerce – have already carved out posts for chief data officers in their organizations or have filled the role. The appointments reflect renewed efforts across the government to tap into agency information through open data and analytics.

Agency investments in big data aspire to spur technology innovation and deliver improvements to digital services. As part of the Digital Government Strategy, agencies are continuing to achieve and maintain high-quality data sets, and to make data open and usable to the public. The administration has set a clear goal of making open and machine-readable data the default for government information, and over 138,000 datasets are currently available for public use. Agency oversight organizations will be better equipped to combat waste and fraud, while decision-makers anticipate improved resource allocation. Technologists and research organizations are looking to advance scientific investigations by harnessing greater analytical capabilities. There are numerous health care applications, from enhancing medical services to informing public health activities. Education stands to gain from data analytics through tailored lesson plans, online platforms, and increased efficiency for school operations. In short, cracking big data challenges stands to bring a wide array of benefits in areas like health, energy, education, public safety, finance, and development.

Amid these promises, concerns have been raised around personal privacy and civil liberties. Current government efforts are targeting improvements to agency use and storage of data as well as strengthening cybersecurity. Additional efforts are likely to address accountability, oversight, and relevant privacy requirements for both public and private organizations. 

Federal agencies are working towards various data science goals like better decision-making support and more comprehensive situational awareness for areas like cybersecurity. Many of these efforts are well established and underway. Unless they are managing resources and bringing additional funding to the table, the primary impact of chief data officers is likely to be guidance. In the case of the new Chief Data Scientist, guidance focused on building the government’s capabilities may aim to temper reliance on industry for support.




NASA Tech Proves Windfall for Army

Under pressure to deliver cost savings and efficiency improvements, federal agencies are looking at each other’s achievements with increased interest. Through a recent technology transfer, the Army is getting a boost in software development from the National Aeronautics and Space Administration (NASA).

The Metrology Calibration Laboratory (MCL) at NASA’s Marshall Space Flight Center is located on the Army’s Redstone Arsenal campus in Alabama. According to an announcement from NASA, Army officials “became aware” of NASA’s considerable work on automated software development during a tour of the MCL. Over the past decade, NASA has generated over 2,400 automated software procedures for calibration and testing. Around 1,700 were developed for the Space Shuttle Program and another 300 were produced for general use. In the last two years, an additional 400 have been developed for NASA projects and programs including the Space Launch System. By using a standardized set of procedures, the control of instrumentation can be automated. This automation minimizes risk by reducing the probability of errors related to human involvement. By limiting the necessary human intervention, it also increases the consistency of the data recorded from technician to technician.

During a period of four months, programmers at the Army’s Test, Measurement and Diagnostic Equipment Activity completed around 25 automation procedures. (Assuming a constant rate, it would take several decades to amass a volume of procedures equal to the current size of NASA’s collection.) With the recent transfer from NASA, the Army benefits from over twelve years of work on calibration and testing, sparing it the costs of the software development. Government officials estimate that the move marks a potential savings of nearly $4 million. Beyond time and money, this represents a win for the service in terms of improving the quality of the Army’s measurements.
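The parenthetical estimate above can be checked with back-of-the-envelope arithmetic, using only figures from the post and assuming the Army’s pace stayed constant:

```python
# Figures from the post.
army_procedures = 25      # automation procedures completed by Army programmers
months_elapsed = 4        # in a four-month period
nasa_collection = 2_400   # size of NASA's current library of procedures

per_year = army_procedures * 12 / months_elapsed   # 75 procedures per year
years_needed = nasa_collection / per_year
print(f"~{years_needed:.0f} years at that pace")   # → ~32 years, i.e. several decades
```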

NASA has shared these procedures between its centers, but this Army transfer is the first to a non-NASA recipient. The agency expects additional interest from other Defense Department organizations to follow. It’s no surprise that sharing technology is saving agencies time and money. Various initiatives like shared services and common standards (e.g., for security or for electronic reporting) are encouraging organizations to more closely consider potential existing solutions. Going forward, we’re likely to see more and more examples of agencies reaping benefits from other agencies’ advances. As with other cost-cutting moves, the trend will eat into contractor-addressable spending in some areas but may free up funds for other investments.




Progress Continues on Cyber-Physical Framework

During the summer of 2014, the National Institute of Standards and Technology (NIST) kicked off a working group effort to develop a framework and roadmaps for cyber-physical systems. In mid-January 2015, this public working group launched the second phase of its work.

Cyber-physical systems (CPS) are often simply referred to as “smart” systems. These co-engineered systems comprise interacting networks of physical and computational components. The influx of smart technologies has expanded CPS domains to include infrastructure (grid, water, gas), buildings, emergency response, healthcare, manufacturing, transportation, and numerous others. The public working group aims to take a multi-domain perspective to ensure the research, development, and deployment guidance it produces will be applicable within all CPS domains as well as supporting cross-domain applications. In particular, the group intends to address needs for a common lexicon and taxonomy as well as a reference architecture.

These working group efforts began during the summer of 2014 with plans for the first several phases over the course of a year. The first face-to-face meeting, during August, launched the first phase of the initiative to draft a framework for the CPS elements. This work produced draft reports from each of the five subgroups – Reference Architecture, Use Cases, Cybersecurity, Timing, and Data Interoperability. Following the launch of the first phase, the subgroups organized meetings and collaboration to create initial documents that would eventually combine as elements of the CPS framework.

All five subgroups completed their documents by the close of 2014, so now efforts are underway to integrate and review the work. This second phase aims to produce a combined framework document by integrating the work completed by the subgroups and refining it further. The third phase of the work will result in a CPS technology roadmap which will identify opportunities for additional collaboration and propose a timeline for follow-on efforts to address key technical challenges. 

According to the current timeline, the combined framework is expected to be finalized this spring. The group is scheduled to have its next face-to-face meeting in April, which will conclude the framework phase and launch the roadmap activities. A draft of the roadmap is anticipated in June 2015, followed by a month of review before it is finalized in July. A related effort is also underway, led by the NIST Engineering Laboratory’s Smart Grid and Cyber-Physical Systems Program Office: the Cyber-Physical Testbed Development Workshop, scheduled for February 24-25, 2015, will explore future research and development areas for CPS.

Ultimately, these efforts aim to head off trends like siloed, sector-specific cyber-physical system deployments and the expansion of the Internet of Things without a foundation of interoperability. By drawing stakeholders from government, industry, and academia, the working group hopes to address the increasing need for systems-of-systems solutions that integrate CPS across domains. For insights on how CPS and other technologies are shaping the federal landscape, check out the Federal Industry Analysis team’s recent report on emerging federal technology markets.




New JIE Requirements May Help the “Internet of Things” at the DoD

The “Internet of Things” (IoT) is a pretty common phrase these days, referring to the rapidly expanding interconnectivity of devices and sensors sending information across communications networks, all to achieve greater capabilities, effectiveness, efficiency, and flexibility. The Department of Defense (DoD) clearly links the growth of emerging, interconnected technologies to the sustained superiority of U.S. defense capabilities, on and off the battlefield, so you could say that the IoT impacts defense IT at all levels.

The key to leveraging the IoT is in harnessing and integrating three key areas:

  • Information – Data from devices and sensors (e.g., phone, camera, appliance, vehicle, GPS) and information from applications and systems (e.g., social media, eCommerce, industrial systems) provide the content input.
  • Connectivity – Network connections via various wireless capabilities and communications backbones provide the transport links for aggregation and distribution. This facilitates the environment where data meets the power to use that data.
  • Processing – The computational capacity and capabilities to make the data content useful. This may reside at the device and/or the back end and ranges in complexity (e.g., data analytics).


DoD Implications

The use of integrated networks to connect data with processing capacity to affect outcomes is far from a new idea at the DoD – it gave us much of the warfighting capabilities we have today. But technological evolution has resulted in a growing IoT mentality that goes beyond combat operations. One example is the establishment of the Air Force Installation Service Management Command (AFISMC) to coordinate management and maintenance of resources across Air Force bases and facilities. According to Air Force CTO Frank Konieczny, potential uses of IoT include facilities and vehicle management, logistics and transportation, integrated security, and robotics.

But pervasive connectivity is also creating security ramifications.  In the wake of a network security incident last year, the Navy launched Task Force Cyber Awakening (TFCA) in an effort to protect hardware and software Navy-wide as IoT engulfs everything from weapons systems to shipboard PA systems.

Importance of the JIE

The drive to leverage sensor technologies, and the data analytics that these technologies enable, is a major force behind the DoD’s Joint Information Environment (JIE) network modernization efforts, so the pace of sensor-based innovation is tied to the success of JIE efforts. Adding potentially tens of thousands of diverse Internet-connected objects to a network, each of which then needs to be managed and secured, will require proactive IT governance policies to ensure effectiveness, and some provisions in recent law apply.

The FY 2015 National Defense Authorization Act (NDAA), passed just last month, requires the DoD CIO to develop processes and metrics within the next six months for measuring the operational effectiveness and efficiency of the JIE. Further, Congress is having the CIO identify a baseline architecture for the JIE and any information technology programs or other investments that support that architecture.

These requirements may stem, in part, from a desire to help formalize and oversee JIE as an investment program, but the resulting baseline architecture will help pave the way to further implement greater IoT capabilities. The data from sensor-based devices will only continue to grow, but to maximize its utility the DoD will need a successful JIE to connect and carry the information.

Originally published for Federal Industry Analysis: Analysts Perspectives Blog. Stay ahead of the competition by discovering more about GovWin FIA. Follow me on Twitter @GovWinSlye.


Emerging Federal Technology Markets – Areas to Watch

Can technological innovation drive federal IT investments, even in the midst of budget pressures? Absolutely. This is what we explore in our latest report on Emerging Federal Technology Markets.

Under long-term pressure to “do more with less,” federal agencies are leveraging current trends in federal IT – cloud, wireless networks, IPv6, and virtualization – to gradually adopt new technologies that enable cost savings and the more efficient use of IT resources. Some of my colleagues and I took a look at how these and other technologies are shaping federal IT investments today and in the future.

Federal Investments in Foundation Technologies will Drive Emerging Markets

Technological change and proliferation run the gamut when it comes to impacting federal agencies. Sensor technologies are being introduced to track facility energy consumption and enhance physical security, while software-defined infrastructure is being explored to eliminate bottlenecks that result from stovepiped systems and the growing volume of data. Machine learning technology is being tested to create “smart” networks that rely less on person-based administration. Tying it all together are predictive analytics, which agencies are using for a growing number of purposes, from forecasting network performance and enhancing cyber security to ferreting out waste, fraud, and abuse. The result is that today’s investments set the stage for tomorrow’s capabilities. (See graphic below.)

Key market factors shaping the federal IT landscape

Some of the major drivers and key findings from our research include:

  • The drive to leverage sensor technologies and the data analytics that these enable is a driving force behind agency network modernization efforts like the DoD’s Joint Information Environment. The pace of sensor-based innovation is tied to the success of these efforts.
  • Software-Defined Infrastructure (SDI) is more pervasive than generally believed, particularly at agencies with highly-evolved Infrastructure-as-a-Service offerings.
  • Federal interest in SDI is not hype; it is a genuine trend with a growing number of current and planned use examples across federal agencies.
  • The use of predictive analytics programs has expanded significantly across the federal government since FY 2010, making it a maturing, though niche, technology that is expected to have continued strong growth.
  • The inclusion of predictive analytics as an offering on GSA’s Alliant 2 and, potentially, NS2020 government-wide contracts should help it become regarded less as an exotic technology and more as a standardized commercial-off-the-shelf solution.

The modernization of agency IT environments is opening the doors to future investment in emerging technologies.  The convergence of agencies’ work on expanding wireless networks, deploying standardized, commodity hardware, and engineering Internet Protocol-based transport networks is enabling the introduction of new sensor technologies and software-based capabilities. The impact of emerging technology adoption will be to introduce greater efficiency and security to agency IT environments. 

To get our full perspective on Emerging Federal Technology Markets, read the full report.

