NASA Charts Course for Information Technology Development

The National Aeronautics and Space Administration (NASA) is seeking input on drafts of the organization’s technology roadmaps that will shape priorities for the next 20 years.

Five years ago, in 2010, NASA developed a set of technology roadmaps to guide space technology developments. This month, NASA released drafts updating and expanding on those fourteen roadmaps. These documents will serve as a foundational piece of the Strategic Technology Investment Plan (STIP), which will lay out a strategy prioritizing technology developments and establishing principles for investment. NASA’s web-based system, TechPort, tracks and analyzes technology investments. Together, the roadmaps, STIP, and TechPort enable portfolio management for NASA’s technology spending, aligning investments and reducing duplication.

The 2015 NASA Technology Roadmaps cover fifteen Technology Areas. Crosscutting technologies, like avionics, autonomy, information technology, and space weather, span multiple sections. Focusing on applied research and development activities, NASA’s Technology Roadmaps cover a broad set of needed technologies and development approaches for the next 20 years (2015 to 2035). The Modeling, Simulation, Information Technology, and Processing Technology Area highlights advances in flight and ground computing capabilities, physics-based and data-driven modeling, as well as information and data processing frameworks, systems, and standards. The capabilities addressed within this area impact other technologies across the NASA portfolio, enabling application-specific modeling, simulation, and information technologies called out in other roadmaps to support the ever-increasing challenges of NASA missions.


Despite the discipline-specific nature of parts of this technology area, many of its capabilities will enable advances in modeling and simulation for the domains addressed in other roadmaps. The current roadmap continues the pattern of previous versions, listing Modeling and Simulation separately; however, given the high degree of their interrelation, the roadmap contents often reference them together.

Goals for the next twenty years include development of transformational flight and ground computing capabilities, increased modeling productivity and fidelity, simulations to enable risk management across the entire system lifecycle, and progress around leveraging NASA’s massive volumes of observational, sensor, simulation, and test data. Ultimately, these capabilities will help to empower decision makers and support NASA’s missions. 

NASA released a request for information (RFI) associated with the roadmap drafts on May 11, 2015. The space agency is looking to determine whether the correct state of the art has been identified, to gauge interest in space applications from commercial industry and other government agencies, to explore use of the technology for non-space applications, and to identify partnerships for technology development. The comment period is open until June 10, 2015.

 

Originally published for Federal Industry Analysis: Analysts Perspectives Blog. Stay ahead of the competition by discovering more about GovWinIQ. Follow me on Twitter @FIAGovWin.

 

Big Data Investments are Accelerating across the DoD

In a recent blog posting that received wide industry attention, I detailed how the Defense Advanced Research Projects Agency (DARPA) is investing big money in research and development efforts related to big data.  An observation discussed in that post concerned the fact that advanced analytics and technologies like distributed computing are becoming entwined with modern, networked weapons systems. The incorporation of big data is a function not only of the growing complexity of weapons, but also of the command and control capabilities that today’s U.S. military is employing.  Facing a falling number of military personnel, all branches of the Defense establishment are turning to networked and unmanned weapons commanded and controlled from a distance to offset the strain on American fighting power.

In this context, DARPA’s R&D efforts are the “tip of the spear” when it comes to figuring out how big data technology can enhance combat capabilities.  DARPA is not the only Defense organization, however, that is dedicating R&D dollars in this area. The military services are also investing, and in general the funding flowing into those research efforts is growing annually.

 

As the numbers in the table above demonstrate, all of the military services are funding R&D efforts related to big data.  The data reflects projects in the FY 2016 Defense Research, Development, Test, and Evaluation (RDT&E) budget request that are dedicated primarily to some type of big data R&D.  Put differently, developing a big data-related capability is the primary objective of the effort. In addition to these primary efforts, there is a plethora of other research programs that include big data technologies as part of the effort.  The FY 2016 requested funding numbers for those programs with a related big data component are shown below.


What to make of these figures?

First, when the primary objective of a project (Table 1) is developing a big data solution, the Navy is leading the way among the military services.  A big reason for this is the Navy’s push to employ unmanned systems – aerial, surface, and undersea – on a much greater scale than at present.  The development of these systems requires an incredible amount of money, with work focused on enhanced C2 capabilities, cyber security, and analytics for parsing intel data gathered by these systems.  This trend is in evidence in the Air Force and Army as well, just not to the extent it is in the Navy, so if your company works in this area, it is a greenfield opportunity.

Second, from FY 2015 to FY 2016, the Army intends to nearly double its investment in primary big data related R&D (Table 1), reflecting a focus on parsing intel data and on utilizing big data for cyber security operations, especially automated network monitoring and defensive response.

Third, the Air Force is the only service that will see investment in primary big data R&D fall in FY 2016. This is due to some slight cuts in multi-source fusion technologies research and in the evaluation of advanced countermeasure concepts.  When it comes to big data R&D related to other efforts (Table 2), the total planned investment grows significantly, with a special focus on the automation of complex networks, analysis and use of sensor fusion technology, and exploitation of intel data.

In conclusion, looking at this one piece of the DoD big data market we can see that the military services intend to spend at least $159 million in FY 2016 on R&D related primarily to a big data objective.  At most, they intend to spend almost $725 million, if we count programs with a related big data component.  Keep in mind that these numbers do not include present investments in operations and maintenance and procurement programs. Big data R&D is thus a growing area of Defense IT spending in an otherwise flat market.

 

Energy Department Continues Collaborative Supercomputing Investment

 

The Department of Energy’s (DOE) Office of Science and the National Nuclear Security Administration (NNSA) provide high-performance computing facilities that enable world-class scientific research. In mid-April 2015, DOE extended its exascale supercomputer development program under its Collaboration of Oak Ridge, Argonne, and Lawrence Livermore (CORAL) initiative with a third and final contract.

The $525 million CORAL project was established in early 2014 with the goal of improving supercomputing investments, streamlining procurement, and reducing development costs for high-performance systems that will be five to seven times more powerful than today’s top supercomputers. By collaborating across the department’s labs on the effort, DOE aims to help the nation accelerate to next-generation exascale computing. The three CORAL labs specified requirements in a single Request for Proposal (RFP) released in January 2014. The recent $200 million award will deliver the Aurora system to the Argonne Leadership Computing Facility (ALCF), completing the final supercomputer investment of the CORAL initiative. DOE earlier announced a $325 million investment to build state-of-the-art supercomputers at its Oak Ridge and Lawrence Livermore laboratories.

The entire scientific community will have access to the Aurora system once it is commissioned in 2018. Key research goals for the system include materials science, biological science, transportation efficiency, and renewable energy. Aurora will use a scalable high-performance computing system framework to provide a peak performance of 180 petaFLOP/s. In the interim, Argonne and Intel will also deliver the Theta system in 2016, which will help ALCF users transition their applications to the new technology.

Additionally, DOE Under Secretary Orr announced $10 million for a high-performance computing R&D program, DesignForward, led by DOE’s Office of Science and NNSA to accelerate the development of next-generation supercomputers. The recent announcement complements the $25.4 million already invested in the first round of the program. Through this public-private partnership, technology firms will work with DOE researchers to study and develop software and hardware technologies aimed at maintaining the nation’s lead in scientific computing.

 

Originally published for Federal Industry Analysis: Analysts Perspectives Blog. Stay ahead of the competition by discovering more about GovWinIQ. Follow me on Twitter @FIAGovWin.

 

Big Data Programs at the Defense Advanced Research Projects Agency

The Department of Defense is investing big in goods and services related to big data. This investment, however, is not spread evenly across the department. It exists instead in certain agencies where the spending is deep and related to a variety of other programs. One of these agencies is the Defense Advanced Research Projects Agency, or DARPA, as it is commonly known. DARPA does research and assessments related to the applicability of cutting-edge technologies to U.S. national security, including unmanned systems, robotics, cyber security, mobility, networking and computing technologies, and others.

Underlying the research and development work at DARPA are significant investments in advanced algorithms, analytics, and data fusion that illustrate the importance of “big data” to the efficient use of next generation systems and weapons platforms. Put differently, more and more of DoD’s weapons and communications systems, as well as the platforms that carry them, are becoming extremely complex. They are now so complex, in fact, that big data analytics and algorithms are necessary for them to function properly. Big data analytics and algorithms are thus a foundational technology without which an increasing number of advanced DoD weapons systems and platforms would not function.

Knowing this makes a big difference when it comes to understanding where business opportunity can be found at the DoD. Big data is such a complex subject, and its uses are so varied, that it is rare for an acquisition to call explicitly for a specific solution by name or to use the term “big data.” This makes selling big data solutions and services to defense customers tricky.

Getting back to DARPA, the fact is that big data is in use across the agency. It appears primarily in R&D work related to software development, algorithm design, and data fusion efforts. The two tables below identify programs that have big data requirements related to them. Table 1 lists DARPA programs in which big data goods or services are the primary requirement. Table 2 shows DARPA programs in which big data requirements are but one of many different pieces of work. These programs have been drawn from the DARPA Research, Development, Test, and Evaluation Budget Request for FY 2016.


As we can see in Table 1, spending rises from approximately $97 million in FY 2014 to the $164 million that DARPA forecasts in FY 2016. This represents a projected 69% increase over the course of three fiscal years.
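As a quick check of that growth figure, the percentage change works out as follows (a minimal Python sketch; the dollar amounts are the approximate figures cited in the text):

```python
# Approximate DARPA "primary" big data R&D figures cited above, in $ millions.
fy2014_primary = 97.0   # FY 2014 actual (approximate)
fy2016_primary = 164.0  # FY 2016 forecast

growth = (fy2016_primary - fy2014_primary) / fy2014_primary
print(f"Projected change, FY 2014 to FY 2016: {growth:.0%}")  # prints ~69%
```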

Turning to the list of programs that includes both big data-specific projects and those with a big data component (the gold lines in Table 2 below), we can see that the trend is the same – spending at DARPA on big data-related R&D is on the rise. The increase is a more modest 21% from FY 2014 to FY 2016, but that is still positive growth in an overall declining DoD technology market.


Summing up, the DoD’s spending on big data, particularly on R&D, is rising. Because money is flowing to R&D efforts, the fact that the work is related to big data may be hidden in general project descriptions. The best thing to do when searching for big data-related work is to seek out complexity. Where agencies like DARPA are conducting R&D on complex systems, integrating massive volumes of sensor data, developing advanced algorithms for controlling unmanned systems, and/or fusing large data sets into common pictures, that is where you’ll find big data-related spending.

 

Does Chief Data Scientist Appointment Signal Boost for Analytics Spending?

In mid-February 2015, the White House announced the appointment of its first Chief Data Scientist in the Office of Science and Technology Policy (OSTP). In addition to his post as Chief Data Scientist, Dr. Dhanurjay ‘DJ’ Patil will also serve as the Deputy Chief Technology Officer for Data Policy. This announcement comes as federal agencies continue to juggle resource limitations and competing priorities.

According to the Chief Technology Officer’s post on the White House blog, the Chief Data Scientist will work closely with the Federal Chief Information Officer and U.S. Digital Service, but the particular objectives of the role remain unclear. Patil is expected to help shape policies for technology and innovation, to develop partnerships to get more from the nation’s data investments, and to recruit talented data scientists into public service. He is also expected to support the Administration’s Precision Medicine Initiative, which targets advances at the crossroads of health and data. 

Over the last several years, government agencies have been working to harness the data they generate and steward. In fact, a number of agencies – the Department of Transportation, the Federal Communications Commission, the Department of Energy, the Department of Agriculture, and the Department of Commerce – have already carved out posts for chief data officers in their organizations or filled the role. The appointments reflect renewed efforts across the government to tap into agency information through open data and analytics.

Agency investments in big data aspire to spur technology innovation and deliver improvements to digital services. As part of the Digital Government Strategy, agencies are continuing to achieve and maintain high-quality data sets and to make data open and usable to the public. The administration has set a clear goal of making open and machine-readable data the default for government information. Currently, there are over 138,000 datasets available on Data.gov for public use. Agency oversight organizations will be better equipped to combat waste and fraud. Meanwhile, decision-makers anticipate improved resource allocation. Technologists and research organizations are looking to advance scientific investigations by harnessing greater analytical capabilities. There are numerous health care applications, from enhancing medical services to informing public health activities. Education stands to gain from data analytics through tailored lesson plans, online platforms, and increased efficiency for school operations. In short, cracking big data challenges stands to bring a wide array of benefits in areas like health, energy, education, public safety, finance, and development.
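For readers who want to explore the catalog programmatically, here is a minimal sketch that counts the datasets currently listed on Data.gov. It assumes the catalog still exposes the standard CKAN 3 action API at catalog.data.gov; the exact count will differ from the figure cited above as the catalog changes.

```python
# Illustrative only: query the Data.gov catalog (CKAN API) for the total
# number of publicly listed datasets. Assumes the standard CKAN 3
# package_search endpoint; adjust if the catalog's API changes.
import json
import urllib.request

URL = "https://catalog.data.gov/api/3/action/package_search?rows=0"

with urllib.request.urlopen(URL) as response:
    payload = json.load(response)

# CKAN returns the number of matching datasets in result["count"].
print("Datasets currently listed on Data.gov:", payload["result"]["count"])
```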

Amid these promises, concerns have been raised around personal privacy and civil liberties. Current government efforts are targeting improvements to agency use and storage of data as well as strengthening cybersecurity. Additional efforts are likely to address accountability, oversight, and relevant privacy requirements for both public and private organizations. 

Federal agencies are working towards various data science goals like better decision-making support and more comprehensive situational awareness for areas like cybersecurity. Many of these efforts are well established and underway. Unless they are managing resources and bringing additional funding to the table, the primary impact of chief data officers is likely to be guidance. In the case of the new Chief Data Scientist, guidance focused on building the government’s capabilities may aim to temper reliance on industry for support.

 

Originally published for Federal Industry Analysis: Analysts Perspectives Blog. Stay ahead of the competition by discovering more about GovWinIQ. Follow me on Twitter @FIAGovWin.

 

New JIE Requirements May Help the “Internet of Things” at the DoD

The “Internet of Things” (IoT) is a pretty common phrase these days, with the rapidly expanding interconnectivity of devices and sensors sending information across communications networks, all to achieve greater capabilities, effectiveness, efficiency, and flexibility.  The Department of Defense (DoD) clearly links the growth of emerging, interconnected technologies to the sustained superiority of U.S. defense capabilities, on and off the battlefield, so you could say that the IoT impacts defense IT at all levels.

The key to leveraging the IoT is in harnessing and integrating three key areas, illustrated in the sketch after the list below:

  • Information – Data from devices and sensors (e.g., phones, cameras, appliances, vehicles, GPS) and information from applications and systems (e.g., social media, eCommerce, industrial systems) provide the content input.
  • Connectivity – Network connections via various wireless capabilities and communications backbones provide the transport links for aggregation and distribution. This facilitates the environment where data meets the power to use that data.
  • Processing – The computational capacity and capabilities to make the data content useful.  This may reside at the device and/or back end and ranges in complexity (e.g., data analytics).
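To make the three layers concrete, here is a minimal, purely illustrative Python sketch of the pipeline: sensor readings (information) are handed to a transport stub (connectivity) and then summarized by an analytics function (processing). All names and values are hypothetical; this is a conceptual model, not any DoD system.

```python
# Conceptual sketch of the Information -> Connectivity -> Processing pipeline.
from dataclasses import dataclass
from statistics import mean
from typing import List


@dataclass
class SensorReading:
    """Information: content produced by a device or sensor."""
    device_id: str
    metric: str
    value: float


def transmit(readings: List[SensorReading]) -> List[SensorReading]:
    """Connectivity: stand-in for the network transport that aggregates
    device data and delivers it to wherever the processing resides."""
    return list(readings)  # in reality: radio links, gateways, backbone networks


def process(readings: List[SensorReading]) -> float:
    """Processing: turn raw content into something useful; a simple average
    stands in for richer analytics."""
    return mean(r.value for r in readings)


readings = [
    SensorReading("hvac-01", "kwh", 41.2),
    SensorReading("hvac-02", "kwh", 39.8),
    SensorReading("hvac-03", "kwh", 44.5),
]
print("Average facility energy draw:", process(transmit(readings)))
```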

 


DoD Implications

The use of integrated networks to connect data with processing capacity to affect outcomes is far from a new idea at the DoD – it gave us many of the warfighting capabilities we have today. But technological evolution has resulted in a growing IoT mentality that goes beyond combat operations. One example is the establishment of the Air Force Installation Service Management Command (AFISMC) to coordinate management and maintenance of resources across Air Force bases and facilities. According to Air Force CTO Frank Konieczny, potential uses of IoT include facilities and vehicle management, logistics and transportation, integrated security, and robotics.

But pervasive connectivity is also creating security ramifications.  In the wake of a network security incident last year, the Navy launched Task Force Cyber Awakening (TFCA) in an effort to protect hardware and software Navy-wide as IoT engulfs everything from weapons systems to shipboard PA systems.

Importance of the JIE

The drive to leverage sensor technologies and the data analytics they enable is a major force behind the DoD’s Joint Information Environment (JIE) network modernization efforts, so the pace of sensor-based innovation is tied to the success of JIE efforts. Adding potentially tens of thousands of diverse Internet-connected objects to a network that then need to be managed and secured will require proactive IT governance policies to ensure effectiveness, and some provisions in recent law apply.

The FY 2015 National Defense Authorization Act (NDAA), passed just last month, requires the DoD CIO to develop processes and metrics within the next six months for measuring the operational effectiveness and efficiency of the JIE. Further, Congress is requiring the CIO to identify a baseline architecture for the JIE and any information technology programs or other investments that support that architecture.

These requirements may stem, in part, from a desire to help formalize and oversee JIE as an investment program, but the resulting baseline architecture will help pave the way to further implement greater IoT capabilities. The data from sensor-based devices will only continue to grow, but to maximize its utility the DoD will need a successful JIE to connect and carry the information.

---
Originally published for Federal Industry Analysis: Analysts Perspectives Blog. Stay ahead of the competition by discovering more about GovWin FIA. Follow me on Twitter @GovWinSlye.

 

Federal Spending on Enterprise Business Systems Stays Strong

Ongoing initiatives to modernize government business systems offer prime examples of the ways federal agencies are looking to leverage technology transformation to achieve cost savings and efficiency gains. 

At the end of 2014, Deltek’s Federal Industry Analysis team completed analysis of the market for business systems, identifying four segments characterized by different types of enterprise solutions. These four segments are financial management, asset and material management, human resources management, and administration and government management.


Financial Management – The goal of improving financial management across the government has led to updated guidance for financial management systems and shared services initiatives. Systems in this segment include solutions for payroll, accounting, invoice processing, budget formulation, and collections. This segment is expected to grow by 4.7% from FY 2014 to reach $3.4 billion in FY 2015.

 

Asset and Materials Management – Business systems for asset and materials management facilitate tighter asset control. Systems in this segment include solutions for supply chain management, inventory control, and fleet management. This segment remains flat from FY 2014 to FY 2015.

 

Human Resources Management – These systems support efforts to improve workforce performance. Solutions include personnel management, performance management, recruiting, and compensation management. This segment is expected to grow by 8.3% over FY 2014 levels to $3 billion.

 

Administration and Government Management – These systems include solutions for contract management, program management, customer relationship management, and travel management. Spending in this segment continues near FY 2014 levels.

 

Deltek predicts contractor-addressable spending on federal business systems to total $10.6 billion for FY 2015, increasing slightly over FY 2014 spending levels.  While many government efforts to improve business systems have been underway for some time, policies and legislative mandates continue to shape both the strategic direction and agency progress. For example, demand for improved business performance is underscored by reporting requirements and the need for increased financial transparency. The goal of reducing spending is also linked to efforts like the adoption of shared services and plans to address the auditability of financial systems. Ongoing budget pressure has increased the tendency to take an incremental approach to streamlining and enhancing government business operations.

 

Agencies making the largest investments in modernization efforts include the Department of Defense, Treasury, and Veterans Affairs. Going forward, agencies are looking to continue advancing business system capabilities through mobile access and business analytics. The role of cloud environments is expected to expand, as only a small percentage of systems have completed migration to cloud environments. Further exploration of the government initiatives targeting modernization of business systems is available in the recent Federal Industry Analysis report Federal Enterprise Business Systems, 2015.

Originally published for Federal Industry Analysis: Analysts Perspectives Blog. Stay ahead of the competition by discovering more about GovWinIQ. Follow me on Twitter @FIAGovWin.

 

Emerging Federal Technology Markets – Areas to Watch

Can technological innovation drive federal IT investments, even in the midst of budget pressures? Absolutely. This is what we explore in our latest report on Emerging Federal Technology Markets.

Under long-term pressure to “do more with less,” federal agencies are leveraging current trends in federal IT – cloud, wireless networks, IPv6, and virtualization – to gradually adopt new technologies that enable cost savings and the more efficient use of IT resources. Some of my colleagues and I took a look at how these and other technologies are shaping federal IT investments today and in the future.

Federal Investments in Foundation Technologies will Drive Emerging Markets

Technological change and proliferation run the gamut when it comes to impacting federal agencies. Sensor technologies are being introduced to track facility energy consumption and enhance physical security, while software-defined infrastructure is being explored to eliminate bottlenecks that result from stovepiped systems and the growing volume of data. Machine learning technology is being tested to create “smart” networks that rely less on person-based administration. Tying it all together are predictive analytics, which agencies are using for a growing number of purposes, from forecasting network performance and enhancing cyber security to ferreting out waste, fraud, and abuse. The result is that today’s investments set the stage for tomorrow’s capabilities. (See graphic below.)


Key market factors shaping the federal IT landscape

Some of the major drivers and key findings from our research include:

  • The drive to leverage sensor technologies and the data analytics that these enable is a major force behind agency network modernization efforts like the DoD’s Joint Information Environment. The pace of sensor-based innovation is tied to the success of these efforts.
  • Software-Defined Infrastructure (SDI) is more pervasive than generally believed, particularly at agencies with highly evolved Infrastructure-as-a-Service offerings.
  • Federal interest in SDI is not hype; it is a genuine trend with a growing number of current and planned use examples across federal agencies.
  • The use of predictive analytics programs has expanded significantly across the federal government since FY 2010, making it a maturing, though niche, technology that is expected to have continued strong growth.
  • The inclusion of predictive analytics as an offering on GSA’s Alliant 2 and, potentially, NS2020 government-wide contracts should help it become regarded less as an exotic technology and more as a standardized commercial-off-the-shelf solution.

The modernization of agency IT environments is opening the doors to future investment in emerging technologies.  The convergence of agencies’ work on expanding wireless networks, deploying standardized, commodity hardware, and engineering Internet Protocol-based transport networks is enabling the introduction of new sensor technologies and software-based capabilities. The impact of emerging technology adoption will be to introduce greater efficiency and security to agency IT environments. 

To get our full perspective on Emerging Federal Technology Markets, read the full report.

---
Originally published for Federal Industry Analysis: Analysts Perspectives Blog. Stay ahead of the competition by discovering more about GovWin FIA. Follow me on Twitter @GovWinSlye.

Predictive Analytics Use at the Department of Defense

 

Back in September, an organization at the National Defense University called the Center for Technology and National Security Policy (CTNSP) published a research paper entitled Policy Challenges of Accelerating Technological Change: Security Policy and Strategy Implications of Parallel Scientific Revolutions. Looking past the long title, one finds an in-depth consideration of the implications of emerging technologies for U.S. national security and the DoD. Considering that the CTNSP is part of the defense establishment, I believe it is worth taking a few minutes to examine what the authors say, particularly since their comments fit seamlessly with the recently announced Defense Innovation Initiative (DII). Papers like this can point to areas of investment, and in a time of falling budgets any insight is welcome.

The report discusses more subjects than I can cover here, so in today’s post I’ll zero in on its comments about big data analytics. Use of big data analytics in the DoD is nothing new. In fact, based on recent contract spending data (see chart below), we can see that defense customers spent nearly $138 million on big data analytics over the five years from fiscal 2010 to fiscal 2014.

 

Big data analytics in this context are defined as advanced analytics programs offering visualization and modeling capabilities that enable statistics-based prediction and forecasting. Think Mathematica, MATLAB, Splunk, Statistica, Tableau, and the like, and you have an idea of the programs included in this data.
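As a rough illustration of the statistics-based forecasting these tools provide, the sketch below fits a simple linear trend to a handful of made-up yearly values and projects one year ahead. The numbers are invented for illustration; they are not the contract data discussed in this post.

```python
# Toy trend-line forecast: ordinary least-squares fit, projected one year out.
years = [2010, 2011, 2012, 2013, 2014]
spend = [22.0, 30.0, 42.0, 25.0, 19.0]  # hypothetical $M per fiscal year

n = len(years)
mean_x = sum(years) / n
mean_y = sum(spend) / n

# Ordinary least-squares slope and intercept.
numerator = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, spend))
denominator = sum((x - mean_x) ** 2 for x in years)
slope = numerator / denominator
intercept = mean_y - slope * mean_x

forecast_2015 = slope * 2015 + intercept
print(f"Trend-line forecast for FY 2015: ${forecast_2015:.1f}M")
```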

According to the CTNSP report, employing these kinds of analytics on a vastly greater scale will be the key to controlling and exploiting the data that defense organizations will be gathering from the expansion of unmanned systems, robotics, and the Internet Protocol-enabled “Internet of Things.” The uses for such analytics include the analysis of intelligence data, cyber security, and the transition to a “health maintenance-based, rather than a disease-based medical model,” that will enhance the operational readiness of U.S. military personnel. The report’s recommendations have a clear implication – that the DoD should greatly ramp up its spending on predictive analytics and the training of its personnel to use them.
In recent years, however, just the opposite has been taking place. Examining the data presented above from the perspective of spending per fiscal year (see chart below), we see that defense spending on predictive analytics (PA) peaked at $42 million in FY 2012 and has declined since.

Undoubtedly the recent pressure placed on DoD’s budget by sequestration is the primary reason for reduced spending on PA. The question is whether this trend will continue. My guess would be no, for the simple reason that the DoD cannot afford to neglect developing its PA capabilities. To do so at a time when more data is coming at defense analysts than ever before would be folly. Add the increasing use of automated systems to the mix and the answer is obvious – the DoD must spend more on PA. Currently the department is in a period of retrenchment as it struggles with new budget realities. Once this retrenchment has run its course, defense customers are likely to turn their attention back to acquiring PA capabilities. The DII points the way forward in this respect, and for industry partners it’s a welcome signpost of spending ahead.

 

Observations from TTC’s Internet of Things for Defense Symposium

The Department of Defense and U.S. federal law enforcement community are increasingly interested in what has come to be called the “Internet of Things.”  Labeled the “IoT” for short, the Internet of Things consists of a growing network of small, low-power, low-bandwidth, low-cost sensors and devices that are connected to networks and which send and receive data.  Think of the sensors that automatically turn on room lights or flush toilets and you have an idea of some of the uses for IoT technology.  Additional uses for IoT technology, however, are about as varied as one can imagine.  For example, the General Services Administration recently awarded a contract to IBM to outfit its facilities with sensor technology that will allow more efficient monitoring of energy use.  Similarly, tiny sensors can be used to monitor the performance of jet engines, or of just about any other structure in the world.

As many of the speakers at the Technology Training Corporation’s IoT symposium discussed, the DoD is eyeing sensor technology to determine how it might best be used.  There are even several use cases already in progress.  Rear Admiral Scott Jerabek, Director of Command, Control, Communications and Computer Systems at U.S. Southern Command, kicked off the symposium by listing a few of these uses in his area of responsibility.  Noting that USSOUTHCOM employs IoT technology in its GeoShare program for humanitarian assistance, Jerabek also explained that the Navy is investigating a “nano-satellite network,” in addition to developing a Deep Sea Web of low-observable, wide-area capabilities to track dark targets at sea.

Subsequent speakers, like Air Force CTO Frank Konieczny, detailed multiple other uses for IoT technology that the defense establishment is considering. These include:

  • Base Facilities Maintenance – trash pickup, light replacement, food replenishment
  • Vehicle management – maintenance prediction, location tracking
  • Secured, smart workplace – presence for workers integrated with facilities management
  • Logistics and transportation – inventory/tracking, automated assembly/packing, geo-location in supply chain
  • Robotics – autonomous drones and vehicles, sensor based maneuvering

Needless to say, the expansion of networks to everyday items carries with it tremendous risks as well as benefits.  Multiple speakers mentioned the need to build security protocols into IoT devices so that they are resistant to hacking.  Enhanced network security will be necessary as well, given the vast expansion of data that networks will be handling.  Advanced analytics will be required for continuous monitoring, and beyond that, analytics will need to be deployed to make sense of all the data and to support decisions based on it.  In short, IoT will render the already big data world in which we live even bigger.

Herein lie other challenges.  Chief Warrant Officer 5 Ricardo Pina, Chief Technology Officer and Senior Technical Advisor to the Army CIO/G-6, pointed out that an organization like the Army currently does not have the network infrastructure required to handle the flow of data that an Army IoT would create.  This is one of the primary factors driving the Army’s modernization of its networks using multi-protocol label switching (MPLS) technology.  A standardized protocol will be required to enable seamless integration and use of IoT, and the DoD is betting that this standard will be Internet Protocol.  Effectively, the new IP-based Joint Information Environment will enable the DoD to vastly expand its use of IoT technologies.  This expansion will in turn drive investment in the analytics and any attendant services required for IoT implementation.  Vendors should therefore take note.  The business opportunity in the area of IoT is growing, particularly among informed defense customers.

For more information on upcoming symposia, visit the Technology Training Corporation. I’ll see you at the one on Software-Defined Networking scheduled for December 9-10, 2014.

 
