
The Path to Big Data Adoption

Published: October 10, 2018

Big Data | Information Technology

Volume 9 of the NIST Big Data Interoperability Framework describes the barriers to adopting big data systems, as well as the technologies that will help accelerate their current and future implementation.

To guide big data stakeholders in leveraging the best technologies, processes and knowledge in big data architecture, the NIST Big Data Public Working Group (NBD-PWG) was created to develop the NIST Big Data Interoperability Framework (NBDIF). The framework is being developed in three stages:

  1. Identify high-level NIST Big Data Reference Architecture (NBD-RA) key components, which are technology, infrastructure, and vendor agnostic;
  2. Define general interfaces between the NBD-RA components, with the goals of aggregating low-level interactions into high-level general interfaces and producing a set of white papers to demonstrate how the NBD-RA can be used;
  3. Validate the NBD-RA by building Big Data general applications through the general interfaces.

Currently in the second stage, the working group has published a second version of the volumes to demonstrate how the NBD-RA can be used. The latest volume, published in June 2018 and titled Volume 9: Adoption and Modernization, addresses the common barriers to big data development and explores project, organizational and technological maturity to help identify the strategies needed for implementing and modernizing big data systems and projects in organizations.

Recognizing that barriers to big data adoption are both non-technical and technical, the document identifies the top contenders in each category:

Non-technical Barriers

  • Lack of definition and product agreement
  • Budget and expensive licenses
  • Lack of established processes to go from proof-of-concept to development of systems
  • Shifting from centralized stewardship to a decentralized and granular model
  • Organizational maturity
  • Lack of workforce with key skill sets to handle complexity of software

Technical Barriers

  • Inconsistent metadata standards
  • Silos of data and access restriction
  • Legacy access methods
  • Proprietary, patented access
  • Integration with existing infrastructure
  • Security of systems
  • Cloud:
      ◦ Concerns over liabilities, security and performance
      ◦ Connectivity bandwidth
      ◦ Mesh, cell and internet network components

The document goes on to describe a set of technologies that have become commercialized and that pave the way for successful big data implementation. Of these, open source technologies, which are growing at a fast rate and decreasing in cost, have made it much easier for organizations to implement big data. In addition to open source, the following technologies and processes were identified as some of the more recent advances into commercialization that help big data adoption:

  • Infrastructure-as-a-Service
  • In-memory technologies, including data management systems, analytics and data grids
  • Access technologies and information retrieval techniques
  • Internal search
  • Stream processing technologies that provide the capability to cross-reference streaming data with data at rest
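
As a rough illustration of the last item (a minimal sketch, not drawn from the NBDIF itself; the sensor fields and reference table are hypothetical), cross-referencing streaming data with data at rest amounts to enriching each incoming event against a static lookup table:

```python
# "Data at rest": a static reference table, e.g. loaded from a warehouse.
# Sensor IDs, sites and limits here are made-up illustrative values.
sensors_at_rest = {
    "s-01": {"site": "plant-a", "max_temp": 80},
    "s-02": {"site": "plant-b", "max_temp": 95},
}

def enrich(stream):
    """Join each streaming event with its matching at-rest record."""
    for event in stream:
        ref = sensors_at_rest.get(event["sensor_id"])
        if ref is None:
            continue  # no reference record; skip (or route to a dead-letter queue)
        yield {
            **event,
            "site": ref["site"],
            "over_limit": event["temp"] > ref["max_temp"],
        }

# A small batch standing in for a live stream of events.
incoming = [
    {"sensor_id": "s-01", "temp": 85},
    {"sensor_id": "s-02", "temp": 90},
]
alerts = [e for e in enrich(incoming) if e["over_limit"]]
```

Production stream processors apply the same join continuously and at scale; the point of the sketch is only the pattern of combining moving data with stored context.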

Lastly, the volume attempts to forecast future big data trends by observing which technologies are likely to mature in the next five or so years and have a significant impact on big data advancements. Of those, high-performance message infrastructure, search-based analysis and predictive models are anticipated to mature in the 2017-2020 time frame. Meanwhile, IoT, the semantic web (particularly in cloud deployments), text analysis and integration are projected to mature in the 2020-2025 time period. With regard to integration, the demand for more interfaces flexible enough to handle various types of data will likely be met by the maturing of data APIs, container frameworks and metadata standards.