Big Data is a term used to describe the large amount of data in the networked, digitized, sensor-laden, information-driven world. The growth of data is outpacing scientific and technological advances in data analytics. Opportunities exist with Big Data to address the volume, velocity, and variety of data through new scalable architectures. To advance progress in Big Data, the NIST Big Data Public Working Group (NBD-PWG) is working to develop consensus on important, fundamental concepts related to Big Data. The results are reported in the NIST Big Data Interoperability Framework (NBDIF) series of volumes. This volume, Volume 1, contains a definition of Big Data and related terms necessary to lay the groundwork for discussions surrounding Big Data.
The potential for organizations to capture value from Big Data improves every day as the pace of the Big Data revolution continues to increase, but the level of value captured by companies deploying Big Data initiatives has not been equivalent across all industries. Most companies are struggling to capture more than a small fraction of the potential value available in Big Data initiatives. The healthcare and manufacturing industries, for example, have so far been less successful at taking advantage of data and analytics than industries such as logistics and retail. Effective capture of value will likely require organizational investment in change management strategies that support transformation of the culture and redesign of legacy processes. In some cases, the less-than-satisfying impacts of Big Data projects occur despite significant financial investments in new technology. Reports commonly point to a shortage of technical talent as one of the largest barriers to undertaking projects, an issue that is expected to persist into the future. This volume explores the adoption of Big Data systems and barriers to adoption; factors in the maturity of Big Data projects, of the organizations implementing those projects, and of the Big Data technology market; considerations for implementation and modernization of Big Data systems; and Big Data readiness.
Big Data is a term used to describe the large amount of data in the networked, digitized, sensor-laden, information-driven world. While opportunities exist with Big Data, the data can overwhelm traditional technical approaches and the growth of data is outpacing scientific and technological advances in data analytics. To advance progress in Big Data, the NIST Big Data Public Working Group (NBD-PWG) is working to develop consensus on important, fundamental concepts related to Big Data. The results are reported in the NIST Big Data Interoperability Framework series of volumes. This volume, Volume 6, summarizes the work performed by the NBD-PWG to characterize Big Data from an architecture perspective, presents the NIST Big Data Reference Architecture (NBDRA) conceptual model, and discusses the components and fabrics of the NBDRA.
This document summarizes interfaces that are instrumental for the interaction with Clouds, Containers, and High Performance Computing (HPC) systems to manage virtual clusters to support the NIST Big Data Reference Architecture (NBDRA). The REpresentational State Transfer (REST) paradigm is used to define these interfaces, allowing easy integration and adoption by a wide variety of frameworks. Big Data is a term used to describe extensive datasets, primarily in the characteristics of volume, variety, velocity, and/or variability. While opportunities exist with Big Data, the data characteristics can overwhelm traditional technical approaches, and the growth of data is outpacing scientific and technological advances in data analytics. To advance progress in Big Data, the NIST Big Data Public Working Group (NBD-PWG) is working to develop consensus on important fundamental concepts related to Big Data. The results are reported in the NIST Big Data Interoperability Framework (NBDIF) series of volumes. This volume, Volume 8, uses the work performed by the NBD-PWG to identify objects instrumental for the NIST Big Data Reference Architecture (NBDRA), which is introduced in the NBDIF: Volume 6, Reference Architecture.
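To make the REST style described above concrete, the sketch below assembles a hypothetical "create virtual cluster" request. The base URL, endpoint path, and payload fields are invented for this illustration; they are not interface definitions taken from the volume.

```python
import json

# Hypothetical service endpoint; not defined by the NBDIF volume.
BASE_URL = "https://example.org/api/v1"

def build_cluster_request(name, platform, nodes):
    """Assemble the URL and JSON body for a hypothetical
    'create virtual cluster' call (POST /cluster)."""
    url = f"{BASE_URL}/cluster"
    body = json.dumps({
        "name": name,          # cluster identifier
        "platform": platform,  # e.g. "cloud", "container", or "hpc"
        "nodes": nodes,        # requested node count
    })
    return url, body

url, body = build_cluster_request("analytics", "container", 4)
print(url)   # the resource URL the POST would target
```

A uniform, resource-oriented interface of this kind is what allows the same client code to target cloud, container, or HPC back ends, which is the integration property the REST choice is meant to provide.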
Big Data is a term used to describe the large amount of data in the networked, digitized, sensor-laden, information-driven world. While opportunities exist with Big Data, the data can overwhelm traditional technical approaches, and the growth of data is outpacing scientific and technological advances in data analytics. To advance progress in Big Data, the NIST Big Data Public Working Group (NBD-PWG) worked to develop consensus on important fundamental concepts related to Big Data. The results are reported in the NIST Big Data Interoperability Framework series of volumes. This volume, Volume 3, contains the original 51 Version 1 use cases gathered by the NBD-PWG Use Cases and Requirements Subgroup and the requirements generated from those use cases. The use cases are presented in their original and summarized form. Requirements, or challenges, were extracted from each use case and then summarized over all the use cases. These generalized requirements were used in the development of the NIST Big Data Reference Architecture (NBDRA), which is presented in Volume 6. During the development of Version 2 of the NBDIF, the Use Cases and Requirements Subgroup and the Security and Privacy Subgroup identified the need for additional use cases to strengthen the work of the NBD-PWG in Stage 3. The subgroup accepted additional use case submissions using the more detailed Use Case Template 2. The three additional use case submissions collected using Use Case Template 2 are presented and summarized in this volume.
The IOF's mission is to create a suite of ontologies intended to support digital manufacturing by facilitating cross-system integration both within the factory and across an enterprise, in commerce between suppliers, manufacturers, customers, and other trading partners, and throughout the various stages of the product life cycle. The IOF Core Ontology resides at the top of this suite from an architectural perspective and contains terms found in a number of operational areas of manufacturing. These common terms appear, or are anticipated to appear, in two or more of the ontologies of the suite. Additionally, because the architectural approach chosen by the IOF is to base all of its ontologies on a single foundational, or top-level, ontology, for which the IOF chose the Basic Formal Ontology (BFO), the Core Ontology contains a number of intermediate-level terms that derive from BFO and from which common industry terms are in turn derived. Such intermediate-level terms are most often domain independent, meaning they are found in other industries and domains, such as the banking, insurance, and healthcare industries, or in the sciences, such as the physics, chemistry, and biology domains. The IOF Core Ontology is developed and formalized as an ontology using both first-order logic and version 2 of the Web Ontology Language (OWL). The use of logic ensures that each term is defined in a way that is unambiguous to humans and can be processed by computers. All terms appearing in the ontology are reviewed and curated by a working group, and consensus is reached by validating usage in the context of manufacturing domain use cases.
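The layered derivation described above, in which common industry terms derive from intermediate-level IOF Core terms, which in turn derive from BFO, can be sketched as a chain of rdfs:subClassOf assertions. The IOF namespace and the term names below are placeholders chosen for this sketch; consult the published IOF Core Ontology and BFO for the actual identifiers.

```python
# Illustrative sketch of the layered subclass structure described above.
# The IOF namespace and term names are invented examples, not the
# published identifiers.
BFO = "http://purl.obolibrary.org/obo/"
IOF = "https://example.org/iof/core/"  # placeholder namespace

# (subject, predicate, object) triples, OWL 2 / RDF Schema style:
triples = [
    # an intermediate, domain-independent term derived from BFO
    (IOF + "PlannedProcess", "rdfs:subClassOf", BFO + "BFO_0000015"),
    # a common manufacturing term derived from the intermediate term
    (IOF + "ManufacturingProcess", "rdfs:subClassOf", IOF + "PlannedProcess"),
]

def superclasses(term):
    """Walk rdfs:subClassOf edges upward from a term to its roots."""
    chain = []
    current = term
    while True:
        parents = [o for s, p, o in triples
                   if s == current and p == "rdfs:subClassOf"]
        if not parents:
            break
        current = parents[0]
        chain.append(current)
    return chain

print(superclasses(IOF + "ManufacturingProcess"))
```

Walking the chain from the manufacturing term yields first the intermediate IOF Core term and then the BFO term, mirroring the two derivation layers the text describes.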
The objective of this paper is to show how the OAM can be used to realize seamless integration of product information, with an emphasis on assembly, throughout all phases of product design.
This publication provides a catalog of security and privacy controls for federal information systems and organizations and a process for selecting controls to protect organizational operations (including mission, functions, image, and reputation), organizational assets, individuals, other organizations, and the Nation from a diverse set of threats including hostile cyber attacks, natural disasters, structural failures, and human errors. The controls are customizable and implemented as part of an organization-wide process that manages information security and privacy risk. The controls address a diverse set of security and privacy requirements across the federal government and critical infrastructure, derived from legislation, Executive Orders, policies, directives, regulations, standards, and/or mission/business needs. The publication also describes how to develop specialized sets of controls, or overlays, tailored for specific types of missions/business functions, technologies, or environments of operation. Finally, the catalog of security controls addresses security from both a functionality perspective (the strength of security functions and mechanisms provided) and an assurance perspective (the measures of confidence in the implemented security capability). Addressing both security functionality and security assurance ensures that information technology products and the information systems built from those products using sound systems and security engineering principles are sufficiently trustworthy.
The purpose of this document is to define a NIST Cloud Computing Security Reference Architecture (NCC-SRA), a framework that: i) identifies a core set of Security Components that can be implemented in a Cloud Ecosystem to secure the environment, the operations, and the data migrated to the cloud; ii) provides, for each Cloud Actor, the core set of Security Components that fall under their responsibilities depending on the deployment and service models; iii) defines a security-centric formal architectural model that adds a security layer to the current NIST SP 500-292, "NIST Cloud Computing Reference Architecture"; and iv) provides several approaches for analyzing the collected and aggregated data.
Big Data is a term used to describe the large amount of data in the networked, digitized, sensor-laden, information-driven world. While opportunities exist with Big Data, the data can overwhelm traditional technical approaches and the growth of data is outpacing scientific and technological advances in data analytics. To advance progress in Big Data, the NIST Big Data Public Working Group (NBD-PWG) is working to develop consensus on important, fundamental concepts related to Big Data. The results are reported in the NIST Big Data Interoperability Framework (NBDIF) series of volumes. This volume, Volume 7, contains summaries of the work presented in the other six volumes, an investigation of standards related to Big Data, and an inspection of gaps in those standards.