Big Data

Self-contained Information Retention Format (SIRF)

The SIRF format makes long-term physical storage, cloud storage and tape-based containers effective and efficient means of preserving and securing digital information for many decades, even as the technology landscape keeps changing. This SNIA Technical Position specifies the Self-contained Information Retention Format (SIRF) Level 1 and its serialization for LTFS, CDMI and OpenStack Swift.
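
As a rough illustration of the "self-contained" idea, the following sketch builds a container whose catalog (object names, sizes, fixity digests) travels with the payload itself, so a future reader needs no external database to interpret it. The field names are assumptions made for illustration, not the SIRF Level 1 schema:

    import hashlib
    import json
    from datetime import datetime, timezone

    def make_catalog_entry(name: str, payload: bytes) -> dict:
        """Describe one preservation object: identity plus a fixity digest."""
        return {
            "name": name,
            "size": len(payload),
            "sha256": hashlib.sha256(payload).hexdigest(),
            "ingested": datetime.now(timezone.utc).isoformat(),
        }

    # The catalog is stored alongside the objects it describes, which is the
    # property that makes the container self-contained.
    objects = {"report.pdf": b"...payload bytes...", "data.csv": b"a,b\n1,2\n"}
    container = {
        "format": "illustrative-preservation-container/0.1",  # placeholder identifier for this sketch
        "catalog": [make_catalog_entry(n, p) for n, p in objects.items()],
    }
    print(json.dumps(container, indent=2))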

SIRF v1.0

Network Functions Virtualisation (NFV) Release 2; Protocols and Data Models; Network Service Descriptor File Structure Specification

The present document specifies the structure of the Network Service Descriptor (NSD) file archive and the naming conventions for the different files it contains, fulfilling the requirements specified in ETSI GS NFV-IFA 014 [1] for an NSD file structure.
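
As a loose illustration of what an NSD file archive involves, the sketch below packs a descriptor and accompanying metadata files into a zip container. The file and directory names here are placeholders; the normative structure and naming conventions are those defined in the present document:

    import zipfile

    # Placeholder file names for illustration only; ETSI GS NFV-SOL 007
    # defines the actual archive structure and naming conventions.
    files = {
        "TOSCA-Metadata/TOSCA.meta": (
            "TOSCA-Meta-File-Version: 1.0\n"
            "Entry-Definitions: Definitions/nsd.yaml\n"
        ),
        "Definitions/nsd.yaml": "# network service descriptor (placeholder)\n",
        "Files/ChangeLog.txt": "1.0: initial version\n",
    }

    with zipfile.ZipFile("nsd_archive.zip", "w", zipfile.ZIP_DEFLATED) as archive:
        for name, content in files.items():
            archive.writestr(name, content)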

ETSI GS NFV-SOL 007 V2.6.1

Information technology — Big data — Overview and vocabulary

This document provides a set of terms and definitions needed to promote improved communication and understanding of this area. It provides a terminological foundation for big data-related standards.

This document provides a conceptual overview of the field of big data, its relationship to other technical areas and standards efforts, and the concepts ascribed to big data that are not new to big data.

ISO/IEC 20546:2019

Autonomic Management and Control Intelligence for Self-Managed Fixed & Mobile Integrated Networks

AFI WG is the leading Standardization Group on Autonomic Management & Control (AMC) of Networks and Services, including Autonomic Networking, Cognitive Networking and Self-Management. Its mandate is to carry out the various Tasks and Activities described below:

  • Develop use cases and requirements for automated & autonomic (self-) management of networks and services;
  • Develop test frameworks and test specifications for autonomic network architectures (Self-Adaptive Networks);
  • Draft test specifications for Scenarios, Use Cases and Requirements for automated & autonomic (self-) management of networks and services;
  • Drive a 5G PoC Program/Project on GANA Autonomics in 5G Network Slices Creation, Autonomic & Cognitive Management & E2E Orchestration; with Closed-Loop (Autonomic) Service Assurance of IoT 5G Slices;
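
As a toy illustration of the closed-loop (autonomic) assurance named in the last item, the sketch below runs a minimal monitor-analyze-act cycle. It is a simplified stand-in for illustration, not the GANA model or its Decision Elements; the metric, threshold, and corrective action are all invented:

    import random
    import time

    def monitor() -> float:
        """Collect a network metric (here: a simulated packet-loss ratio)."""
        return random.uniform(0.0, 0.1)

    def analyze(loss: float, threshold: float = 0.05) -> bool:
        """Check whether the observed state violates the service objective."""
        return loss > threshold

    def plan_and_execute() -> None:
        """Apply a corrective action; a real control loop would reconfigure the network."""
        print("objective violated -> rerouting traffic (simulated)")

    # One pass per tick; an autonomic element runs this loop continuously.
    for _ in range(3):
        loss = monitor()
        if analyze(loss):
            plan_and_execute()
        else:
            print(f"loss {loss:.3f} within objective")
        time.sleep(0.1)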

Maintenance and evolution of existing ETSI standards on network and service management, including NGN management (ETSI 188 series: https://www.etsi.org/deliver/etsi_ts/188000_188099/), and maintenance of existing ETSI standards linked to AFI activities:

  • ETSI TS 103 194 on Scenarios, Use Cases and Requirements for Autonomic/Self-Managing Future Internet;
  • ETSI TS 103 195-1;
  • ETSI TS 103 195-2;
  • ETSI TS 103 195-3

GANA Model and its instantiations onto various types of network architectures and their associated management and control architectures:

  • ETSI TR 103 473 on GANA autonomics in BBF Architectures;
  • ETSI TR 103 404 on GANA autonomics in 3GPP Backhaul & Core Network Architectures;
  • ETSI TR 103 495 on GANA autonomics in Wireless Ad-hoc/Mesh Network Architectures;

ETSI White Paper no. 16: The Generic Autonomic Networking Architecture Reference Model for Autonomic Networking, Cognitive Networking and Self-Management of Networks and Services: http://www.etsi.org/images/files/ETSIWhitePapers/etsi_wp16_gana_Ed1_20161011.pdf

ETSI TC INT AFI WG

Supply Chain Reference Data Model (SCRDM)

The development by the United Nations Centre for Trade Facilitation and Electronic Business (UN/CEFACT) of Reference Data Models (RDMs) paves the way for this required new approach. The RDMs being developed by UN/CEFACT are applicable to specific segments of the e-business arena and are based on UN/CEFACT standardized business semantics. In summary, “an RDM provides a consolidated list of standardized data and processes for use in a particular business domain, which are globally understandable and exchangeable between parties using common standard data exchange structures.”
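
To make the RDM idea concrete, the sketch below models a hypothetical slice of a supply-chain reference data model and derives a party-to-party message as a subset of its standardized fields. Every class and field name here is invented for illustration and is not taken from the SCRDM itself:

    from dataclasses import dataclass, fields

    @dataclass
    class ConsignmentRDM:
        """Hypothetical slice of a supply-chain reference data model."""
        consignment_id: str
        consignor: str
        consignee: str
        gross_weight_kg: float
        declared_value: float

    # A message exchanged between two parties reuses a subset of the RDM's
    # standardized fields instead of inventing its own semantics.
    TRANSPORT_SUBSET = {"consignment_id", "consignor", "consignee", "gross_weight_kg"}

    def derive_message(record: ConsignmentRDM, subset: set) -> dict:
        return {f.name: getattr(record, f.name) for f in fields(record) if f.name in subset}

    msg = derive_message(
        ConsignmentRDM("CN-001", "Acme Ltd", "Globex BV", 120.5, 9800.0),
        TRANSPORT_SUBSET,
    )
    print(msg)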
 
CEFACT/DEC/2018

OASIS Context Server (CXS) TC

The OASIS Context Server (CXS) TC was chartered to create specifications for a Context Server (also known as a Customer Data Platform, or CDP; see below) as a core technology for enabling the delivery of personalized user experiences. The goal is to assist organizations that currently struggle to create and deliver consistent personalized experiences across channels, markets, and systems. The Context Server (aka CDP) will simplify management, integration, and interoperability between solutions providing services like Web Content Management, CRM, Big Data, Machine Learning, Digital Marketing, and Data Management Platforms. TC members are producing a detailed list of use cases, a domain model, a REST API, and a reference implementation that serves as a real-world example of how the Context Server standard can be used.
 
Relation to Customer Data Platforms: since the OASIS Context Server TC was established, the term Customer Data Platform (CDP) has emerged and can be used interchangeably with Context Server. To reflect this, the specification produced by this TC has been renamed the Customer Data Platform specification.
 
Relation to Apache Unomi: the reference implementation of the Customer Data Platform specification is produced as part of the Apache Unomi project.
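
As a rough sketch of how a client might call such a Context Server over REST, the snippet below posts a context request and reads back profile properties. The endpoint path, port, and payload fields are assumptions modeled loosely on Apache Unomi and are not the normative API:

    import json
    from urllib import request

    # Assumed endpoint and fields for illustration; consult the CDP
    # specification and Apache Unomi documentation for the real API.
    payload = {
        "sessionId": "session-1234",
        "requiredProfileProperties": ["firstName", "segments"],
        "events": [{"eventType": "view", "scope": "example-site"}],
    }
    req = request.Request(
        "http://localhost:8181/context.json",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:  # assumes a server is running locally
        context = json.load(resp)
    print(context.get("profileProperties"))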

Hybrid Cloud Considerations for Big Data and Analytics

Hybrid Cloud Considerations for Big Data and Analytics is a companion guide to the CSCC's Cloud Customer Architecture for Big Data and Analytics.
 
Today, the majority of big data and analytics use cases run on hybrid cloud computing infrastructure. A hybrid cloud combines on-premises and local cloud resources with one or more dedicated clouds and one or more public clouds, increasing the scalability and computational power available for big data and analytics.
 
This whitepaper summarizes what hybrid cloud is, explains why it is important in the context of big data and analytics, and discusses implementation considerations.
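
As a toy example of one hybrid-cloud trade-off, the sketch below routes workloads between environments using invented criteria (data sensitivity and the need for elastic capacity). Real placement decisions also weigh cost, latency, and governance:

    from dataclasses import dataclass

    @dataclass
    class Workload:
        name: str
        data_sensitivity: str  # "regulated" or "public" (illustrative labels)
        burst: bool            # needs elastic capacity beyond on-premises limits

    def place(w: Workload) -> str:
        """Keep regulated data on-premises; burst elastic jobs to a public cloud."""
        if w.data_sensitivity == "regulated":
            return "on-premises"
        return "public-cloud" if w.burst else "dedicated-cloud"

    for w in [
        Workload("customer-pii-etl", "regulated", burst=False),
        Workload("clickstream-training", "public", burst=True),
    ]:
        print(w.name, "->", place(w))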

Hybrid Cloud Considerations for Big Data and Analytics

Cloud Customer Architecture for Big Data and Analytics V2.0

Cloud Customer Architecture for Big Data and Analytics describes the architectural elements and cloud components needed to build out big data and analytics solutions.
 
Big data analytics and cloud computing are a top priority for CIOs. Harnessing the value and power of big data and cloud computing can give your company a competitive advantage, spark new innovations, and increase revenue. Many companies are experimenting and iterating with different cloud configurations as a way to understand and refine requirements for their big data analytics solutions without upfront capital investment.
 
This paper includes proven architecture patterns that have been deployed in successful enterprise projects and a description of capabilities offered by cloud providers.

Cloud Customer Architecture for Big Data and Analytics V2.0

NIST Big Data Interoperability Framework: Volume 1, Definitions

Big Data is a term used to describe the large amount of data in the networked, digitized, sensor-laden, information-driven world. The growth of data is outpacing scientific and technological advances in data analytics. Opportunities exist with Big Data to address the volume, velocity and variety of data through new scalable architectures. To advance progress in Big Data, the NIST Big Data Public Working Group (NBD-PWG) is working to develop consensus on important, fundamental concepts related to Big Data. The results are reported in the NIST Big Data Interoperability Framework (NBDIF) series of volumes. This volume, Volume 1, contains a definition of Big Data and related terms necessary to lay the groundwork for discussions surrounding Big Data.

NIST Big Data Interoperability Framework: Volume 2, Big Data Taxonomies [Version 2]

Big Data is a term used to describe the large amount of data in the networked, digitized, sensor-laden, information-driven world. While opportunities exist with Big Data, the data can overwhelm traditional technical approaches and the growth of data is outpacing scientific and technological advances in data analytics. To advance progress in Big Data, the NIST Big Data Public Working Group (NBD-PWG) is working to develop consensus on important, fundamental concepts related to Big Data. The results are reported in the NIST Big Data Interoperability Framework (NBDIF) series of volumes. This volume, Volume 2, contains the Big Data taxonomies developed by the NBD-PWG. These taxonomies organize the reference architecture components, fabrics, and other topics to lay the groundwork for discussions surrounding Big Data.