APMG International EBDP Certification Exam Syllabus

To achieve the professional designation of APMG International Enterprise Big Data Professional from APMG International, candidates must pass the EBDP exam with at least the minimum cut-off score. For those who wish to pass the certification exam with a good percentage, the following reference document details what should be included in APMG International Enterprise Big Data Professional exam preparation.

The APMG International EBDP Exam Summary, Body of Knowledge (BOK), Sample Question Bank and Practice Exam provide the basis for the real APMG International Certified Enterprise Big Data Professional (EBDP) exam. We have designed these resources to help you get ready for the exam. If you have decided to become a certified professional, we suggest you take authorized training and prepare with our online premium APMG International Enterprise Big Data Professional Practice Exam to achieve the best result.

APMG International EBDP Exam Summary:

Exam Name: APMG International Enterprise Big Data Professional
Exam Code: EBDP
Exam Fee: USD 299
Exam Duration: 90 Minutes
Number of Questions: 60
Passing Score: 65%
Format: Multiple Choice Questions
Books / Trainings: Find a training provider
Schedule Exam: Book an exam
Sample Questions: APMG International Enterprise Big Data Professional Exam Sample Questions and Answers
Practice Exam: APMG International Certified Enterprise Big Data Professional (EBDP) Practice Test

APMG International Enterprise Big Data Professional Syllabus Topics:


Big Data Key Concepts

Recall key terms and definitions relating to Big Data. Specifically, to recall:
- The definition of Big Data
- The names of the four characteristics of Big Data
- The names of the two classes of machine learning and the techniques commonly associated with them:
  • Supervised - classification and regression
  • Unsupervised - clustering and correlation
Understand the origins of Big Data and the characteristics of its key concepts. Specifically, to understand:
- The origins of Big Data and the characteristics of the three Big Data development phases:
  • Phase 1
  • Phase 2
  • Phase 3
- The four characteristics of Big Data and how they distinguish Big Data from traditional data analysis:
  • Volume
  • Velocity
  • Variety
  • Veracity
- The four forms of pattern identification:
  • Analysis
  • Analytics
  • Business intelligence
  • Big Data
- The purpose of the different types of analytics:
  • Descriptive
  • Diagnostic
  • Predictive
  • Prescriptive
- The function of metadata in Big Data environments
- The characteristics of the three data types:
  • Structured
  • Unstructured
  • Semi-structured
- The role of Hadoop in distributed storage and distributed processing
- The two classes of machine learning, and recognize examples of each (see the sketch after this list):
  • Supervised
  • Unsupervised
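
To make the distinction concrete, here is a minimal, illustrative sketch in Python (scikit-learn and the toy data are assumptions chosen for illustration, not part of the syllabus). Supervised learners are trained on labelled data; unsupervised learners find structure without labels.

    # Minimal sketch: supervised vs. unsupervised learning (illustrative only)
    from sklearn.linear_model import LinearRegression
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.cluster import KMeans

    X = [[1], [2], [3], [4], [5], [6]]

    # Supervised: the training data carries known labels or targets.
    y_class = [0, 0, 0, 1, 1, 1]             # classification: discrete labels
    y_reg = [1.1, 2.0, 2.9, 4.2, 5.1, 5.8]   # regression: continuous targets
    print(DecisionTreeClassifier().fit(X, y_class).predict([[2.5]]))
    print(LinearRegression().fit(X, y_reg).predict([[7]]))

    # Unsupervised: no labels; the algorithm discovers structure itself.
    print(KMeans(n_clusters=2, n_init=10).fit(X).labels_)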

The Big Data Framework

Recall terms and key facts about the Big Data Framework. Specifically, to recall:
- The names of the six capabilities of the Big Data Framework
Understand the structure of the Big Data Framework. Specifically, to understand:
- The relevance of each of the six Big Data Framework capabilities in establishing a Big Data organization
- The different levels of the Big Data maturity model:
  • Level 1 - Analytically Impaired
  • Level 2 - Localized Analytics
  • Level 3 - Analytical Operation
  • Level 4 - Analytical Enterprise
  • Level 5 - Data Driven Enterprise

Big Data Strategy

Recall key facts about the Big Data Strategy. Specifically, to recall:
- The five steps for formulating a Big Data Strategy and their sequence
Understand how to formulate a Big Data Strategy and the activities and techniques involved. Specifically, to understand:
- The six business drivers influencing the need for a Big Data strategy and how Big Data can be used to generate a competitive advantage
- The Prioritization Matrix
  • Its purpose
  • Its structure
- The activities involved in each of the five steps for formulating a Big Data Strategy:
  • Step 1 - Define business objectives
  • Step 2 - Execute current state assessment
  • Step 3 - Identify and prioritize Use Cases
  • Step 4 - Formulate a Big Data Roadmap
  • Step 5 - Embed through Change Management

Big Data Architecture

Recall terms and key facts about Big Data Architecture. Specifically, to recall:
- What a reference architecture is and its purpose
- Key features about the structure of the NIST Big Data reference architecture:
  • The overall structure (5 logical roles and 2 dimensions)
  • The names of the roles
  • The names of the dimensions
  • How information flows between the different roles

- The names of the core components in a Hadoop Architecture:

  • NameNode
  • MapReduce
  • SlaveNode
  • JobTracker
  • HDFS
Understand the high-level principles and design elements of contemporary Big Data Architecture. Specifically, to understand:
- The benefits of using a Big Data reference architecture
- The functions and activities associated with the logical roles in the reference architecture:
  • System Orchestrator
  • Data Provider
  • Big Data Application Provider
  • Big Data Framework Provider
  • Data Consumer

- The difference between local and distributed storage and processing
- The three types of Big Data storage systems for massive data:

  • Direct Attached Storage (DAS)
  • Network Attached Storage (NAS)
  • Storage Area Network (SAN)

- The storage mechanisms for Big Data:

  • File systems
  • NoSQL databases
  • Parallel programming models

- The Big Data analysis architectures:

  • Real-time analysis
  • Off-line analysis

- The function of Hadoop in Big Data environments
- The role of the following Hadoop components (see the sketch after this list):

  • NameNode
  • MapReduce
  • SlaveNode
  • JobTracker
  • HDFS
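
As a rough illustration of the programming model behind these components, the following pure-Python sketch simulates MapReduce's map, shuffle and reduce phases for a word count. It is a stand-in under stated assumptions, not a real Hadoop job (which would typically be written in Java and run over HDFS).

    # Simulated MapReduce word count (illustrative; not the Hadoop API)
    from collections import defaultdict

    def map_phase(line):
        # Map: emit one (word, 1) pair per word in the input line.
        return [(word.lower(), 1) for word in line.split()]

    def shuffle(pairs):
        # Shuffle: group values by key, as the framework does between phases.
        groups = defaultdict(list)
        for key, value in pairs:
            groups[key].append(value)
        return groups

    def reduce_phase(groups):
        # Reduce: aggregate each key's values into a final count.
        return {key: sum(values) for key, values in groups.items()}

    lines = ["big data big insight", "big value"]
    pairs = [pair for line in lines for pair in map_phase(line)]
    print(reduce_phase(shuffle(pairs)))   # {'big': 3, 'data': 1, ...}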

Big Data Algorithms

Recall terms and key facts about Big Data Algorithms and Analysis Techniques. Specifically, to recall:

- What descriptive statistics are
- Key facts about correlation (a worked sketch follows this recall list):

  • What correlation is
  • The two types of variables used in correlation
  • Key facts about the Pearson correlation coefficient:
    - What it measures
    - Its value range
    - What a negative, positive or 0 value means

- Key facts about classification:

  • What it does
  • What form of machine learning it is
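
As a worked illustration of the Pearson coefficient, the sketch below computes it directly from its definition; the study-hours/exam-score data are invented for the example.

    # Pearson correlation coefficient from its definition (illustrative data)
    from statistics import mean

    def pearson(x, y):
        mx, my = mean(x), mean(y)
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sx = sum((a - mx) ** 2 for a in x) ** 0.5
        sy = sum((b - my) ** 2 for b in y) ** 0.5
        return cov / (sx * sy)   # ranges from -1 through 0 to +1

    hours = [1, 2, 3, 4, 5]
    score = [52, 58, 63, 71, 78]
    print(round(pearson(hours, score), 3))   # close to +1: strong positive
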
Understand the algorithms and analysis techniques fundamental to Big Data. Specifically, to understand:
- For each type of descriptive statistic, understand what each statistical operation/distribution measures or shows:
  • Central tendency statistics
  • Dispersion statistics
  • Distribution Shapes
- The characteristics of skew:
  • Positive
  • Negative
- The reason why standardization is used in Big Data calculations
- Recognize and calculate examples of descriptive statistics (see the worked sketch at the end of this list)
- The characteristics of the different types of distribution shapes:
  • Frequency
  • Probability
  • Sampling
  • Normal
- Why the distribution shapes are important to Big Data and data science:
  • Probability
  • Sampling
  • Normal
  • Skew
- The implications of population, sample and bias for Big Data
- How correlations are used in Big Data and recognize examples of this
- The differences between correlation and regression
- Recognize examples of a classification algorithm
- The key characteristics of clustering:
  • What it does
  • What most clustering algorithms typically look at
- How outlier detection is used in the context of Big Data
- The key characteristics of each of the Visualization techniques and how each technique is used, with reference to examples:
  • Bar charts
  • Histograms
  • Scatter plots
  • Bi-plots
  • Box plots
  • Q-Q plots
  • Pie charts
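
As a small worked example of descriptive statistics, standardization and outlier flagging (the data set is invented for the purpose), consider:

    # Descriptive statistics and z-score standardization (illustrative data)
    from statistics import mean, median, stdev

    data = [12, 15, 15, 18, 21, 24, 90]   # note the unusually large value

    # Central tendency and dispersion
    print("mean:", mean(data), "median:", median(data),
          "stdev:", round(stdev(data), 2))

    # Standardization: z = (x - mean) / stdev puts variables measured on
    # different scales onto a common scale; large |z| flags potential outliers.
    m, s = mean(data), stdev(data)
    print("z-scores:", [round((x - m) / s, 2) for x in data])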

Big Data Processes

Recall key terms relating to the Big Data Processes. Specifically, to recall:
- The three different main processes that are used in Big Data and their main characteristics
- In which step in the data analysis process are the following tools/techniques typically used and how they are applied in that step:
  • Data identification graph
  • Data visualization techniques
  • Algorithms
Understand the characteristics, activities and techniques of the Big Data Processes. Specifically, to understand:
- The characteristics of the six types of problems that shape the business objectives of Big Data projects:
  • Descriptive
  • Exploratory
  • Inferential
  • Predictive
  • Causal
  • Mechanistic
- The importance of each step within the data analysis process and what occurs in each step (a cleansing sketch follows this topic):
  • Determine the business objective
  • Data identification
  • Data collection and sourcing
  • Data review
  • Data cleansing
  • Model building
  • Data processing
  • Communicating the results
- The importance of each step within the data governance process and what occurs in each step:
  • Develop data quality strategy
  • Review regulatory and privacy requirements
  • Develop data governance policies
  • Assign roles and responsibilities
- The importance of each activity within the data management process and what occurs in each activity:
  • Specify metrics and performance indicators
  • Monitor and manage enterprise data
  • Data improvement and validation
  • Communicate and educate on data management
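
By way of illustration only, the data cleansing step might look like the pandas sketch below; the column names and cleansing rules are assumptions made for the example, not part of the framework.

    # Illustrative data cleansing with pandas (hypothetical columns and rules)
    import pandas as pd

    df = pd.DataFrame({
        "customer_id": [1, 2, 2, 3, 4],
        "age": [34, None, None, 29, 270],   # missing and implausible values
        "country": ["NL", "nl", "nl", "DE", "DE"],
    })

    df = df.drop_duplicates(subset="customer_id")       # remove duplicate records
    df["country"] = df["country"].str.upper()           # normalize inconsistent codes
    df = df[df["age"].between(0, 120) | df["age"].isna()]  # drop implausible ages
    df["age"] = df["age"].fillna(df["age"].median())    # impute missing values
    print(df)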

Big Data Functions

Recall key terms relating to Big Data Functions. Specifically, to recall:
- The names of the five pillars of the Big Data Centre of Excellence and the key characteristics of each pillar:
  • Big Data Team
  • Big Data Lab
  • Proof of Concepts
  • Agile Methodology
  • Charging Models
Understand the benefits of the Big Data Centre of Excellence, the six organization success factors and the key roles in Big Data teams. Specifically, to understand:
- The benefits of a Big Data Centre of Excellence
- The typical responsibilities and skill sets of the key roles in Big Data teams:
  • Big Data Analyst
  • Big Data Scientist
  • Big Data Engineer

- The six organization success factors for Big Data

Artificial Intelligence

Recall key definitions and facts relating to Artificial Intelligence and Big Data. Specifically, to recall:
- The operational definition of intelligence according to the Turing test
- Key facts about cognitive analytics:
  • What cognitive analytics is
  • The two main features that differentiate cognitive analytics from other forms of analytics
Understand the key concepts of Artificial Intelligence and their importance to Big Data. Specifically, to understand:
- The role of rational agents in cognitive analytics
- The four essential capabilities of artificial intelligence:
  • Natural language processing
  • Knowledge representation
  • Automated reasoning
  • Machine learning

- Key characteristics about Deep Learning in artificial intelligence (a minimal sketch follows this list):

  • What Deep Learning is
  • Where it is predominantly used
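
As a minimal, non-authoritative sketch of the "depth" in Deep Learning, the NumPy forward pass below stacks several layers, each transforming the previous layer's representation; the weights are random rather than trained, so it shows only the structure.

    # Untrained forward pass through a small "deep" network (structure only)
    import numpy as np

    rng = np.random.default_rng(0)

    def layer(x, n_out):
        # One layer: linear transform followed by a ReLU activation.
        w = rng.normal(size=(x.shape[-1], n_out))
        return np.maximum(0, x @ w)

    x = rng.normal(size=(1, 8))            # e.g. 8 raw input features
    h1 = layer(x, 16)                      # first hidden representation
    h2 = layer(h1, 16)                     # deeper representation
    out = h2 @ rng.normal(size=(16, 1))    # final output (e.g. a score)
    print(out.shape)                       # (1, 1)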

Both APMG International and veterans who’ve earned multiple certifications maintain that the best preparation for an APMG International EBDP professional certification exam is practical experience, hands-on training and practice exams. This is the most effective way to gain an in-depth understanding of APMG International Enterprise Big Data Professional concepts. Understanding the techniques helps you retain that knowledge and recall it when needed.
