
IBM Certified Data Architect - Big Data Certification Questions and Answers (Dumps and Practice Questions)



Question : Which of the following statements is TRUE regarding cloud applications?


1. Migrating a legacy application to the cloud is a simple solution to drive down cost

2. Architecting and deploying a scalable cloud application requires a private cloud implementation

3. Access Mostly Used Products by 50000+ Subscribers

4. Leveraging a private vs. public cloud may result in sacrificing some of the core advantages of cloud computing


Correct Answer : Get Latest Questions and Answers :
Explanation: Private clouds rarely make sense for small businesses, but in large and even medium-sized organizations, IT teams can virtualize parts of their infrastructure and run their business processes and compute resources in a private cloud. As the concept matures, the idea is to move everything that needs more flexibility to the cloud. Choosing a private over a public cloud, however, may mean sacrificing some of the core advantages of cloud computing.





Question : In the Hadoop framework there are thousands of nodes working as DataNodes. While data is being put into the cluster, the NameNode decides on which node and which rack the data should be copied. Which of the following helps the NameNode find the correct node in a rack?
1. The admin has to pre-configure it on the NameNode

2. The ResourceManager will help

3. Access Mostly Used Products by 50000+ Subscribers

4. YARN Cluster Manager

Correct Answer : Get Latest Questions and Answers :
Explanation: Rack Awareness

Hadoop components are rack-aware. For example, HDFS block placement will use rack awareness for fault tolerance by placing one block replica on a different rack. This provides data
availability in the event of a network switch failure or partition within the cluster.

Hadoop master daemons obtain the rack id of the cluster slaves by invoking either an external script or a Java class, as specified in the configuration files. Whichever is used for topology, the output must adhere to the java org.apache.hadoop.net.DNSToSwitchMapping interface. The interface expects a one-to-one correspondence to be maintained, with topology information in the format '/myrack/myhost', where '/' is the topology delimiter, 'myrack' is the rack identifier, and 'myhost' is the individual host. Assuming a single /24 subnet per rack, one could use the format '/192.168.100.0/192.168.100.5' as a unique rack-host topology mapping.

To use the Java class for topology mapping, the class name is specified by the net.topology.node.switch.mapping.impl parameter in the configuration file. An example, NetworkTopology.java, is included with the Hadoop distribution and can be customized by the Hadoop administrator. Using a Java class instead of an external script has a performance benefit in that Hadoop doesn't need to fork an external process when a new slave node registers itself.

If an external script is implemented, it is specified with the net.topology.script.file.name parameter in the configuration files. Unlike the Java class, the external topology script is not included with the Hadoop distribution and must be provided by the administrator. Hadoop sends multiple IP addresses as arguments when forking the topology script. The number of IP addresses sent to the script is controlled with net.topology.script.number.args and defaults to 100. If net.topology.script.number.args were changed to 1, a topology script would be forked for each IP submitted by DataNodes and/or NodeManagers.
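To make the script contract concrete, here is a minimal sketch of such a topology script in Python. The one-/24-subnet-per-rack scheme and the fallback rack name are assumptions for illustration; Hadoop only requires that the script print one rack path per input address, in order, on stdout.

```python
#!/usr/bin/env python3
# Hypothetical topology script for net.topology.script.file.name.
# Hadoop invokes it with one or more IP addresses/hostnames as arguments
# (up to net.topology.script.number.args at a time) and reads one rack
# path per line from stdout.
import sys

DEFAULT_RACK = "/default-rack"  # assumed fallback, not a Hadoop constant

def rack_for_ip(ip):
    """Map an IPv4 address to a rack path, assuming one /24 subnet per rack."""
    parts = ip.split(".")
    if len(parts) != 4:
        return DEFAULT_RACK  # hostnames or malformed input fall back
    # e.g. 192.168.100.5 -> /192.168.100.0
    return "/" + ".".join(parts[:3]) + ".0"

if __name__ == "__main__":
    for arg in sys.argv[1:]:
        print(rack_for_ip(arg))
```

With this scheme, a DataNode at 192.168.100.5 would be reported in rack /192.168.100.0, matching the unique rack-host mapping described above.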





Question : You are working as a Chief Data Architect in a retail bank, and you are asked to do the following activities:

- Monitor each ATM transaction
- Monitor each online transaction

You also need to create a personalized model for each customer, using existing customer data as well as customer Facebook data, and the system should be able to learn from this and provide highly targeted promotions. Which of the following systems will help you implement this?
1. Apache Spark

2. Apache Hive

3. Access Mostly Used Products by 50000+ Subscribers

4. Netezza

Correct Answer : Get Latest Questions and Answers :
Explanation:


Related Questions


Question : Which of the following types of data are well supported on the IBM Big Data platform?


1. Semi-structured

2. Unstructured

3. Access Mostly Used Products by 50000+ Subscribers

4. 1 and 3

5. 1, 2 and 3


Question : A big data solution typically comprises these logical layers

A. Big data sources
B. Data massaging and store layer
C. Analysis layer
D. Consumption layer
1. A, B, C
2. B, C, D
3. Access Mostly Used Products by 50000+ Subscribers
4. A, B, C, D


Question : Which of the layers can do this:

An image might need to be converted so it can be stored in a Hadoop Distributed File System (HDFS) store or a Relational Database Management System (RDBMS) warehouse for further

1. Big data sources

2. Data massaging and store layer

3. Access Mostly Used Products by 50000+ Subscribers

4. Consumption layer


Question : The analysis layer reads the data digested by the data massaging and store layer. In some cases, the analysis layer accesses the data directly from the data source. Designing the analysis layer requires careful forethought and planning. Decisions must be made about how to manage the tasks to:

A. Produce the desired analytics
B. Derive insight from the data
C. Find the entities required
D. Locate the data sources that can provide data for these entities
E. Understand what algorithms and tools are required to perform the analytics.
1. A, B, C
2. C, D, E
3. Access Mostly Used Products by 50000+ Subscribers
4. A, B, C, D
5. A, B, C, D, E


Question : Visualization applications, human beings, business processes, or services can be considered under which logical layer of big data?
1. Big data sources

2. Data massaging and store layer

3. Access Mostly Used Products by 50000+ Subscribers

4. Consumption layer


Question : You are working at Arinika INC and need to consider all the characteristics of big data. Which of the following cannot be a characteristic of big data?
1. Data frequency and size

2. Software

3. Access Mostly Used Products by 50000+ Subscribers

4. Processing methodology