
MapR (HP) Hadoop Developer Certification Questions and Answers (Dumps and Practice Questions)



Question : You want to run Hadoop jobs on your development workstation for testing before you
submit them to your production cluster. Which mode of operation in Hadoop allows you to
most closely simulate a production cluster while using a single machine?

1. Run all the nodes in your production cluster as virtual machines on your development workstation.
2. Run the hadoop command with the -jt local and the -fs file:/// options.
3. Run the DataNode, TaskTracker, NameNode and JobTracker daemons on a single machine.
4. Run simldooop, the Apache open-source software for simulating Hadoop clusters.

Correct Answer :
Explanation:
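For context: running the NameNode, DataNode, JobTracker and TaskTracker daemons together on a single machine is Hadoop's pseudo-distributed mode, which most closely mirrors a real cluster. A minimal sketch of the classic MRv1-era configuration (the localhost addresses and ports shown are illustrative defaults, not values from this document):

```xml
<!-- core-site.xml: point HDFS clients at a NameNode on this machine -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
```

```xml
<!-- mapred-site.xml: point job submission at a JobTracker on this machine -->
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
</configuration>
```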




Question : Assuming default settings, which best describes the order of data provided to a reducer's reduce() method?

1. The keys given to a reducer aren't in a predictable order, but the values associated with those keys always are.
2. Both the keys and values passed to a reducer always appear in sorted order.
3. Neither keys nor values are in any predictable order.
4. The keys given to a reducer are in sorted order but the values associated with each key are in no predictable order

Correct Answer :
Explanation: The Reducer has three primary phases:
1. Shuffle: the Reducer copies the sorted output from each Mapper across the network using HTTP.
2. Sort: the framework merge-sorts the Reducer inputs by key (since different Mappers may have output the same key). The shuffle and sort phases occur simultaneously, i.e. outputs are merged while they are being fetched.
Secondary sort: to achieve a secondary sort on the values returned by the value iterator, the application should extend the key with the secondary key and define a grouping comparator. The keys will be sorted using the entire key, but will be grouped using the grouping comparator to decide which keys and values are sent in the same call to reduce.
3. Reduce: in this phase the reduce(Object, Iterable, Context) method is called for each <key, (collection of values)> pair in the sorted inputs. The output of the reduce task is typically written to a RecordWriter via TaskInputOutputContext.write(Object, Object). The output of the Reducer is not re-sorted.
Reference: the org.apache.hadoop.mapreduce.Reducer class Javadoc
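The key-ordering contract above can be sketched in plain Python. This is an illustrative simulation of the default shuffle/sort behaviour, not the Hadoop API (simulate_shuffle_sort is a made-up name):

```python
from itertools import groupby
from operator import itemgetter

def simulate_shuffle_sort(mapper_outputs):
    # mapper_outputs: (key, value) pairs collected from all map tasks.
    # The framework merge-sorts reducer input by key only; the relative
    # order of values under one key is NOT guaranteed by default.
    sorted_pairs = sorted(mapper_outputs, key=itemgetter(0))
    for key, group in groupby(sorted_pairs, key=itemgetter(0)):
        yield key, [value for _, value in group]

# Example: word-count pairs emitted by two "mappers"
pairs = [("b", 1), ("a", 1), ("b", 1), ("a", 1), ("c", 1)]
result = list(simulate_shuffle_sort(pairs))
# Keys arrive at the reducer in sorted order: "a", then "b", then "c".
```

This mirrors why option 4 is the defensible choice: the framework sorts by key before calling reduce, but value order within a key is unspecified unless a secondary sort is set up.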




Question : Which HDFS command displays the contents of the file x in the user's HDFS home directory?

1. hadoop fs -ls x

2. hdfs fs -get x

3. hadoop fs -cat x

4. hadoop fs -cp x

Correct Answer :
Explanation:



Related Questions


Question : What are the features of the Hadoop framework?
1. Nodes talk to each other as little as possible
2. Computation happens where the data is stored
3. Access Mostly Uused Products by 50000+ Subscribers
4. All of the above


Question : How many data blocks does a MapTask work on at a time?
1. Multiple data blocks
2. A single block
3. Access Mostly Uused Products by 50000+ Subscribers
4. All of the above


Question : What happens when a running task fails in Hadoop?
1. Failed task data will be lost
2. the master will detect that failure and re-assign the work to a different node on the system
3. Access Mostly Uused Products by 50000+ Subscribers
4. 2 and 3 both are correct


Question : If a node appears to be running slowly, then .....
1. the master can redundantly execute another instance of the same task
2. The result from the first instance to finish will be used
3. Access Mostly Uused Products by 50000+ Subscribers
4. 1 and 2 are correct


Question : What are the core components of the Hadoop framework?
1. HDFS (Hadoop Distributed File System)
2. MapReduce
3. Access Mostly Uused Products by 50000+ Subscribers
4. 1 and 2 are both correct


Question : Can Hadoop MapReduce code be written in a language other than Java?
1. True
2. False
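For context: via Hadoop Streaming, any executable that reads key/value lines on stdin and writes them to stdout can act as the mapper or reducer, so MapReduce logic is not limited to Java. A minimal word-count sketch in Python (the function names are illustrative, and the submission command in the comment uses placeholder paths):

```python
def map_line(line):
    # Streaming mapper logic: emit "word<TAB>1" for each word on the line.
    return [f"{word}\t1" for word in line.split()]

def reduce_pairs(sorted_lines):
    # Streaming reducer logic: sum the counts per word. Hadoop Streaming
    # delivers the reducer's input lines already sorted by key.
    counts = {}
    for line in sorted_lines:
        word, n = line.rsplit("\t", 1)
        counts[word] = counts.get(word, 0) + int(n)
    return [f"{w}\t{c}" for w, c in sorted(counts.items())]

# A real job would be submitted with something like (paths are placeholders):
#   hadoop jar hadoop-streaming.jar -mapper mapper.py -reducer reducer.py \
#       -input in/ -output out/
```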