
MapR (HP) Hadoop Developer Certification Questions and Answers (Dumps and Practice Questions)



Question :

Select the correct code snippet that will produce 12 output files, one per month, given that you have defined 12 reducers for this job

Sample input data
10.1.255.266,hadoopexam.com,index.html,20/Aug/2013
10.1.255.2,hadoopexam.com,index.html,11/Feb/2013
10.1.255.233,hadoopexam.com,index.html,14/Jan/2013

1. 1
2. 2
3. Access Mostly Used Products by 50000+ Subscribers

Correct Answer : Get Latest Questions and Answer :

Explanation: MyPartitioner is the class that decides which reducer each record is sent to. Because there are 12 reducers, one per month,
the partitioner routes each record to the reducer for its month, and each reducer produces a corresponding output file.
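The hidden snippet is not reproduced here, but the core of such a partitioner is mapping the month abbreviation in the date field to a partition index. The sketch below is illustrative (the class and method names are not from the original snippet); inside a real Hadoop job, a custom `Partitioner`'s `getPartition(key, value, numReduceTasks)` would return this index:

```java
import java.util.Arrays;
import java.util.List;

// Sketch of the month-extraction logic a custom Partitioner (such as the
// MyPartitioner named in the explanation) could use. With 12 reducers,
// returning a distinct index per month sends each month's records to its
// own reducer, and therefore into its own output file.
public class MonthPartitionSketch {

    private static final List<String> MONTHS = Arrays.asList(
        "Jan", "Feb", "Mar", "Apr", "May", "Jun",
        "Jul", "Aug", "Sep", "Oct", "Nov", "Dec");

    // Input line format: ip,host,page,dd/MMM/yyyy
    public static int partitionFor(String line, int numReduceTasks) {
        String date = line.substring(line.lastIndexOf(',') + 1); // "20/Aug/2013"
        String month = date.split("/")[1];                       // "Aug"
        return MONTHS.indexOf(month) % numReduceTasks;           // 0..11
    }

    public static void main(String[] args) {
        System.out.println(partitionFor(
            "10.1.255.2,hadoopexam.com,index.html,11/Feb/2013", 12)); // Feb -> 1
    }
}
```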






Question :

From the code snippets given below, select the correct one that is able to create a compressed SequenceFile.

1. 1
2. 2
3. Access Mostly Used Products by 50000+ Subscribers

Correct Answer : Get Latest Questions and Answer :

Explanation: The correct code snippet uses SequenceFileOutputFormat as the output format and SnappyCodec for compression,
and it is a map-only job.

There is no need to call setInputFormatClass, because the input
file is a text file. However, the output file is a SequenceFile,
so we must call setOutputFormatClass. The snippet enables Snappy compression as well as block-level compression.
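A driver configured along the lines the explanation describes might look like the sketch below (it assumes a Hadoop classpath; the class name, paths, and key/value types are illustrative, while the format, codec, and compression-type calls are the standard Hadoop MapReduce API):

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.SequenceFile;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.io.compress.SnappyCodec;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.mapreduce.lib.output.SequenceFileOutputFormat;

public class CompressedSeqFileDriver {
    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "text-to-seqfile");
        job.setJarByClass(CompressedSeqFileDriver.class);

        // Input is plain text, so the default TextInputFormat applies;
        // no setInputFormatClass call is needed.
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));

        // Output is a SequenceFile, so the output format must be set explicitly.
        job.setOutputFormatClass(SequenceFileOutputFormat.class);
        job.setOutputKeyClass(LongWritable.class);
        job.setOutputValueClass(Text.class);

        // Snappy codec with block-level compression.
        FileOutputFormat.setCompressOutput(job, true);
        FileOutputFormat.setOutputCompressorClass(job, SnappyCodec.class);
        SequenceFileOutputFormat.setOutputCompressionType(job,
            SequenceFile.CompressionType.BLOCK);

        // Map-only job: identity Mapper, zero reducers.
        job.setMapperClass(Mapper.class);
        job.setNumReduceTasks(0);

        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```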





Question :

Select the correct code snippet that is able to read the compressed SequenceFile

1. 1
2. 2
3. Access Mostly Used Products by 50000+ Subscribers


Correct Answer : Get Latest Questions and Answer :

Explanation: We are using a SequenceFile as the input file. Therefore, we must call setInputFormatClass.
There is no need to call setOutputFormatClass, because the application writes a text file as output.
There is no need to set compression options for the input file; the compression implementation details are encoded within the
input SequenceFile itself.
This is a map-only job, so we do not call setReducerClass, and we set the number of reducers to 0.
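A reader-side driver matching that explanation could be sketched as follows (again assuming a Hadoop classpath; the class name and key/value types are illustrative and depend on what the SequenceFile actually contains):

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.input.SequenceFileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class ReadSeqFileDriver {
    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "seqfile-to-text");
        job.setJarByClass(ReadSeqFileDriver.class);

        // Input is a SequenceFile, so the input format must be set explicitly.
        // The codec used is recorded in the SequenceFile's own header, so no
        // decompression options need to be configured here.
        job.setInputFormatClass(SequenceFileInputFormat.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));

        // Output is plain text: the default TextOutputFormat applies,
        // so setOutputFormatClass is not called.
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        job.setOutputKeyClass(LongWritable.class);
        job.setOutputValueClass(Text.class);

        // Map-only job: identity Mapper, zero reducers.
        job.setMapperClass(Mapper.class);
        job.setNumReduceTasks(0);

        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```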



Related Questions


Question : During streaming job submission, you want to set the environment variable EXAMPLE_DIR=/home/example/dictionaries/. Which is the correct option?

1. -env EXAMPLE_DIR=/home/example/dictionaries/

2. -e EXAMPLE_DIR=/home/example/dictionaries/

3. Access Mostly Used Products by 50000+ Subscribers

4. -sysenv EXAMPLE_DIR=/home/example/dictionaries/

5. -cmdenv EXAMPLE_DIR=/home/example/dictionaries/


Question : Which daemons are considered master daemons?

1. NameNode
2. Secondary NameNode
3. Access Mostly Used Products by 50000+ Subscribers
4. 1,2 and 3 are correct
5. 1 and 3 are correct




Question : Which nodes are considered slave nodes?

1. Secondary NameNode
2. DataNode
3. Access Mostly Used Products by 50000+ Subscribers
4. 1,2 and 3 are correct
5. 2 and 3 are correct





Question : Which daemon stores the file data blocks?

1. NameNode
2. TaskTracker
3. Access Mostly Used Products by 50000+ Subscribers
4. Secondary Data Node




Question : When a client submits a job, its configuration information is packaged into an XML file.

1. True
2. False




Question : A TaskTracker cannot start multiple tasks on the same node.

1. True
2. False