
MapR (HP) Hadoop Developer Certification Questions and Answers (Dumps and Practice Questions)



Question : Which of the following statements are true regarding MCS (MapR Control System)?


1. MCS can be used for both managing and monitoring your MapR cluster

2. We have to do a one-time configuration of the metrics database

3. Access Mostly Used Products by 50000+ Subscribers

4. 1,2
5. 1,2,3


Correct Answer : Get Latest Questions and Answers :
Explanation: MapR Control System (MCS) gives Hadoop administrators a single place for configuring, monitoring, and managing their clusters.




Question : You are given a screenshot of MCS; select the correct statement.

1. Total time taken by the job is approx. 14 seconds

2. Time taken by map tasks is 7.9 seconds

3. Access Mostly Used Products by 50000+ Subscribers

4. 2,3
5. 1,2,3

Correct Answer : Get Latest Questions and Answers :
Explanation: Besides the map and reduce tasks, the Hadoop framework performs other activities (such as job setup, scheduling, and cleanup). Hence the total time for the job is not simply the sum of the mapper and reducer task times.





Question : Which of the following logs can be viewed from MCS?
A. Standard out generated from a task
B. Standard error generated from a task
C. Syslog log file entries generated by a task
D. Profile output generated by a task
E. Debug script output generated by a task
1. A,B,C,D
2. A,B,C,E
3. Access Mostly Used Products by 50000+ Subscribers
4. A,B,D,E
5. A,B,C,D,E

Correct Answer : Get Latest Questions and Answers :
Explanation:


Related Questions


Question : You have the following Perl script, which you want to use as a Streaming identity mapper.

#!/usr/bin/env perl

while (<>) {
    chomp;
    my ($key, $value) = split(/\t/, $_);
    print "Key:" . $key . XXXXX . "value:" . $value . YYYYY;
}

What would be the correct replacements for the separators?

1. XXXXX -> \n , YYYYY -> \n

2. XXXXX -> \t , YYYYY -> \t

3. Access Mostly Used Products by 50000+ Subscribers

4. XXXXX -> \n , YYYYY -> \t
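Without giving away the masked option, the convention this question tests is documented behavior: by default, Hadoop Streaming treats everything up to the first tab on a line as the key, the rest as the value, and terminates each record with a newline. A minimal Python sketch of the same identity mapper (the function name is illustrative, not from the question):

```python
import sys

def identity_map(line):
    """Echo one record in Hadoop Streaming's default wire format:
    key and value separated by a tab, each record ended by a newline."""
    key, _, value = line.rstrip("\n").partition("\t")
    return f"{key}\t{value}\n"

# In a real job the records arrive on sys.stdin; a list stands in here.
for record in ["apple\t1", "banana\t2"]:
    sys.stdout.write(identity_map(record))
```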



Question : Which of the following is/are correct ways to debug a streaming job?


1. Check that the mapper and reducer scripts can run on their own by feeding them input on standard input

2. We must test with bad data as well.

3. Access Mostly Used Products by 50000+ Subscribers

4. 1,2

5. 1,2,3
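The checks in options 1 and 2 can be done entirely off-cluster, because a Streaming script is just a program reading standard input. A sketch, assuming a hypothetical word-count mapper (not taken from the question):

```python
import io

def mapper(stdin, stdout):
    """Toy word-count mapper: emits one word<TAB>1 line per word,
    exactly as it would when run under Hadoop Streaming."""
    for line in stdin:
        for word in line.split():
            stdout.write(f"{word}\t1\n")

# Feed it well-formed input and bad (empty) input on standard in,
# as options 1 and 2 advise, before running on the cluster.
sample = io.StringIO("hello world\n\nhello\n")
out = io.StringIO()
mapper(sample, out)
print(out.getvalue(), end="")
```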



Question : How do you monitor streaming jobs using counters?


1. Update counters from within your map and reduce scripts, with the string "reporter:counter"

2. Update status from within your map and reduce scripts, with the string "reporter:status"

3. Access Mostly Used Products by 50000+ Subscribers

4. 1,2

5. 1,2,3
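Both mechanisms in options 1 and 2 work by writing specially formatted lines to standard error, which the Streaming framework parses (stdout is reserved for data). A sketch; the group and counter names are illustrative:

```python
import sys

def counter_update(group, counter, amount):
    """The line format Hadoop Streaming parses from stderr
    to increment a counter: reporter:counter:<group>,<counter>,<amount>"""
    return f"reporter:counter:{group},{counter},{amount}\n"

def status_update(message):
    """The line format Hadoop Streaming parses from stderr
    to set the task status: reporter:status:<message>"""
    return f"reporter:status:{message}\n"

# Inside a map or reduce script you would emit these on stderr:
sys.stderr.write(counter_update("MyJob", "BadRecords", 1))
sys.stderr.write(status_update("processing split 3"))
```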


Question : Which of the following is the correct flow for a MapReduce streaming job?

1. input -> PipeMap -> PipeReduce -> Output

2. input -> PipeMap -> Map -> PipeMap -> Reduce -> Output

3. Access Mostly Used Products by 50000+ Subscribers

4. input -> Map -> PipeMap -> Reduce -> PipeReduce -> Output


Question : In a Streaming MapReduce program's reducer, all the key-value pairs are sent at once
1. True
2. False
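The point behind this question: unlike the Java reducer API, a Streaming reducer is not handed all values for a key in one call. It reads sorted key<TAB>value lines one at a time and must detect key boundaries itself. A sum-reducer sketch illustrating the pattern:

```python
import io

def streaming_reducer(stdin, stdout):
    """Reads sorted key<TAB>value lines one at a time and emits a sum
    per key, flushing whenever the key changes."""
    current_key, total = None, 0
    for line in stdin:
        key, _, value = line.rstrip("\n").partition("\t")
        if key != current_key:
            if current_key is not None:
                stdout.write(f"{current_key}\t{total}\n")
            current_key, total = key, 0
        total += int(value)
    if current_key is not None:
        stdout.write(f"{current_key}\t{total}\n")

# The framework guarantees the reducer's input is sorted by key.
out = io.StringIO()
streaming_reducer(io.StringIO("a\t1\na\t2\nb\t5\n"), out)
print(out.getvalue(), end="")
```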


Question : In MapReduce v1, select the correct order of the steps of job submission.

A. Instantiation of JobClient object
B. Submitting job to JobTracker by JobClient
C. Job Tracker instantiates a job object
D. Task Tracker launches a task, which in turn can run map or reduce task
E. Tasks updates the task tracker with status and counters
1. B,A,C,E,D
2. A,B,D,E,C
3. Access Mostly Used Products by 50000+ Subscribers
4. A,D,E,C,B
5. A,B,C,D,E