
MapR (HP) Hadoop Developer Certification Questions and Answers (Dumps and Practice Questions)



Question : How can you use binary data in MapReduce?
1. Binary data cannot be used by Hadoop framework.
2. Binary data can be used directly by a map-reduce job. Often binary data is added to a sequence file
3.
4. Hadoop can freely use binary files with map-reduce jobs so long as the files have headers

Correct Answer : 2

Binary data can be packaged in sequence files. A Hadoop cluster does not work well with large numbers of small files, so small binary files should be combined into bigger sequence files.
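As an illustration, here is a minimal sketch (not part of the exam material) of packing small binary files into a SequenceFile, using each file name as the key and its raw bytes as a BytesWritable value; the output path and class name are made up for the example.

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.BytesWritable;
import org.apache.hadoop.io.SequenceFile;
import org.apache.hadoop.io.Text;

public class BinaryToSequenceFile {
    public static void main(String[] args) throws IOException {
        Configuration conf = new Configuration();
        // Illustrative output location for the packed SequenceFile.
        Path output = new Path("/data/packed/binary-blobs.seq");

        try (SequenceFile.Writer writer = SequenceFile.createWriter(conf,
                SequenceFile.Writer.file(output),
                SequenceFile.Writer.keyClass(Text.class),
                SequenceFile.Writer.valueClass(BytesWritable.class))) {
            // Append each small local binary file as one record: name -> raw bytes.
            for (String name : args) {
                byte[] bytes = Files.readAllBytes(Paths.get(name));
                writer.append(new Text(name), new BytesWritable(bytes));
            }
        }
    }
}

A MapReduce job can then read these records directly with SequenceFileInputFormat instead of opening thousands of tiny files.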






Question : Which of the following is a Hadoop daemon process (MRv1)?

1. JobTracker
2. TaskTracker
3.
4. DataNode
5. All of the above

Correct Answer :





Question : Which statement is true about Apache Hadoop?


1. HDFS performs best with a modest number of large files
2. No random writes are allowed to a file
3.
4. All of the above

Correct Answer :
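To make the "no random writes" point concrete, the sketch below (illustrative path, not exam material) writes an HDFS file through the FileSystem API: bytes can only be streamed to the end of a file, or appended later where the cluster allows it; there is no call that seeks into an existing file and overwrites bytes in place.

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsSequentialWrite {
    public static void main(String[] args) throws IOException {
        FileSystem fs = FileSystem.get(new Configuration());
        Path file = new Path("/tmp/hadoopexam/demo.txt"); // illustrative path

        // Create and write sequentially; the stream only moves forward.
        try (FSDataOutputStream out = fs.create(file)) {
            out.writeBytes("first line\n");
        }

        // Appending adds bytes at the end (if append is enabled on the cluster);
        // rewriting bytes at an arbitrary offset is not supported by HDFS.
        try (FSDataOutputStream out = fs.append(file)) {
            out.writeBytes("second line\n");
        }
    }
}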



Related Questions


Question : Which of the following methods of the Mapper class is/are called?


1. setup()

2. map()

3. cleanup()

4. 1,2

5. 1,2,3


Question : Match the following
A. setup()
B. map()
C. cleanup()

1. once for each record
2. once for each Mapper/split
3.

1. A-1, B-2, C-3
2. A-3, B-1, C-2
3.
4. A-2, B-3, C-1
5. A-3, B-2, C-1
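For the two Mapper lifecycle questions above, here is a hedged sketch (illustrative class and counter names) of a Mapper overriding all three methods; the comments record how often the framework calls each one: setup() and cleanup() once per map task (one task per input split), map() once per input record.

import java.io.IOException;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

public class LifecycleMapper extends Mapper<LongWritable, Text, Text, IntWritable> {

    private final IntWritable one = new IntWritable(1);
    private long records = 0;

    @Override
    protected void setup(Context context) {
        // Called once per map task (i.e. once per input split), before any records.
        records = 0;
    }

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        // Called once for every input record in the split.
        records++;
        context.write(new Text(value.toString().trim()), one);
    }

    @Override
    protected void cleanup(Context context) {
        // Called once per map task, after the last record has been processed.
        context.getCounter("lifecycle", "records.seen").increment(records);
    }
}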


Question : You have written a MapReduce job. You open a connection to HBase and read data from it. Which is the right place to close the HBase connection?


1. In the setup() method of a Mapper class

2. At the end of map() method of a Mapper class

3. In the cleanup() method of a Mapper class

4. 2 and 3 both are correct
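The usual pattern, sketched below with the HBase 1.x+ client API and an illustrative table name, is to open the connection once in setup() and close it in cleanup(), so the relatively expensive connection is neither re-created nor torn down for every record.

import java.io.IOException;

import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.Get;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.util.Bytes;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

public class HBaseLookupMapper extends Mapper<LongWritable, Text, Text, Text> {

    private Connection connection;
    private Table table;

    @Override
    protected void setup(Context context) throws IOException {
        // Open the (expensive) connection once per map task.
        connection = ConnectionFactory.createConnection(HBaseConfiguration.create());
        table = connection.getTable(TableName.valueOf("exam_lookup")); // illustrative table
    }

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        // Reuse the open table handle for every record.
        Result row = table.get(new Get(Bytes.toBytes(value.toString())));
        if (!row.isEmpty()) {
            context.write(value, new Text(Bytes.toString(row.value())));
        }
    }

    @Override
    protected void cleanup(Context context) throws IOException {
        // Close once, after the last record has been processed, not inside map().
        if (table != null) table.close();
        if (connection != null) connection.close();
    }
}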



Question : You have defined a Mapper class as below:
public class HadoopExamMapper extends Mapper {
    public void map(XXXXX key, YYYYY value, Context context)
}
What is the correct replacement for XXXXX and YYYYY?



1. LongWritable, Text

2. LongWritable, IntWritable

3.

4. IntWritable, Text
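The page appears to have dropped the Mapper's generic parameters (the angle-bracket part of the declaration), which is what actually fixes XXXXX and YYYYY. Assuming the common TextInputFormat case, where the declaration would read Mapper<LongWritable, Text, Text, IntWritable>, the sketch below shows how the map() parameters follow from the first two generic arguments; the generic arguments here are an assumption, not taken from the original question.

import java.io.IOException;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

// With TextInputFormat, the framework feeds the mapper a byte offset (LongWritable)
// as the key and the line contents (Text) as the value, so map() must declare
// those same two input types.
public class HadoopExamMapper extends Mapper<LongWritable, Text, Text, IntWritable> {

    @Override
    public void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        // key   = byte offset of the line within the split
        // value = the line itself
        context.write(new Text(value.toString()), new IntWritable(1));
    }
}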



Question : Which of the following is a correct statement regarding the input key and value types for the Reducer class?


1. The input key and value types of the Reducer must match the output key and value types of the defined Mapper class

2. The output key class and output value class in the Reducer must match those defined in the job configuration

3.

4. 1,3

5. 1,2
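Statement 2 refers to the driver-side job configuration. Below is a hedged driver sketch (reusing the HadoopExamMapper and HadoopExamReducer names from this page, with Text/IntWritable types assumed for the example) showing where those key/value classes are declared.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class HadoopExamDriver {
    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "hadoopexam-job");
        job.setJarByClass(HadoopExamDriver.class);

        job.setMapperClass(HadoopExamMapper.class);   // emits <Text, IntWritable>
        job.setReducerClass(HadoopExamReducer.class); // consumes <Text, IntWritable>

        // Map output types: must match the Reducer's input key/value types.
        job.setMapOutputKeyClass(Text.class);
        job.setMapOutputValueClass(IntWritable.class);

        // Final output types: must match what the Reducer writes to the context.
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);

        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}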



Question : You have the following Reducer class defined:
public class HadoopExamReducer extends Reducer {
    public void reduce(XXXXX key, YYYYY value, Context context) ...
}
What is the correct replacement for XXXXX and YYYYY?

1. Text, Iterable

2. Text, IntWritable

3.

4. IntWritable, List
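For reference, here is a hedged sketch of a matching Reducer, assuming generic parameters of <Text, IntWritable, Text, IntWritable> (the page seems to have stripped them): reduce() then takes the key type (Text) and an Iterable of the value type (IntWritable).

import java.io.IOException;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Reducer;

public class HadoopExamReducer extends Reducer<Text, IntWritable, Text, IntWritable> {

    @Override
    public void reduce(Text key, Iterable<IntWritable> values, Context context)
            throws IOException, InterruptedException {
        // All values the mappers emitted for this key arrive grouped in one Iterable.
        int sum = 0;
        for (IntWritable value : values) {
            sum += value.get();
        }
        context.write(key, new IntWritable(sum));
    }
}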