Question : You are working with the HadoopExam training videos website, where end users keep uploading videos to be transcoded. The HadoopExam video transcoding website uses Amazon EC2, Amazon S3, and Amazon SimpleDB together. Which of the following AWS components can be used to integrate all of these components and create an automated workflow, so that multiple components or modules can communicate with each other, even though not all components can process the same amount of work simultaneously?
1. There is no need for any other component, because Amazon EC2, Amazon S3, and Amazon SimpleDB can communicate with each other. 2. You need to implement Inter-Process Communication (IPC). 3. You have to create a VPC and put all three components in it so that they can communicate with each other. 4. You can use the Amazon SQS component.
Correct Answer : 4
Explanation: Amazon SQS can be used with Amazon EC2, as well as Amazon S3 and Amazon SimpleDB, to make applications more flexible and scalable. A common use case is to create an integrated and automated workflow, where multiple components or modules need to communicate with each other but cannot all process the same amount of work simultaneously. In this case, Amazon SQS queues carry messages to be processed in an orderly fashion by the user's application running on Amazon EC2 instances. The Amazon EC2 instances can read the queue, process the job, and then post the results as messages to another Amazon SQS queue (possibly for further processing by another application). Because Amazon EC2 allows applications to scale up and down dynamically, application developers can easily vary the number of compute instances based on the amount of work in the SQS queues, to ensure that jobs are executed in a timely manner.
For example, here is how a video transcoding website uses Amazon EC2, Amazon SQS, Amazon S3, and Amazon SimpleDB together. End users submit videos to be transcoded to the website. The videos are stored in Amazon S3, and a message (the request message) is placed in an Amazon SQS queue (the incoming queue) with a pointer to the video and to the target video format in the message. The transcoding engine, running on a set of Amazon EC2 instances, reads the request message from the incoming queue, retrieves the video from Amazon S3 using the pointer, and transcodes the video into the target format. The converted video is put back into Amazon S3, and another message (the response message) is placed in another Amazon SQS queue (the outgoing queue) with a pointer to the converted video. At the same time, metadata about the video (e.g. format, date created, and length) can be indexed into Amazon SimpleDB for easy querying. During this whole workflow, a dedicated Amazon EC2 instance can constantly monitor the incoming queue and, based on the number of messages in the incoming queue, dynamically adjust the number of transcoding Amazon EC2 instances to meet customers' response time requirements.
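The incoming-queue/outgoing-queue flow above can be sketched with a minimal in-memory stand-in (plain Python queues instead of real SQS; the S3 pointers, message shape, and "transcoding" step are hypothetical placeholders):

```python
import queue

# In-memory stand-ins for the two SQS queues described above.
incoming_queue = queue.Queue()   # request messages: pointer to source video + target format
outgoing_queue = queue.Queue()   # response messages: pointer to the converted video

def submit_video(s3_pointer, target_format):
    """Website side: place a request message on the incoming queue."""
    incoming_queue.put({"video": s3_pointer, "format": target_format})

def transcoding_worker():
    """One EC2 worker: read a request, 'transcode', post the result message."""
    msg = incoming_queue.get()
    # Stand-in for fetching from S3 and actually transcoding the video.
    converted_pointer = f"{msg['video']}.{msg['format']}"
    outgoing_queue.put({"converted_video": converted_pointer})

submit_video("s3://videos/raw/clip-001", "mp4")
transcoding_worker()
print(outgoing_queue.get()["converted_video"])  # s3://videos/raw/clip-001.mp4
```

In the real workflow, the message carries only a pointer to the video in S3, not the video itself, which keeps queue messages small and lets any worker instance pick up any job.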
Question : You are developing a service to continuously post the stock market price by reading from an SQS queue. Which of the following SQS interfaces can you use in your implementation?
1. Standards-based SOAP web services 2. Query web services 3. You have to write custom code which implements the SQS interfaces 4. 1 and 2 5. 1, 2 and 3
Correct Answer : 4
Explanation:
Amazon SQS provides simple, standards-based SOAP and Query web services interfaces that are designed to work with any Internet-development toolkit. The operations are intentionally kept simple for working with messages and queues.
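As an illustration of the Query interface, an SQS action is expressed as HTTP request parameters against the queue URL. A minimal sketch of building such a request (the queue URL is hypothetical, and a real request must additionally be signed with AWS Signature Version 4):

```python
from urllib.parse import urlencode

# Hypothetical queue URL; real requests must also carry SigV4 authentication.
queue_url = "https://sqs.us-east-1.amazonaws.com/123456789012/stock-prices"

params = {
    "Action": "SendMessage",      # the Query API operation to invoke
    "MessageBody": "AMZN:182.41", # the message payload (example stock tick)
    "Version": "2012-11-05",      # SQS Query API version
}
request_url = f"{queue_url}?{urlencode(params)}"
print(request_url)
```

In practice you would rarely build these URLs by hand; an SDK does the parameter encoding and signing for you, which is why the "write custom code" option is unnecessary.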
Question : You are processing a message, and the maximum number of processing attempts has been reached. You still want to process the same message afterwards. What is the correct choice so that you can process this message again?
1. Persist the message in your application 2. Persist the message in a database 3. Not possible once the message has reached the maximum number of attempts 4. Set up a DLQ
Correct Answer : 4
Explanation: A DLQ (dead-letter queue) is an SQS queue which you configure to receive messages from other SQS queues, referred to as source queues. Typically, you set up a DLQ to receive messages after a maximum number of processing attempts has been reached. A DLQ provides the ability to isolate messages that could not be processed, for out-of-band analysis.
A DLQ is just like any other SQS queue: messages can be sent to it and received from it like any other SQS queue. You can create a DLQ from the SQS API or the SQS console.
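A DLQ is attached to a source queue through its RedrivePolicy attribute, a JSON document naming the DLQ's ARN and the maximum receive count. A minimal sketch of building that attribute (the ARN is hypothetical; a real setup would pass this JSON to the SetQueueAttributes API, e.g. via boto3's `set_queue_attributes`):

```python
import json

# Hypothetical DLQ ARN for illustration.
dlq_arn = "arn:aws:sqs:us-east-1:123456789012:transcode-dlq"

# After maxReceiveCount failed receives, SQS moves the message to the DLQ
# instead of making it visible on the source queue again.
redrive_policy = json.dumps({
    "deadLetterTargetArn": dlq_arn,
    "maxReceiveCount": "5",
})
print(redrive_policy)
```

Once the failed messages land in the DLQ, you can inspect them, fix the underlying problem, and then consume (or redrive) them for reprocessing, which is why answer 4 is correct.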
1. The AmazonSQSBufferedAsync client supports automatic batching of multiple SendMessage and DeleteMessage calls 2. The AmazonSQSBufferedAsync client supports prefetching of messages into a local buffer 3. The AmazonSQSBufferedAsync client supports prefetching of messages into a file in your application (so it is durable) 4. 1 and 2 5. 1 and 3
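The batching behavior these options describe can be illustrated with a hypothetical in-memory sketch (not the real AmazonSQSBufferedAsync client, which is a Java SDK class): individual send calls accumulate in a local buffer until the SQS batch limit of 10 messages is reached, and the whole batch is then flushed in a single SendMessageBatch-style request.

```python
BATCH_SIZE = 10  # SQS allows at most 10 messages per batch request

class BufferedSender:
    """Toy model of automatic send batching with a local (in-memory) buffer."""
    def __init__(self):
        self.buffer = []
        self.batches_sent = []  # each entry stands for one batch request

    def send_message(self, body):
        self.buffer.append(body)
        if len(self.buffer) >= BATCH_SIZE:
            self.flush()

    def flush(self):
        if self.buffer:
            self.batches_sent.append(list(self.buffer))  # one batched request
            self.buffer.clear()

sender = BufferedSender()
for i in range(25):
    sender.send_message(f"msg-{i}")
sender.flush()  # flush the final partial batch
print([len(b) for b in sender.batches_sent])  # [10, 10, 5]
```

Note the buffer here is in memory, not a file: that is why option 3 (a durable file-based prefetch buffer) is wrong, and the correct answer is 4 (1 and 2).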