Question: Which of the following are correct triggering events for AWS Lambda?
A. A change in an AWS S3 bucket
B. An AWS Kinesis stream can publish messages to a Lambda function
C. AWS CloudTrail will publish the event directly to the Lambda function
D. AWS CloudTrail logs an event to S3, and the S3 bucket notifies Lambda
1. A, B
2. B, C
3. C, D
4. A, D
5. B, D
Correct Answer: 4
Explanation: AWS Lambda can be triggered in several ways:
- You can create an event in your application that triggers Lambda.
- S3 publishes events that can trigger a Lambda function.
- Kinesis does not push events to Lambda; Lambda polls the Kinesis stream and processes the records it retrieves.
- CloudTrail cannot publish events directly to Lambda; it writes log files to an S3 bucket, and an S3 event notification can then trigger the Lambda function.
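As an illustration of the S3-to-Lambda path described above, the notification configuration can be sketched as a boto3-style data structure. The bucket name and function ARN below are hypothetical placeholders, not values from the question:

```python
# Sketch: configure an S3 bucket to invoke a Lambda function whenever an
# object is created. ARN and bucket name are hypothetical placeholders.
notification_config = {
    "LambdaFunctionConfigurations": [
        {
            "Id": "NotifyOnUpload",
            "LambdaFunctionArn": "arn:aws:lambda:us-east-1:123456789012:function:ProcessUpload",
            "Events": ["s3:ObjectCreated:*"],
        }
    ]
}

# With boto3 this would be applied as follows (the Lambda function must
# first grant s3.amazonaws.com permission to invoke it):
# import boto3
# s3 = boto3.client("s3")
# s3.put_bucket_notification_configuration(
#     Bucket="example-uploads-bucket",
#     NotificationConfiguration=notification_config,
# )
```

Note that the notification is configured on the bucket side (push model), whereas for Kinesis the event source mapping lives on the Lambda side (poll model), which is exactly why option B is wrong.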
Question: You have a website with the domain name QuickTechie.com, where students take an online exam; a score card and a certificate are issued after the exam is successfully completed. Score cards and certificates are kept in two separate storage structures. After one year, score cards should be deleted and certificates should be moved to RRS storage. Which of the below functionalities can help implement this requirement?
1. You should use S3 Lifecycle management
2. You will write an AWS Lambda function that checks the age of each object once a week and takes action accordingly.
3. You will have to set up a bucket policy specifically for these two different storage locations, one for score cards and another for certificates.
4. You will use Simple Workflow Service, where one activity checks the age of each certificate and moves it to RRS, and another activity checks each score card's age and deletes objects older than one year, with the workflow scheduled to run once a week.
Correct Answer: 1
Explanation: Amazon automates many routine tasks. S3 Lifecycle management lets you define what should happen when an object reaches a defined age: you can delete the object or transition it to a lower-cost storage class such as RRS or Glacier. This makes it the most efficient solution here; the other options could also implement the requirement, but far less efficiently.
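A lifecycle configuration for this requirement might be sketched as below. The bucket name and the `scorecards/` and `certificates/` prefixes are assumptions for illustration; note also that current S3 lifecycle transitions target classes like STANDARD_IA or GLACIER rather than RRS, so GLACIER is used here as the low-cost target:

```python
# Sketch of an S3 lifecycle configuration: delete score cards after one
# year, transition certificates to cheaper storage after one year.
# Prefixes and bucket name are hypothetical assumptions.
lifecycle_config = {
    "Rules": [
        {
            "ID": "DeleteScoreCardsAfterOneYear",
            "Filter": {"Prefix": "scorecards/"},
            "Status": "Enabled",
            "Expiration": {"Days": 365},
        },
        {
            "ID": "ArchiveCertificatesAfterOneYear",
            "Filter": {"Prefix": "certificates/"},
            "Status": "Enabled",
            # GLACIER stands in for the question's RRS target, since
            # lifecycle transitions do not support REDUCED_REDUNDANCY.
            "Transitions": [{"Days": 365, "StorageClass": "GLACIER"}],
        },
    ]
}

# Applied via boto3:
# import boto3
# boto3.client("s3").put_bucket_lifecycle_configuration(
#     Bucket="example-results-bucket",
#     LifecycleConfiguration=lifecycle_config,
# )
```

Because the two object types live under different prefixes, a single configuration with two rules covers both behaviors, with no scheduled code to maintain.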
Question: You have developed custom fonts for websites that want to protect their content from copy-paste and easy scraping. You have hosted these fonts in a protected S3 bucket, so only paid subscribers can use the fonts. Any website that uses these fonts will have access to the S3 bucket. What else does your client have to do?
1. Your client's website will download the fonts to their own web server every day in order to use these fonts on the website
2. Your client's website will have to use cookie-enabled sessions in order for these fonts to be used
3. Your client would have an S3 bucket that has a connection with the original S3 bucket hosting the website fonts.
4. You have to enable cross-origin resource sharing (CORS) on the bucket that is hosting these custom paid fonts
Correct Answer: 4
Explanation: All options other than CORS (Cross-Origin Resource Sharing) are irrelevant and not even logical. You have to use CORS in scenarios such as the following:
1. Fonts are stored in an S3 bucket and used by other websites; before loading those fonts, the browser performs a CORS check (a preflight request). You would configure the bucket hosting the web fonts to allow any origin to make these requests.
2. Suppose you are hosting a static website called Acmeshell, loaded as http://acmeshell.s3-website-us-east-1.amazonaws.com, and you want JavaScript stored in the bucket to make authenticated GET and PUT requests against the same bucket. Browsers normally block such cross-origin requests made from JavaScript, so you need to enable CORS on the bucket so that the browser will execute them.
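The font-serving scenario above can be sketched as a CORS configuration in boto3 form; the bucket name is a hypothetical placeholder, and a real deployment might restrict AllowedOrigins to paying subscribers' domains instead of `*`:

```python
# Sketch: CORS rules letting any origin perform GET requests (including
# the browser's preflight check) against the font bucket.
cors_config = {
    "CORSRules": [
        {
            "AllowedOrigins": ["*"],   # or a list of subscriber domains
            "AllowedMethods": ["GET"],
            "AllowedHeaders": ["*"],
            "MaxAgeSeconds": 3000,     # let browsers cache the preflight
        }
    ]
}

# Applied via boto3 (bucket name is a placeholder):
# import boto3
# boto3.client("s3").put_bucket_cors(
#     Bucket="example-paid-fonts",
#     CORSConfiguration=cors_config,
# )
```

For the second scenario (authenticated GET and PUT from in-bucket JavaScript), the same structure applies, but AllowedMethods would include "PUT" and AllowedOrigins would name the website's own origin.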
1. You will create an IAM user for each new subscriber on HadoopExam.com and grant access to those videos.
2. You will create an IAM group that has permission on these videos, and you will keep adding new users to that group.
3. You will create an IAM role and attach it to the EC2 instance where www.HadoopExam.com is hosted. This role should have permission on the videos stored in S3.
4. No, you don't have to do anything with IAM. Just manage membership on your website.