Question: You are running a web application on AWS consisting of the following components: an Elastic Load Balancer (ELB), an Auto Scaling group of EC2 instances running Linux/PHP/Apache, and a Relational Database Service (RDS) MySQL instance. Which security measures fall under AWS's responsibility?
Explanation: Network security - the AWS network provides significant protection against traditional network security issues, and developers can implement further protection on top of it. Network security features provided by AWS include Distributed Denial of Service (DDoS) mitigation, prohibition of IP spoofing, prohibition of unauthorized port scanning, prevention of packet sniffing, and SSL protection of all API endpoints. Please refer to "AWS - Overview of Security Processes" for further details. AWS is responsible for protecting the global infrastructure that runs all of the services offered in the AWS cloud. This infrastructure comprises the hardware, software, networking, and facilities that run AWS services. For IaaS services like Amazon EC2 and Amazon S3, you have more control and therefore more configuration work to do. For EC2 instances, you are responsible for patching the guest OS and any software you install on it, configuring the security groups (firewalls) that allow outside access to your instances, and setting up any VPC subnets the instances reside within. For Amazon S3, you must set the access control policies for each of your storage buckets, set up encryption options for the stored data, and specify backup and archiving preferences. IP spoofing: Amazon EC2 instances cannot send spoofed network traffic. The AWS-controlled, host-based firewall infrastructure will not permit an instance to send traffic with a source IP or MAC address other than its own.
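To make the customer side of this split concrete, the security-group configuration mentioned above can be sketched as a boto3-style request. This is only an illustration: the group ID, ports, and CIDR range below are invented placeholders, not values from the question.

```python
# Sketch: the customer (not AWS) defines the security group rules that
# front the EC2 web tier. All identifiers and ranges are hypothetical.
ingress_rules = {
    "GroupId": "sg-0123456789abcdef0",  # placeholder security group id
    "IpPermissions": [
        {   # allow HTTP traffic to the Apache web tier
            "IpProtocol": "tcp", "FromPort": 80, "ToPort": 80,
            "IpRanges": [{"CidrIp": "0.0.0.0/0"}],
        },
        {   # allow MySQL (RDS) access only from the web tier's own group
            "IpProtocol": "tcp", "FromPort": 3306, "ToPort": 3306,
            "UserIdGroupPairs": [{"GroupId": "sg-0123456789abcdef0"}],
        },
    ],
}
# With boto3 and valid credentials, this dict would be passed to:
#   boto3.client("ec2").authorize_security_group_ingress(**ingress_rules)
```

Patching the guest OS and maintaining rules like these is the customer's job; everything below the hypervisor and the network fabric is AWS's.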
Question: You use S3 to store critical data for your company. Several users within your group currently have full permissions to your S3 buckets. You need to come up with a solution that does not impact your users and also protects against the accidental deletion of objects. Which two options will address this issue? Choose 2 answers:
A. Enable versioning on your S3 buckets
B. Configure your S3 buckets with MFA Delete
C. Create a bucket policy and only allow read-only permissions to all users at the bucket level
D. Enable object lifecycle policies and configure data older than 3 months to be archived in Glacier
1. A,C
2. C,D
3.
4. A,D
5. A,B
Correct Answer: 5 (A, B)
Explanation: Versioning allows you to preserve, retrieve, and restore every version of every object stored in an Amazon S3 bucket. Once you enable versioning for a bucket, Amazon S3 preserves existing objects any time you perform a PUT, POST, COPY, or DELETE operation on them. By default, GET requests retrieve the most recently written version. Older versions of an overwritten or deleted object can be retrieved by specifying a version in the request.
Amazon S3 provides customers with a highly durable storage infrastructure. Versioning offers an additional level of protection by providing a means of recovery when customers accidentally overwrite or delete objects. This allows you to easily recover from unintended user actions and application failures. You can also use Versioning for data retention and archiving.
Versioning's MFA Delete capability, which uses multi-factor authentication, can be used to provide an additional layer of security. By default, all requests to your Amazon S3 bucket require your AWS account credentials. If you enable versioning with MFA Delete on your Amazon S3 bucket, two forms of authentication are required to permanently delete a version of an object: your AWS account credentials and a valid six-digit code and serial number from an authentication device in your physical possession. To learn more about enabling versioning with MFA Delete, including how to purchase and activate an authentication device, refer to the Amazon S3 documentation.
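Both protections in the correct answer (versioning plus MFA Delete) map onto a single S3 API call. A minimal sketch follows; the bucket name and the MFA device serial/code are invented placeholders, and the real call requires the bucket owner's root credentials.

```python
# Sketch: enable versioning together with MFA Delete on a bucket.
# Bucket name and MFA serial/token below are hypothetical placeholders.
versioning_request = {
    "Bucket": "example-critical-data-bucket",  # placeholder bucket name
    # "MFA" is the device serial followed by the current six-digit code:
    "MFA": "arn:aws:iam::111122223333:mfa/root-account-mfa-device 123456",
    "VersioningConfiguration": {
        "Status": "Enabled",     # preserve every version of every object
        "MFADelete": "Enabled",  # require MFA to permanently delete a version
    },
}
# With boto3 and root credentials, the request would be sent with:
#   boto3.client("s3").put_bucket_versioning(**versioning_request)
```

Because users keep their full permissions and only permanent version deletion gains an extra hurdle, this combination satisfies the "does not impact your users" constraint that rules out options C and D.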
Question: An organization's security policy requires multiple copies of all critical data to be replicated across at least a primary and a backup data center. The organization has decided to store some critical data on Amazon S3. Which option should you implement to ensure this requirement is met?
1. Use the S3 copy API to replicate data between two S3 buckets in different regions
2. You do not need to implement anything, since S3 data is automatically replicated between regions
3.
4. You do not need to implement anything, since S3 data is automatically replicated between multiple facilities within an AWS Region
Explanation: Although, by default, Amazon S3 stores your data across multiple geographically separated Availability Zones, compliance requirements might dictate that you store data at even greater distances. Cross-region replication allows you to replicate data between distant AWS Regions to satisfy these compliance requirements.