
AWS Certified Solutions Architect – Associate Questions and Answers (Dumps and Practice Questions)



Question : You have created an application for securities trading, and you are using Aurora MySQL. Until now you have been running your application and database without data encryption. After an audit by the security
team, it is mandated that data in transit as well as at rest must be encrypted. Which of the following are correct for the given scenario?
A. You will enable encryption for new data
B. You will encrypt all the existing data
C. You cannot enable encryption for Amazon Aurora DB
D. You cannot enable encryption for already existing data in Aurora DB


1. A,B
2. B,C
3. C,D
4. A,D
5. B,D

Correct Answer : 4
Explanation: Amazon Aurora can encrypt data in transit as well as at rest. It supports SSL, and data at rest is encrypted with AES-256, so the connection between the application and the database instance can be encrypted. If encryption is
enabled, then data in transit, as well as all backups, snapshots, and primary storage, is encrypted, and the entire encryption and decryption process is handled seamlessly.

However, if you already have existing unencrypted data in Aurora, enabling encryption on that data in place is not supported. You have to create a new DB instance with encryption enabled and then migrate the
entire unencrypted dataset to the new DB instance.




Question : You already have an application on MySQL which was running in your own datacenter and has grown to the TB range. You have decided to move to AWS, and your requirements are below:
- Initial capacity of the DB should be 7 TB
- The database will grow by approx. 7-9 GB each day, and this growth needs to be supported
- Higher read performance is required, hence you need 7 read replicas

Which of the following databases or storage services is a good fit for the given requirements?



1. Amazon Redshift

2. Amazon Oracle RDS

3. DynamoDB

4. ElastiCache

5. Amazon AuroraDB



Correct Answer : 5
Explanation: Yes, Aurora will meet the given requirements, provided you have already tested that your existing scripts work on it as-is, or have made whatever modifications are required. Aurora
storage auto-scales in 10 GB segments, hence you don't have to provision capacity in advance.

Oracle RDS is not a good option: migrating a MySQL DB to Oracle means changing the database engine, whereas you would prefer the same engine, either MySQL RDS or Aurora. Aurora also has many features that are not available in MySQL RDS, such as storage auto scaling.

Redshift: It is a data warehouse solution and supports petabytes of data, but it does not support read replicas and does not scale automatically.

DynamoDB: Again, it is not an RDBMS; it is a NoSQL DB. Hence, it is not logical to consider it.

ElastiCache: It is good for caching data, not for persisting terabytes of data.
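The 10 GB auto-scaling increment mentioned above can be illustrated with a quick back-of-the-envelope sketch (the increment size and growth figures are taken from the question text; the helper function is purely illustrative, not an AWS API):

```python
import math

AURORA_SEGMENT_GB = 10  # Aurora storage grows in 10 GB increments (per the text above)

def segments_needed(data_gb: float) -> int:
    """Number of 10 GB segments needed to hold a given data size."""
    return math.ceil(data_gb / AURORA_SEGMENT_GB)

initial_gb = 7 * 1024        # 7 TB initial capacity, expressed in GB
daily_growth_gb = 8          # midpoint of the stated 7-9 GB/day growth

after_30_days = initial_gb + 30 * daily_growth_gb
print(segments_needed(initial_gb))     # 717 segments on day 0
print(segments_needed(after_30_days))  # 741 segments after one month
```

The point of the sketch is simply that with auto-scaling storage you never pre-provision these segments yourself; Aurora allocates them as the data grows.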




Question : You have developed a website, QuickTechie.com, where users can work through online certification preparation material. All the questions and answers, as well as user profile data, are stored in an
RDBMS, and millions of users can appear for an exam at a time, which requires almost 16,000 IOPS from the database. You have created your own database and installed it on an EBS volume. Which of the EBS
storage types below would you select for the given requirement?

1. EBS Provisioned IOPS SSD

2. EBS Magnetic storage

3. EBS Throughput optimized HDD

4. EBS General purpose SSD

5. EBS HDD

Correct Answer : 1
Explanation: What is the requirement? We need a single, high-performing EBS volume that can support up to 16,000 IOPS.

Option 3 : It is about throughput. We don't need high throughput; we are concerned about IOPS. Hence, eliminate this option.
Option 5 : HDD is not good for the given IOPS requirement. Hence, eliminate this as well.
Option 2 : Magnetic storage is also not a solution for high IOPS. Hence, eliminate this option.
Options 1 and 4 : Now we need to select the correct option from 1 and 4.
SSD volumes are good for transactional workloads; the difference is the per-volume IOPS ceiling.
General Purpose SSD: max IOPS per volume = 10,000 (at the time this question was written).
Provisioned IOPS SSD: max IOPS per volume = 32,000.

Since 16,000 IOPS exceeds the General Purpose SSD ceiling, option 1 is correct.
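The elimination logic above can be sketched as a small selector. The per-volume IOPS ceilings are the ones quoted in the explanation (these limits have since changed across EBS generations, so treat them as illustrative):

```python
# Max IOPS per volume, as quoted in the explanation above (dated figures).
EBS_MAX_IOPS = {
    "gp2 (General Purpose SSD)": 10_000,
    "io1 (Provisioned IOPS SSD)": 32_000,
    "st1 (Throughput Optimized HDD)": 500,
    "standard (Magnetic)": 200,
}

def pick_volume_type(required_iops: int) -> str:
    """Return the least capable volume type that still meets the IOPS requirement."""
    # Walk the types in ascending order of IOPS capability.
    for name, max_iops in sorted(EBS_MAX_IOPS.items(), key=lambda kv: kv[1]):
        if required_iops <= max_iops:
            return name
    raise ValueError("No single EBS volume can deliver the required IOPS")

print(pick_volume_type(16_000))  # io1 (Provisioned IOPS SSD)
```

With the 16,000 IOPS requirement from the question, only Provisioned IOPS SSD clears the bar, matching option 1.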



Related Questions


Question : You are working with a healthcare IT organization which maintains the health records of many US patients. You have two applications: one creates health records and stores them in an Amazon S3
bucket. These health records cannot be exposed to the public and need to be protected. The other application is a web application hosted on an EC2 instance that needs to read those sensitive documents; whenever users log in
to the website they can access and view their health records, and even their family doctors can view those documents. However, the security audit team has advised that you should not access these documents over the public
network. What is the best solution for this problem?


1. You will create a custom VPC, attach an internet gateway to it, and access the S3 bucket through that gateway.

2. You will use VPC peering.

3. You will install a storage gateway to access the data in S3 over the private network.

4. You will create a VPN connection so that data can be accessed over the VPN tunnel.

5. You will use a VPC endpoint to access the data from Amazon S3.


Question : You have a monthly batch job which analyzes the millions of files accumulated over the month, containing various patient health details, and recommends to each patient what he needs to do. For this you have
written a good amount of MapReduce code that runs on these files. These jobs need to be executed once every 30 days on AWS EC2 instances and require approx. 1000 vCPUs for approx. 3 hrs. to complete the
entire job. Which of the following approaches would you use?



1. You will request 9 EC2 on-demand instances of m5.24xlarge, which can deliver approx. 9 x 5 x 24 vCPU = 1080.

2. You will request 9 EC2 spot instances of m5.24xlarge, which can deliver approx. 9 x 5 x 24 vCPU = 1080, at lower cost.

3. You will request 1 EC2 spot instance of m5.24xlarge, which can deliver approx. 216 vCPU, and run the job for 15 hours.

4. You will use an EC2 Fleet to launch EC2 spot instances of m5.24xlarge with a target capacity of 1000 vCPUs.
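As a hedged sanity check on the instance count: an m5.24xlarge actually provides 96 vCPUs, so the "9 x 5 x 24" arithmetic in the options should be treated with caution. Taking the stated requirement (1000 vCPUs for 3 hours) at face value:

```python
import math

REQUIRED_VCPUS = 1000
VCPUS_PER_M5_24XLARGE = 96  # actual vCPU count of an m5.24xlarge

# How many whole instances are needed to reach the required vCPU count?
instances = math.ceil(REQUIRED_VCPUS / VCPUS_PER_M5_24XLARGE)
total_vcpus = instances * VCPUS_PER_M5_24XLARGE
print(instances, total_vcpus)  # 11 instances -> 1056 vCPUs
```

Either way, the general pattern the question is driving at stands: a short, interruptible, once-a-month batch is a natural fit for spot capacity requested as an aggregate vCPU target rather than a fixed instance count.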



Question : You have been working with a healthcare IT company which manages patients on behalf of various hospitals. This data is very sensitive, although some research teams can run analytics on it if permitted.
The data needs to be stored in an RDBMS. How would you make sure that the data stored in RDS is secure and cannot be reached through a network attack, while the research team can still access it
from EC2 instances?


1. You will have two VPCs, one for the research team and another for the RDS instance, and make a connection between these two VPCs using VPC peering.

2. You will create database users for the research team so that only permitted users can access data from the RDS instance.

3. You will define security groups such that the data can only be accessed from allowed networks.

4. You will have a VPN connection between the EC2 instance and the RDS instance.


Question : You have developed a Docker container and want to run an application using this container with pre-defined EC2 instance types, EBS volumes, and ELB. However, you want fine-grained control over the
container you have created while deploying your application. Which of the following services would be most suitable for the given requirement?


1. Elastic Beanstalk

2. Amazon CloudWatch

3. AWS CloudFormation

4. Amazon EC2 Container Service (ECS)


Question : You have developed a mobile gaming application where various users can participate and maintain their scores. You want to show the top scorers for a particular game. Your application is very
popular, and the top 1000 scorers keep changing (it's a leaderboard); in total, more than a million users play this game on a regular basis. Which of the following is the most suitable data store to
return this result as fast as needed?


1. Amazon ElastiCache using the Memcached engine

2. Amazon ElastiCache using the Redis engine

3. Amazon DynamoDB with an index on user id and score together

4. Maintaining data in MySQL RDS with a primary index on user id and a secondary index on score

5. You will use a Lambda function that sorts user scores every minute and stores them in a text file
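ElastiCache for Redis handles this pattern natively with sorted sets (the ZADD and ZREVRANGE commands). The idea can be simulated in a few lines of pure Python, no Redis server required (the class and method names below are illustrative, not the Redis API):

```python
import heapq

class Leaderboard:
    """Toy in-memory leaderboard mimicking Redis sorted-set semantics."""

    def __init__(self):
        self.scores = {}  # user_id -> score, like a sorted set's members

    def add_score(self, user_id: str, score: int) -> None:
        """Upsert a member's score (analogous to ZADD)."""
        self.scores[user_id] = score

    def top(self, n: int):
        """Return the n highest-scoring members (analogous to ZREVRANGE)."""
        return heapq.nlargest(n, self.scores.items(), key=lambda kv: kv[1])

board = Leaderboard()
board.add_score("alice", 4200)
board.add_score("bob", 3100)
board.add_score("carol", 5600)
print(board.top(2))  # [('carol', 5600), ('alice', 4200)]
```

Redis keeps this structure sorted on every write, which is why a "top 1000 of a million players" query returns in microseconds instead of requiring a sort or index scan over the full user table.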


Question : You have created a Docker image for your application and leveraged AWS ECR (Elastic Container Registry). You created a private subnet and want to launch an instance based on the Docker image you created and
registered with ECR, but you are not able to access that Docker image. Why?

1. You don't have the proper IAM role to access this Docker image.

2. You don't have internet connectivity between your VPC and ECR.

3. Your Docker image could be corrupted.

4. The datacenter where your Docker image is stored in ECR is down while you are trying to use it.