
Amazon Web Services AWS Certified Solutions Architect - Associate (SAA-C03) Questions and Answers - Dumpstech



Viewing page 2 out of 14 pages
Viewing questions 21-40
Question # 21:

An ecommerce company has an application that collects order-related information from customers. The company uses one Amazon DynamoDB table to store customer home addresses, phone numbers, and email addresses. Customers can check out without creating an account. The application copies the customer information to a second DynamoDB table if a customer does create an account.

The company requires a solution to delete personally identifiable information (PII) for customers who did not create an account within 28 days.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.

Create an AWS Lambda function to delete items from the first DynamoDB table that have a delivery date more than 28 days in the past. Use a scheduled Amazon EventBridge rule to run the Lambda function every day.

B.

Update the application to store PII in an Amazon S3 bucket. Create an S3 Lifecycle rule to expire the objects after 28 days. Move the data to DynamoDB when a user creates an account.

C.

Launch an Amazon EC2 instance. Configure a daily cron job to run on the instance. Configure the cron job to use AWS CLI commands to delete items from DynamoDB.

D.

Use a createdAt timestamp to set TTL for data in the first DynamoDB table to 28 days.
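For reference, option D's TTL approach can be sketched as follows. DynamoDB TTL only requires a Number attribute holding an epoch-seconds timestamp; the attribute names (`createdAt`, `expiresAt`) and the item shape below are illustrative assumptions, not the question's exhibit.

```python
from datetime import datetime, timezone

# DynamoDB deletes an item shortly after the epoch-seconds value in the
# table's TTL attribute passes. Option D derives that value from createdAt.
RETENTION_SECONDS = 28 * 24 * 60 * 60  # 28 days

def expires_at(created_at_epoch: int) -> int:
    """Return the epoch-seconds value to store in the TTL attribute."""
    return created_at_epoch + RETENTION_SECONDS

# Example item in low-level DynamoDB JSON; "expiresAt" must be a Number ("N")
# for TTL to apply. Names are hypothetical.
created = int(datetime(2024, 1, 1, tzinfo=timezone.utc).timestamp())
item = {
    "customerId": {"S": "guest-123"},
    "createdAt": {"N": str(created)},
    "expiresAt": {"N": str(expires_at(created))},
}
```

TTL itself is enabled once per table (for example with the `UpdateTimeToLive` API), after which no scheduled jobs or servers are needed, which is why D has the least operational overhead.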

Question # 22:

A media company is launching a new product platform that artists from around the world can use to upload videos and images directly to an Amazon S3 bucket. The company owns and maintains the S3 bucket. The artists must be able to upload files from personal devices without the need for AWS credentials or an AWS account.

Which solution will meet these requirements MOST securely?

Options:

A.

Enable cross-origin resource sharing (CORS) on the S3 bucket.

B.

Turn off block public access for the S3 bucket. Share the bucket URL to the artists to enable uploads without credentials.

C.

Use an IAM role that has upload permissions for the S3 bucket to generate presigned URLs for S3 prefixes that are specific to each artist. Share the URLs to the artists.

D.

Create a web interface that uses an IAM role that has permission to upload and view objects in the S3 bucket. Share the web interface URL to the artists.
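Option C relies on S3 presigned URLs: a presigned URL is just a signed query string, so an artist with no AWS account can PUT an object through it while the signing IAM role's permissions still apply. In practice you would call boto3's `generate_presigned_url`; the standard-library sketch below shows the SigV4 query-signing mechanism it performs. Bucket, key, and credential values are made-up placeholders.

```python
import hashlib
import hmac
import urllib.parse
from datetime import datetime, timezone

def _hmac(key: bytes, msg: str) -> bytes:
    return hmac.new(key, msg.encode(), hashlib.sha256).digest()

def presign_put_url(bucket, key, access_key, secret_key,
                    region="us-east-1", expires=3600, now=None):
    """Sketch of SigV4 query signing for an S3 PUT (simplified)."""
    now = now or datetime.now(timezone.utc)
    amz_date = now.strftime("%Y%m%dT%H%M%SZ")
    datestamp = now.strftime("%Y%m%d")
    host = f"{bucket}.s3.{region}.amazonaws.com"
    credential = f"{access_key}/{datestamp}/{region}/s3/aws4_request"
    params = {
        "X-Amz-Algorithm": "AWS4-HMAC-SHA256",
        "X-Amz-Credential": credential,
        "X-Amz-Date": amz_date,
        "X-Amz-Expires": str(expires),
        "X-Amz-SignedHeaders": "host",
    }
    query = "&".join(f"{k}={urllib.parse.quote(v, safe='')}"
                     for k, v in sorted(params.items()))
    canonical_request = "\n".join([
        "PUT",
        "/" + urllib.parse.quote(key),
        query,
        f"host:{host}\n",    # canonical headers block ends with a blank line
        "host",              # signed headers
        "UNSIGNED-PAYLOAD",  # payload hash is not pinned for presigned PUTs
    ])
    string_to_sign = "\n".join([
        "AWS4-HMAC-SHA256", amz_date,
        f"{datestamp}/{region}/s3/aws4_request",
        hashlib.sha256(canonical_request.encode()).hexdigest(),
    ])
    signing_key = _hmac(_hmac(_hmac(_hmac(
        ("AWS4" + secret_key).encode(), datestamp), region), "s3"),
        "aws4_request")
    signature = hmac.new(signing_key, string_to_sign.encode(),
                         hashlib.sha256).hexdigest()
    return (f"https://{host}/{urllib.parse.quote(key)}"
            f"?{query}&X-Amz-Signature={signature}")
```

Because the URL embeds an expiry and a per-artist key prefix, it scopes uploads far more tightly than making the bucket public.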

Question # 23:

A company's application receives requests from customers in JSON format. The company uses Amazon Simple Queue Service (Amazon SQS) to handle the requests.

After the application's most recent update, the company's customers reported that requests were being duplicated. A solutions architect discovers that the application is consuming messages from the SQS queue more than once.

What is the root cause of the issue?

Options:

A.

The visibility timeout is longer than the time it takes the application to process messages from the queue.

B.

The duplicated messages in the SQS queue contain unescaped Unicode characters.

C.

The message size exceeds the maximum of 256 KiB for each SQS message.

D.

The visibility timeout is shorter than the time it takes the application to process messages from the queue.
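The duplication in option D can be illustrated with a simplified model: if the consumer only deletes the message after processing finishes, and the visibility timeout elapses first, SQS makes the message visible again and another poller receives it. This is not an exact SQS simulation (it assumes a poller is always waiting and that each redelivery restarts the window), just a sketch of the failure mode.

```python
import math

def delivery_count(processing_seconds: int, visibility_timeout_seconds: int) -> int:
    """Approximate number of times one message is delivered before the
    first consumer finishes and deletes it. If the visibility timeout is
    at least as long as processing, the message is delivered exactly once."""
    return max(1, math.ceil(processing_seconds / visibility_timeout_seconds))
```

With a 90-second processing time and a 30-second visibility timeout, the message is handed out three times; raising the timeout above the processing time removes the duplicates.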

Question # 24:

A company runs its legacy web application on AWS. The web application server runs on an Amazon EC2 instance in the public subnet of a VPC. The web application server collects images from customers and stores the image files in a locally attached Amazon Elastic Block Store (Amazon EBS) volume. The image files are uploaded every night to an Amazon S3 bucket for backup.

A solutions architect discovers that the image files are being uploaded to Amazon S3 through the public endpoint. The solutions architect needs to ensure that traffic to Amazon S3 does not use the public endpoint.

Which solution will meet this requirement?

Options:

A.

Create a gateway VPC endpoint for the S3 bucket that has the necessary permissions for the VPC. Configure the subnet route table to use the gateway VPC endpoint.

B.

Move the S3 bucket inside the VPC. Configure the subnet route table to access the S3 bucket through private IP addresses.

C.

Create an Amazon S3 access point for the Amazon EC2 instance inside the VPC. Configure the web application to upload by using the Amazon S3 access point.

D.

Configure an AWS Direct Connect connection between the VPC that has the Amazon EC2 instance and Amazon S3 to provide a dedicated network path.

Question # 25:

A company receives data transfers from a small number of external clients that use SFTP software on an Amazon EC2 instance. The clients use an SFTP client to upload data. The clients use SSH keys for authentication. Every hour, an automated script transfers new uploads to an Amazon S3 bucket for processing.

The company wants to move the transfer process to an AWS managed service and to reduce the time required to start data processing. The company wants to retain the existing user management and SSH key generation process. The solution must not require clients to make significant changes to their existing processes.

Which solution will meet these requirements?

Options:

A.

Reconfigure the script that runs on the EC2 instance to run every 15 minutes. Create an S3 Event Notifications rule for all new object creation events. Set an Amazon Simple Notification Service (Amazon SNS) topic as the destination.

B.

Create an AWS Transfer Family SFTP server that uses the existing S3 bucket as a target. Use service-managed users to enable authentication.

C.

Require clients to add the AWS DataSync agent into their local environments. Create an IAM user for each client that has permission to upload data to the target S3 bucket.

D.

Create an AWS Transfer Family SFTP connector that has permission to access the target S3 bucket for each client. Store credentials in AWS Systems Manager. Create an IAM role to allow the SFTP connector to securely use the credentials.

Question # 26:

A company hosts its application on several Amazon EC2 instances inside a VPC. The company creates a dedicated Amazon S3 bucket for each customer to store their relevant information in Amazon S3.

The company wants to ensure that the application running on EC2 instances can securely access only the S3 buckets that belong to the company's AWS account.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.

Create a gateway endpoint for Amazon S3 that is attached to the VPC. Update the IAM instance profile policy to provide access to only the specific buckets that the application needs.

B.

Create a NAT gateway in a public subnet with a security group that allows access to only Amazon S3. Update the route tables to use the NAT gateway.

C.

Create a gateway endpoint for Amazon S3 that is attached to the VPC. Update the IAM instance profile policy with a Deny action and the following condition key:

D.

Create a NAT gateway in a public subnet. Update the route tables to use the NAT gateway. Assign bucket policies for all buckets with a Deny action and the following condition key:
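The exhibit showing the condition key is not reproduced in this dump. For readers, one condition key AWS documents for exactly this pattern (restricting S3 access to buckets owned by a specific account) is `s3:ResourceAccount`; the policy fragment below is an illustrative assumption, not the question's original exhibit, and the account ID is a placeholder.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "DenyS3OutsideCompanyAccount",
      "Effect": "Deny",
      "Action": "s3:*",
      "Resource": "*",
      "Condition": {
        "StringNotEquals": {
          "s3:ResourceAccount": "111122223333"
        }
      }
    }
  ]
}
```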

Question # 27:

A company has an on-premises application that uses SFTP to collect financial data from multiple vendors. The company is migrating to the AWS Cloud. The company has created an application that uses Amazon S3 APIs to upload files from vendors.

Some vendors run their systems on legacy applications that do not support S3 APIs. The vendors want to continue to use SFTP-based applications to upload data. The company wants to use managed services for the needs of the vendors that use legacy applications.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.

Create an AWS Database Migration Service (AWS DMS) instance to replicate data from the storage of the vendors that use legacy applications to Amazon S3. Provide the vendors with the credentials to access the AWS DMS instance.

B.

Create an AWS Transfer Family endpoint for vendors that use legacy applications.

C.

Configure an Amazon EC2 instance to run an SFTP server. Instruct the vendors that use legacy applications to use the SFTP server to upload data.

D.

Configure an Amazon S3 File Gateway for vendors that use legacy applications to upload files to an SMB file share.

Question # 28:

A company is developing software that uses a PostgreSQL database schema. The company needs to configure development environments and test environments for its developers.

Each developer at the company uses their own development environment, which includes a PostgreSQL database. On average, each development environment is used for an 8-hour workday. The test environments will be used for load testing that can take up to 2 hours each day.

Which solution will meet these requirements MOST cost-effectively?

Options:

A.

Configure development environments and test environments with their own Amazon Aurora Serverless v2 PostgreSQL database.

B.

For each development environment, configure an Amazon RDS for PostgreSQL Single-AZ DB instance. For the test environment, configure a single Amazon RDS for PostgreSQL Multi-AZ DB instance.

C.

Configure development environments and test environments with their own Amazon Aurora PostgreSQL DB cluster.

D.

Configure an Amazon Aurora global database. Allow developers to connect to the database with their own credentials.
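The cost argument behind option A can be made concrete with back-of-the-envelope arithmetic. Aurora Serverless v2 bills per ACU-hour, so a dev database used about 8 hours per workday only accrues charges for those hours, while an always-on provisioned instance accrues charges around the clock. All prices below are illustrative assumptions, not current AWS pricing.

```python
# Assumed illustrative prices (NOT real AWS pricing):
ACU_HOUR_PRICE = 0.12          # $/ACU-hour for Aurora Serverless v2
PROVISIONED_HOUR_PRICE = 0.29  # $/hour for a comparable provisioned instance

def monthly_serverless_cost(acus: float, hours_per_day: float, days: int = 22) -> float:
    """Serverless v2 bills only while the database has load (plus a small
    idle floor, ignored here for simplicity)."""
    return acus * ACU_HOUR_PRICE * hours_per_day * days

def monthly_provisioned_cost(days: int = 30) -> float:
    """A provisioned instance bills 24 hours a day whether used or not."""
    return PROVISIONED_HOUR_PRICE * 24 * days

dev = monthly_serverless_cost(acus=2, hours_per_day=8)  # ~8h workday
always_on = monthly_provisioned_cost()
```

Under these assumed numbers the 8-hour serverless pattern costs roughly a fifth of an always-on instance, which is the intuition behind choosing A for intermittently used dev and test databases.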

Question # 29:

A company has an application that runs on an Amazon Elastic Kubernetes Service (Amazon EKS) cluster on Amazon EC2 instances. The application has a UI that uses Amazon DynamoDB and data services that use Amazon S3 as part of the application deployment.

The company must ensure that the EKS Pods for the UI can access only Amazon DynamoDB and that the EKS Pods for the data services can access only Amazon S3. The company uses AWS Identity and Access Management (IAM).

Which solution meets these requirements?

Options:

A.

Create separate IAM policies for Amazon S3 and DynamoDB access with the required permissions. Attach both IAM policies to the EC2 instance profile. Use role-based access control (RBAC) to control access to Amazon S3 or DynamoDB for the respective EKS Pods.

B.

Create separate IAM policies for Amazon S3 and DynamoDB access with the required permissions. Attach the Amazon S3 IAM policy directly to the EKS Pods for the data services and the DynamoDB policy to the EKS Pods for the UI.

C.

Create separate Kubernetes service accounts for the UI and data services to assume an IAM role. Attach the AmazonS3FullAccess policy to the data services account and the AmazonDynamoDBFullAccess policy to the UI service account.

D.

Create separate Kubernetes service accounts for the UI and data services to assume an IAM role. Use IAM Roles for Service Accounts (IRSA) to provide the EKS Pods for the UI with access to DynamoDB and the EKS Pods for the data services with access to Amazon S3.
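IRSA works by annotating a Kubernetes service account with an IAM role ARN; EKS then injects web-identity credentials into Pods that use that service account, so the UI Pods can carry a DynamoDB-only role while the data-service Pods carry an S3-only role. A sketch for the UI Pods follows; the names, namespace, and account/role ARN are all hypothetical, though the annotation key is the one EKS actually uses.

```yaml
# Hypothetical names throughout; eks.amazonaws.com/role-arn is the
# annotation IRSA reads to bind the service account to an IAM role.
apiVersion: v1
kind: ServiceAccount
metadata:
  name: ui-service-account
  namespace: app
  annotations:
    eks.amazonaws.com/role-arn: arn:aws:iam::111122223333:role/ui-dynamodb-role
```

Pods that set `serviceAccountName: ui-service-account` receive credentials scoped to that role only, which is the per-workload isolation the question asks for.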

Question # 30:

A finance company uses backup software to back up its data to physical tape storage on-premises. To comply with regulations, the company needs to store the data for 7 years. The company must be able to restore archived data within one week when necessary.

The company wants to migrate the backup data to AWS to reduce costs. The company does not want to change the current backup software.

Which solution will meet these requirements MOST cost-effectively?

Options:

A.

Use AWS Storage Gateway Tape Gateway to copy the data to virtual tapes. Use AWS DataSync to migrate the virtual tapes to Amazon S3 Standard-Infrequent Access (S3 Standard-IA). Change the target of the backup software to S3 Standard-IA.

B.

Convert the physical tapes to virtual tapes. Use AWS DataSync to migrate the virtual tapes to Amazon S3 Glacier Flexible Retrieval. Change the target of the backup software to S3 Glacier Flexible Retrieval.

C.

Use AWS Storage Gateway Tape Gateway to copy the data to virtual tapes. Migrate the virtual tapes to Amazon S3 Glacier Deep Archive. Change the target of the backup software to the virtual tapes.

D.

Convert the physical tapes to virtual tapes. Use AWS Snowball Edge storage-optimized devices to migrate the virtual tapes to Amazon S3 Glacier Flexible Retrieval. Change the target of the backup software to S3 Glacier Flexible Retrieval.

Question # 31:

A company has a batch processing application that runs every day. The process typically takes an average of 3 hours to complete. The application can handle interruptions and can resume the process after a restart. Currently, the company runs the application on Amazon EC2 On-Demand Instances. The company wants to optimize costs while maintaining the same performance level.

Which solution will meet these requirements MOST cost-effectively?

Options:

A.

Purchase a 1-year EC2 Instance Savings Plan for the appropriate instance family and size to meet the requirements of the application.

B.

Use EC2 On-Demand Capacity Reservations based on the appropriate instance family and size to meet the requirements of the application. Run the EC2 instances in an Auto Scaling group.

C.

Determine the appropriate instance family and size to meet the requirements of the application. Convert the application to run on AWS Batch with EC2 On-Demand Instances. Purchase a 1-year Compute Savings Plan.

D.

Determine the appropriate instance family and size to meet the requirements of the application. Convert the application to run on AWS Batch with EC2 Spot Instances.

Question # 32:

A company has 15 employees. The company stores employee start dates in an Amazon DynamoDB table. The company wants to send an email message to each employee on the day of the employee's work anniversary.

Which solution will meet these requirements with the MOST operational efficiency?

Options:

A.

Create a script that scans the DynamoDB table and uses Amazon Simple Notification Service (Amazon SNS) to send email messages to employees when necessary. Use a cron job to run this script every day on an Amazon EC2 instance.

B.

Create a script that scans the DynamoDB table and uses Amazon Simple Queue Service (Amazon SQS) to send email messages to employees when necessary. Use a cron job to run this script every day on an Amazon EC2 instance.

C.

Create an AWS Lambda function that scans the DynamoDB table and uses Amazon Simple Notification Service (Amazon SNS) to send email messages to employees when necessary. Schedule this Lambda function to run every day.

D.

Create an AWS Lambda function that scans the DynamoDB table and uses Amazon Simple Queue Service (Amazon SQS) to send email messages to employees when necessary. Schedule this Lambda function to run every day.
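The core of option C's Lambda function is the date-matching logic; the DynamoDB scan and the SNS publish calls are omitted here. The item shape (`name`, `start_date`) is an assumption about how the table might store employees, and the Feb-29 edge case is ignored for simplicity.

```python
from datetime import date

def anniversaries_today(employees, today=None):
    """Return names of employees whose work anniversary falls on `today`.
    `employees` mimics items a DynamoDB scan might return, e.g.
    {"name": "Ana", "start_date": "2020-06-01"} (ISO dates assumed)."""
    today = today or date.today()
    hits = []
    for emp in employees:
        year, month, day = map(int, emp["start_date"].split("-"))
        # Anniversary: same month and day, and at least one year of tenure.
        if (month, day) == (today.month, today.day) and year < today.year:
            hits.append(emp["name"])
    return hits
```

In the scheduled Lambda, each returned name would be turned into an SNS publish (or SES send); with only 15 items a daily full scan is trivially cheap, which is why the serverless option beats running a cron host.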

Question # 33:

A company wants to send data from its on-premises systems to Amazon S3 buckets. The company created the S3 buckets in three different accounts. The company must send the data privately without the data traveling across the internet. The company has no existing dedicated connectivity to AWS.

Which combination of steps should a solutions architect take to meet these requirements? (Select TWO.)

Options:

A.

Establish a networking account in the AWS Cloud. Create a private VPC in the networking account. Set up an AWS Direct Connect connection with a private VIF between the on-premises environment and the private VPC.

B.

Establish a networking account in the AWS Cloud. Create a private VPC in the networking account. Set up an AWS Direct Connect connection with a public VIF between the on-premises environment and the private VPC.

C.

Create an Amazon S3 interface endpoint in the networking account.

D.

Create an Amazon S3 gateway endpoint in the networking account.

E.

Establish a networking account in the AWS Cloud. Create a private VPC in the networking account. Peer VPCs from the accounts that host the S3 buckets with the VPC in the networking account.

Question # 34:

A company runs all its business applications in the AWS Cloud. The company uses AWS Organizations to manage multiple AWS accounts.

A solutions architect needs to review all permissions granted to IAM users to determine which users have more permissions than required.

Which solution will meet these requirements with the LEAST administrative overhead?

Options:

A.

Use Network Access Analyzer to review all access permissions in the company's AWS accounts.

B.

Create an Amazon CloudWatch alarm that activates when an IAM user creates or modifies resources in an AWS account.

C.

Use AWS Identity and Access Management (IAM) Access Analyzer to review all the company's resources and accounts.

D.

Use Amazon Inspector to find vulnerabilities in existing IAM policies.

Question # 35:

A company's SAP application has a backend SQL Server database in an on-premises environment. The company wants to migrate its on-premises application and database server to AWS. The company needs an instance type that meets the high demands of its SAP database. On-premises performance data shows that both the SAP application and the database have high memory utilization.

Which solution will meet these requirements?

Options:

A.

Use the compute optimized instance family for the application. Use the memory optimized instance family for the database.

B.

Use the storage optimized instance family for both the application and the database.

C.

Use the memory optimized instance family for both the application and the database.

D.

Use the high performance computing (HPC) optimized instance family for the application. Use the memory optimized instance family for the database.

Question # 36:

A solutions architect is designing a system to be highly resilient. The system uses Amazon Route 53 with health checks and an Application Load Balancer (ALB). The system is critical and must have the highest availability possible.

Which solution will meet these requirements?

Options:

A.

Automate failover to a healthy resource by automatically updating the value of the Route 53 A record.

B.

Configure the Route 53 health checks to perform a failover automatically.

C.

Automate failover to a healthy resource by updating the weight of the Route 53 weighted record.

D.

Create a new ALB during a failover event, and remap the target group to the new ALB.

Question # 37:

A solutions architect is configuring a VPC that has public subnets and private subnets. The VPC and subnets use IPv4 CIDR blocks. There is one public subnet and one private subnet in each of three Availability Zones (AZs). An internet gateway is attached to the VPC.

The private subnets require access to the internet to allow Amazon EC2 instances to download software updates.

Which solution will meet this requirement?

Options:

A.

Create a NAT gateway in one of the public subnets. Update the route tables that are attached to the private subnets to forward non-VPC traffic to the NAT gateway.

B.

Create three NAT instances, one in each private subnet. Create a private route table for each Availability Zone that forwards non-VPC traffic to the NAT instances.

C.

Attach an egress-only internet gateway to the VPC. Update the route tables of the private subnets to forward non-VPC traffic to the egress-only internet gateway.

D.

Create a NAT gateway in one of the private subnets. Update the route tables that are attached to the private subnets to forward non-VPC traffic to the NAT gateway.
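Why option A works can be seen from longest-prefix routing: the private subnets' route tables keep the implicit "local" route for the VPC's own CIDR, and the added default route (0.0.0.0/0) catches everything else and sends it to the NAT gateway sitting in a public subnet. The sketch below is a simplified model; the CIDR and target names are illustrative.

```python
import ipaddress

# Simplified private-subnet route table after option A is applied.
ROUTES = {
    "10.0.0.0/16": "local",                  # assumed VPC CIDR (implicit route)
    "0.0.0.0/0": "nat-gw-in-public-subnet",  # default route added for egress
}

def next_hop(destination_ip: str) -> str:
    """Most-specific (longest-prefix) matching route wins, as in a VPC."""
    addr = ipaddress.ip_address(destination_ip)
    best = max((ipaddress.ip_network(cidr) for cidr in ROUTES
                if addr in ipaddress.ip_network(cidr)),
               key=lambda net: net.prefixlen)
    return ROUTES[str(best)]
```

Traffic between instances stays on the local route, while software-update traffic to the internet takes the NAT gateway, which performs the source-address translation the private instances need.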

Question # 38:

An ecommerce company hosts an analytics application on AWS. The company deployed the application to one AWS Region. The application generates 300 MB of data each month. The application stores the data in JSON format. The data must be accessible in milliseconds when needed. The company must retain the data for 30 days. The company requires a disaster recovery solution to back up the data.

Which solution will meet these requirements?

Options:

A.

Deploy an Amazon OpenSearch Service cluster in the primary Region and in a second Region. Enable OpenSearch Service cluster replication. Configure the clusters to expire data after 30 days. Modify the application to use OpenSearch Service to store the data.

B.

Deploy an Amazon S3 bucket in the primary Region and in a second Region. Enable versioning on both buckets. Use the Standard storage class. Configure S3 Lifecycle policies to expire objects after 30 days. Configure S3 Cross-Region Replication from the bucket in the primary Region to the backup bucket.

C.

Deploy an Amazon Aurora PostgreSQL global database. Configure cluster replication between the primary Region and a second Region. Use a replicated cluster endpoint during outages in the primary Region.

D.

Deploy an Amazon RDS for PostgreSQL cluster in the same Region where the application is deployed. Configure a read replica in a second Region as a backup.
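The 30-day expiration in option B is a one-time lifecycle configuration on the bucket. A fragment in the shape accepted by `put-bucket-lifecycle-configuration` might look like the following; the rule ID is illustrative, and the empty prefix applies the rule to the whole bucket.

```json
{
  "Rules": [
    {
      "ID": "expire-analytics-data-after-30-days",
      "Filter": { "Prefix": "" },
      "Status": "Enabled",
      "Expiration": { "Days": 30 }
    }
  ]
}
```

Applied to both the primary and replica buckets, this keeps retention at 30 days with no compute to manage, while Cross-Region Replication provides the disaster-recovery copy.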

Question # 39:

A company is planning to connect a remote office to its AWS infrastructure. The office requires permanent and secure connectivity to AWS. The connection must provide secure access to resources in two VPCs. However, the VPCs must not be able to access each other.

Which solution will meet these requirements?

Options:

A.

Create two transit gateways. Set up one AWS Site-to-Site VPN connection from the remote office to each transit gateway. Connect one VPC to each transit gateway. Configure route table propagation to the appropriate transit gateway based on the destination VPC IP range.

B.

Set up one AWS Site-to-Site VPN connection from the remote office to each of the VPCs. Update the VPC route tables with static routes to the remote office resources.

C.

Set up one AWS Site-to-Site VPN connection from the remote office to one of the VPCs. Set up VPC peering between the two VPCs. Update the VPC route tables with static routes to the remote office and peered resources.

D.

Create a transit gateway. Set up an AWS Direct Connect gateway and one Direct Connect connection between the remote office and the Direct Connect gateway. Associate the transit gateway with the Direct Connect gateway. Configure a separate private virtual interface (VIF) for each VPC, and configure routing.

Question # 40:

A company is developing an application in the AWS Cloud. The application's HTTP API contains critical information that is published in Amazon API Gateway. The critical information must be accessible from only a limited set of trusted IP addresses that belong to the company's internal network.

Which solution will meet these requirements?

Options:

A.

Set up an API Gateway private integration to restrict access to a predefined set of IP addresses.

B.

Create a resource policy for the API that denies access to any IP address that is not specifically allowed.

C.

Directly deploy the API in a private subnet. Create a network ACL. Set up rules to allow the traffic from specific IP addresses.

D.

Modify the security group that is attached to API Gateway to allow inbound traffic from only the trusted IP addresses.
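Option B's resource-policy pattern is the one AWS documents for IP allowlisting on API Gateway: allow `execute-api:Invoke` broadly, then deny any caller whose source IP falls outside the trusted ranges (an explicit Deny always wins). The CIDR below is a documentation placeholder standing in for the company's internal network.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": "*",
      "Action": "execute-api:Invoke",
      "Resource": "execute-api:/*"
    },
    {
      "Effect": "Deny",
      "Principal": "*",
      "Action": "execute-api:Invoke",
      "Resource": "execute-api:/*",
      "Condition": {
        "NotIpAddress": { "aws:SourceIp": ["203.0.113.0/24"] }
      }
    }
  ]
}
```

This works at the API Gateway layer itself, so it needs no VPC changes; the distractor options fail because API Gateway is not deployed into subnets and has no attached security group.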
