Amazon AWS Certified Solutions Architect - Associate SAA-C02
#381 (Accuracy: 100% / 2 votes)
A company has several web servers that need to frequently access a common Amazon RDS MySQL Multi-AZ DB instance. The company wants a secure method for the web servers to connect to the database while meeting a security requirement to rotate user credentials frequently.
Which solution meets these requirements?
  • A. Store the database user credentials in AWS Secrets Manager. Grant the necessary IAM permissions to allow the web servers to access AWS Secrets Manager.
  • B. Store the database user credentials in AWS Systems Manager OpsCenter. Grant the necessary IAM permissions to allow the web servers to access OpsCenter.
  • C. Store the database user credentials in a secure Amazon S3 bucket. Grant the necessary IAM permissions to allow the web servers to retrieve credentials and access the database.
  • D. Store the database user credentials in files encrypted with AWS Key Management Service (AWS KMS) on the web server file system. The web server should be able to decrypt the files and access the database.
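If option A is the intended answer, the web servers would need an IAM policy permitting reads of the secret. The sketch below builds such a policy document locally; the secret ARN, account ID, and region are hypothetical placeholders, not values from the question.

```python
import json

# Minimal IAM policy sketch for option A: allow the web servers to read the
# rotated MySQL credentials from AWS Secrets Manager. The ARN is a placeholder.
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["secretsmanager:GetSecretValue"],
            "Resource": "arn:aws:secretsmanager:us-east-1:123456789012:secret:prod/mysql-credentials-*",
        }
    ],
}

print(json.dumps(policy, indent=2))
```

Attached to the web servers' instance role, this lets the application fetch the current credentials at connection time, so rotation never requires redeploying configuration files.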
#382 (Accuracy: 100% / 2 votes)
A company is developing a video conversion application hosted on AWS. The application will be available in two tiers: a free tier and a paid tier. Users in the paid tier will have their videos converted first, and then the free tier users will have their videos converted.
Which solution meets these requirements and is MOST cost-effective?
  • A. One FIFO queue for the paid tier and one standard queue for the free tier.
  • B. A single FIFO Amazon Simple Queue Service (Amazon SQS) queue for all file types.
  • C. A single standard Amazon Simple Queue Service (Amazon SQS) queue for all file types.
  • D. Two standard Amazon Simple Queue Service (Amazon SQS) queues with one for the paid tier and one for the free tier.
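If option D is the intended answer, the priority behavior comes from the consumer, not the queues: it always polls the paid-tier queue before the free-tier queue. The sketch below simulates that pattern locally with in-memory deques standing in for the two standard SQS queues; the queue contents are illustrative only.

```python
from collections import deque

# Local simulation of the two-queue pattern: two standard queues, with the
# consumer draining the paid tier before touching the free tier.
paid_queue = deque(["paid-video-1", "paid-video-2"])
free_queue = deque(["free-video-1"])

def next_job():
    """Return the next video to convert, paid tier first."""
    if paid_queue:
        return paid_queue.popleft()
    if free_queue:
        return free_queue.popleft()
    return None

order = [next_job() for _ in range(3)]
print(order)  # paid jobs come out before the free job
```

Standard queues suffice here because strict ordering within a tier is not required, and they are cheaper and higher-throughput than FIFO queues.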
#383 (Accuracy: 100% / 1 votes)
A company has an application that uses Amazon Elastic File System (Amazon EFS) to store data. The files are 1 GB in size or larger and are accessed frequently only for the first few days after creation. The application data is shared across a cluster of Linux servers. The company wants to reduce storage costs for the application.
What should a solutions architect do to meet these requirements?
  • A. Implement Amazon FSx and mount the network drive on each server.
  • B. Move the files from Amazon Elastic File System (Amazon EFS) and store them locally on each Amazon EC2 instance.
  • C. Configure a Lifecycle policy to move the files to the EFS Infrequent Access (IA) storage class after 7 days.
  • D. Move the files to Amazon S3 with S3 lifecycle policies enabled. Rewrite the application to support mounting the S3 bucket.
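If option C is the intended answer, the change is a single lifecycle configuration on the file system. The sketch below builds the request payload that boto3's `efs.put_lifecycle_configuration` would take; the file system ID is a hypothetical placeholder, and the real call would require boto3 and AWS credentials.

```python
# Sketch of the EFS lifecycle policy for option C: transition files to the
# Infrequent Access storage class after 7 days without access.
lifecycle_request = {
    "FileSystemId": "fs-0123456789abcdef0",  # hypothetical ID
    "LifecyclePolicies": [
        {"TransitionToIA": "AFTER_7_DAYS"},
    ],
}

print(lifecycle_request["LifecyclePolicies"])
```

Because the cluster keeps mounting the same EFS file system, no application changes are needed; files simply move to the cheaper IA class once their access pattern cools.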
#384 (Accuracy: 100% / 3 votes)
A company is using a VPC that is provisioned with a 10.10.1.0/24 CIDR block. Because of continued growth, IP address space in this block might be depleted soon. A solutions architect must add more IP address capacity to the VPC.
Which solution will meet these requirements with the LEAST operational overhead?
  • A. Create a new VPC. Associate a larger CIDR block.
  • B. Add a secondary CIDR block of 10.10.2.0/24 to the VPC.
  • C. Resize the existing VPC CIDR block from 10.10.1.0/24 to 10.10.1.0/16.
  • D. Establish VPC peering with a new VPC that has a CIDR block of 10.10.1.0/16.
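If option B is the intended answer, the key facts are that a VPC's existing CIDR block cannot be resized in place, while a non-overlapping secondary block can be associated with no migration. The check below uses Python's standard `ipaddress` module to confirm the proposed secondary block does not overlap the primary and doubles the address space.

```python
import ipaddress

# Option B: associate a secondary CIDR block rather than resizing the VPC
# (the primary CIDR of a VPC cannot be changed in place).
primary = ipaddress.ip_network("10.10.1.0/24")
secondary = ipaddress.ip_network("10.10.2.0/24")

assert not primary.overlaps(secondary)  # required for a secondary VPC CIDR

total_addresses = primary.num_addresses + secondary.num_addresses
print(total_addresses)  # 512
```

Note that option C is invalid as written anyway: 10.10.1.0/16 is not a legal expansion of 10.10.1.0/24, and AWS does not allow shrinking or growing an existing VPC CIDR.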
#385 (Accuracy: 100% / 3 votes)
A company wants to automate the security assessment of its Amazon EC2 instances. The company needs to validate and demonstrate that security and compliance standards are being followed throughout the development process.
What should a solutions architect do to meet these requirements?
  • A. Use Amazon Macie to automatically discover, classify and protect the EC2 instances.
  • B. Use Amazon GuardDuty to publish Amazon Simple Notification Service (Amazon SNS) notifications.
  • C. Use Amazon Inspector with Amazon CloudWatch to publish Amazon Simple Notification Service (Amazon SNS) notifications.
  • D. Use Amazon EventBridge (Amazon CloudWatch Events) to detect and react to changes in the status of AWS Trusted Advisor checks.
#386 (Accuracy: 92% / 11 votes)
A company previously migrated its data warehouse solution to AWS. The company also has an AWS Direct Connect connection. Corporate office users query the data warehouse using a visualization tool. The average size of a query returned by the data warehouse is 50 MB and each webpage sent by the visualization tool is approximately 500 KB. Result sets returned by the data warehouse are not cached.
Which solution provides the LOWEST data transfer egress cost for the company?
  • A. Host the visualization tool on premises and query the data warehouse directly over the internet.
  • B. Host the visualization tool in the same AWS Region as the data warehouse. Access it over the internet.
  • C. Host the visualization tool on premises and query the data warehouse directly over a Direct Connect connection at a location in the same AWS Region.
  • D. Host the visualization tool in the same AWS Region as the data warehouse and access it over a Direct Connect connection at a location in the same Region.
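If option D is the intended answer, the reasoning is arithmetic: with the tool in the same Region as the warehouse, only the rendered ~500 KB webpage leaves AWS per query, whereas with an on-premises tool the full ~50 MB result set crosses the connection. The sizes below are taken from the question; the comparison is a back-of-the-envelope sketch.

```python
# Per-query egress comparison: on-premises tool (C) vs in-Region tool (D).
query_result_kb = 50 * 1024  # 50 MB result set returned by the warehouse
webpage_kb = 500             # one rendered page from the visualization tool

egress_on_prem_tool_kb = query_result_kb   # option C: full result set egresses
egress_in_region_tool_kb = webpage_kb      # option D: only the page egresses

ratio = egress_on_prem_tool_kb / egress_in_region_tool_kb
print(ratio)  # 102.4x less egress per query with the tool hosted in-Region
```

Direct Connect egress is also billed at a lower rate than internet egress, which is why D beats B even though both keep the tool in-Region.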
#387 (Accuracy: 100% / 3 votes)
A solutions architect is creating a data processing job that runs once daily and can take up to 2 hours to complete. If the job is interrupted, it has to restart from the beginning.
How should the solutions architect address this issue in the MOST cost-effective manner?
  • A. Create a script that runs locally on an Amazon EC2 Reserved Instance that is triggered by a cron job.
  • B. Create an AWS Lambda function triggered by an Amazon EventBridge (Amazon CloudWatch Events) scheduled event.
  • C. Use an Amazon Elastic Container Service (Amazon ECS) Fargate task triggered by an Amazon EventBridge (Amazon CloudWatch Events) scheduled event.
  • D. Use an Amazon Elastic Container Service (Amazon ECS) task running on Amazon EC2 triggered by an Amazon EventBridge (Amazon CloudWatch Events) scheduled event.
#388 (Accuracy: 92% / 6 votes)
A company needs to ingest and handle large amounts of streaming data that its application generates. The application runs on Amazon EC2 instances and sends data to Amazon Kinesis Data Streams, which is configured with default settings. Every other day, the application consumes the data and writes the data to an Amazon S3 bucket for business intelligence (BI) processing.
The company observes that Amazon S3 is not receiving all the data that the application sends to Kinesis Data Streams.

What should a solutions architect do to resolve this issue?
  • A. Update the Kinesis Data Streams default settings by modifying the data retention period.
  • B. Update the application to use the Kinesis Producer Library (KPL) to send the data to Kinesis Data Streams.
  • C. Update the number of Kinesis shards to handle the throughput of the data that is sent to Kinesis Data Streams.
  • D. Turn on S3 Versioning within the S3 bucket to preserve every version of every object that is ingested in the S3 bucket.
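If option A is the intended answer, the arithmetic is simple: the default Kinesis Data Streams retention period is 24 hours, but a consumer that runs every other day reads at 48-hour intervals, so records expire before they are ever consumed. The sketch below states that check explicitly; the 24-hour default is the documented Kinesis default.

```python
# Why data is lost with default settings: the consumer interval exceeds the
# default retention period, so unread records expire from the stream.
default_retention_hours = 24   # Kinesis Data Streams default retention
consumer_interval_hours = 48   # "every other day"

data_is_lost = consumer_interval_hours > default_retention_hours
print(data_is_lost)  # True: retention must be raised to at least 48 hours
```

Raising retention (extendable up to 365 days) lets the every-other-day consumer catch up without losing records; shard count (option C) addresses throttling, not expiry.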
#389 (Accuracy: 100% / 5 votes)
A company is developing a file-sharing application that will use an Amazon S3 bucket for storage. The company wants to serve all the files through an Amazon CloudFront distribution. The company does not want the files to be accessible through direct navigation to the S3 URL.
What should a solutions architect do to meet these requirements?
  • A. Write individual policies for each S3 bucket to grant read permission for only CloudFront access.
  • B. Create an IAM user. Grant the user read permission to objects in the S3 bucket. Assign the user to CloudFront.
  • C. Write an S3 bucket policy that assigns the CloudFront distribution ID as the Principal and assigns the target S3 bucket as the Amazon Resource Name (ARN).
  • D. Create an origin access identity (OAI). Assign the OAI to the CloudFront distribution. Configure the S3 bucket permissions so that only the OAI has read permission.
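If option D is the intended answer, the bucket policy names the OAI as the only allowed reader, which is what blocks direct S3 URLs. The sketch below builds such a policy document locally; the OAI ID and bucket name are hypothetical placeholders, not values from the question.

```python
import json

# Sketch of the S3 bucket policy behind option D: only the CloudFront origin
# access identity (OAI) may read objects, so direct S3 navigation is denied.
oai_id = "E2EXAMPLEOAI"         # hypothetical OAI ID
bucket = "file-sharing-bucket"  # hypothetical bucket name

bucket_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {
                "AWS": f"arn:aws:iam::cloudfront:user/CloudFront Origin Access Identity {oai_id}"
            },
            "Action": "s3:GetObject",
            "Resource": f"arn:aws:s3:::{bucket}/*",
        }
    ],
}

print(json.dumps(bucket_policy, indent=2))
```

With this policy in place (and public access blocked on the bucket), requests succeed only when they arrive via the CloudFront distribution that carries the OAI.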
#390 (Accuracy: 100% / 2 votes)
A company hosts its core network services, including directory services and DNS, in its on-premises data center. The data center is connected to the AWS Cloud using AWS Direct Connect (DX). Additional AWS accounts are planned that will require quick, cost-effective, and consistent access to these network services.
What should a solutions architect implement to meet these requirements with the LEAST amount of operational overhead?
  • A. Create a DX connection in each new account. Route the network traffic to the on-premises servers.
  • B. Configure VPC endpoints in the DX VPC for all required services. Route the network traffic to the on-premises servers.
  • C. Create a VPN connection between each new account and the DX VPC. Route the network traffic to the on-premises servers.
  • D. Configure AWS Transit Gateway between the accounts. Assign DX to the transit gateway and route network traffic to the on-premises servers.