Amazon AWS Certified Database - Specialty
#121 (Accuracy: 100% / 2 votes)
A company has a 250 GB Amazon RDS Multi-AZ DB instance. The company’s disaster recovery policy requires an RPO of 6 hours in a second AWS Region.

Which solution will meet these requirements MOST cost-effectively?
  • A. Use RDS automated snapshots. Create an AWS Lambda function to copy the snapshot to a second Region.
  • B. Use RDS automated snapshots every 6 hours. Use Amazon S3 Cross-Region Replication to copy the snapshot to a second Region.
  • C. Use AWS Backup to take an RDS snapshot every 6 hours and to copy the snapshot to a second Region.
  • D. Create an RDS cross-Region read replica in a second Region. Use AWS Backup to take an automated snapshot of the read replica every 6 hours.
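For reference, a minimal boto3 sketch of how the AWS Backup approach in option C of question #121 could be configured: a backup rule that runs every 6 hours and copies each recovery point to a vault in a second Region. The vault names, ARNs, Regions, and account ID below are placeholders.

```python
import boto3

backup = boto3.client("backup", region_name="us-east-1")

# Backup plan: snapshot every 6 hours to meet the 6-hour RPO, with a
# cross-Region copy action to a vault in the DR Region (placeholder ARNs).
plan = backup.create_backup_plan(
    BackupPlan={
        "BackupPlanName": "rds-rpo-6h",
        "Rules": [
            {
                "RuleName": "every-6-hours-with-cross-region-copy",
                "TargetBackupVaultName": "Default",
                "ScheduleExpression": "cron(0 0/6 * * ? *)",
                "CopyActions": [
                    {
                        "DestinationBackupVaultArn": (
                            "arn:aws:backup:us-west-2:123456789012:backup-vault:Default"
                        )
                    }
                ],
            }
        ],
    }
)

# Assign the RDS DB instance to the plan (role ARN and DB ARN are placeholders).
backup.create_backup_selection(
    BackupPlanId=plan["BackupPlanId"],
    BackupSelection={
        "SelectionName": "rds-instance",
        "IamRoleArn": "arn:aws:iam::123456789012:role/service-role/AWSBackupDefaultServiceRole",
        "Resources": ["arn:aws:rds:us-east-1:123456789012:db:mydb"],
    },
)
```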
#122 (Accuracy: 90% / 4 votes)
A company that is located in the United States wants to expand its operations in Asia. The company’s data in the us-west-2 Region is stored in an Amazon DynamoDB table. The company’s development team in the ap-northeast-1 Region needs to perform user acceptance testing (UAT) and several other performance feasibility tests with a copy of production data from us-west-2. The feasibility tests do not need to be run on data that is updated in real time.

Which solution will make data available from us-west-2 to ap-northeast-1 MOST cost-effectively?
  • A. Create a new DynamoDB table in ap-northeast-1. Create an AWS Glue job to perform a data export from the DynamoDB table in us-west-2. Import the same data into the DynamoDB table in ap-northeast-1.
  • B. Enable DynamoDB Streams on the DynamoDB table in us-west-2. Create a new DynamoDB table in ap-northeast-1. Create an AWS Lambda function to poll the DynamoDB table stream in us-west-2 and to deliver batch records from the stream to the new DynamoDB table in ap-northeast-1.
  • C. Use point-in-time recovery to restore the DynamoDB table from us-west-2 to ap-northeast-1.
  • D. Enable DynamoDB Streams on the DynamoDB table in us-west-2. Add ap-northeast-1 to the DynamoDB global tables setting in us-west-2.
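For question #122, the scenario only needs a one-time batch copy rather than real-time replication. As an illustration of that pattern, the sketch below uses DynamoDB's native export-to-S3 and import-from-S3 APIs; it is not literally the Glue job described in option A, and the table names, bucket, ARNs, and key schema are placeholders. Export to S3 also requires point-in-time recovery to be enabled on the source table.

```python
import boto3

src = boto3.client("dynamodb", region_name="us-west-2")
dst = boto3.client("dynamodb", region_name="ap-northeast-1")

# Export the production table to S3 (PITR must be enabled on the source table).
export = src.export_table_to_point_in_time(
    TableArn="arn:aws:dynamodb:us-west-2:123456789012:table/prod-users",
    S3Bucket="uat-export-bucket",
    ExportFormat="DYNAMODB_JSON",
)

# After the export completes, import the files into a new table in ap-northeast-1.
dst.import_table(
    S3BucketSource={"S3Bucket": "uat-export-bucket"},
    InputFormat="DYNAMODB_JSON",
    TableCreationParameters={
        "TableName": "prod-users-uat",
        "AttributeDefinitions": [{"AttributeName": "userId", "AttributeType": "S"}],
        "KeySchema": [{"AttributeName": "userId", "KeyType": "HASH"}],
        "BillingMode": "PAY_PER_REQUEST",
    },
)
```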
#123 (Accuracy: 100% / 5 votes)
A gaming company uses Amazon Aurora Serverless for one of its internal applications. The company's developers use Amazon RDS Data API to work with the Aurora Serverless DB cluster.
After a recent security review, the company is mandating security enhancements. A database specialist must ensure that access to RDS Data API is private and never passes through the public internet.

What should the database specialist do to meet this requirement?
  • A. Modify the Aurora Serverless cluster by selecting a VPC with private subnets.
  • B. Modify the Aurora Serverless cluster by unchecking the publicly accessible option.
  • C. Create an interface VPC endpoint that uses AWS PrivateLink for RDS Data API.
  • D. Create a gateway VPC endpoint for RDS Data API.
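A minimal boto3 sketch of option C from question #123: an interface VPC endpoint (AWS PrivateLink) for RDS Data API. The VPC, subnet, and security group IDs are placeholders.

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

ec2.create_vpc_endpoint(
    VpcEndpointType="Interface",
    VpcId="vpc-0123456789abcdef0",
    # Service name for RDS Data API in the chosen Region.
    ServiceName="com.amazonaws.us-east-1.rds-data",
    SubnetIds=["subnet-0123456789abcdef0"],
    SecurityGroupIds=["sg-0123456789abcdef0"],
    PrivateDnsEnabled=True,  # resolve the Data API DNS name to private IPs in the VPC
)
```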
#124 (Accuracy: 100% / 7 votes)
An online retail company is planning a multi-day flash sale that must support processing of up to 5,000 orders per second. The number of orders and exact schedule for the sale will vary each day. During the sale, approximately 10,000 concurrent users will look at the deals before buying items. Outside of the sale, the traffic volume is very low. Read and write queries must complete in under 25 ms. Order items are about 2 KB in size and have a unique identifier. The company requires the most cost-effective solution that scales automatically and is highly available.
Which solution meets these requirements?
  • A. Amazon DynamoDB with on-demand capacity mode
  • B. Amazon Aurora with one writer node and an Aurora Replica with the parallel query feature enabled
  • C. Amazon DynamoDB with provisioned capacity mode with 5,000 write capacity units (WCUs) and 10,000 read capacity units (RCUs)
  • D. Amazon Aurora with one writer node and two cross-Region Aurora Replicas
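For question #124, a short sketch of what option A looks like in practice: a DynamoDB table created in on-demand capacity mode, which scales automatically for the spiky flash-sale traffic. The table name and key schema are placeholders.

```python
import boto3

dynamodb = boto3.client("dynamodb", region_name="us-east-1")

dynamodb.create_table(
    TableName="orders",
    AttributeDefinitions=[{"AttributeName": "orderId", "AttributeType": "S"}],
    KeySchema=[{"AttributeName": "orderId", "KeyType": "HASH"}],
    BillingMode="PAY_PER_REQUEST",  # on-demand mode: no WCU/RCU provisioning required
)
```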
#125 (Accuracy: 100% / 5 votes)
A gaming company is developing a new mobile game and decides to store the data for each user in Amazon DynamoDB. To make the registration process as easy as possible, users can log in with their existing Facebook or Amazon accounts. The company expects more than 10,000 users.
How should a database specialist implement access control with the LEAST operational effort?
  • A. Use web identity federation on the mobile app and AWS STS with an attached IAM role to get temporary credentials to access DynamoDB.
  • B. Use web identity federation on the mobile app and create individual IAM users with credentials to access DynamoDB.
  • C. Use a self-developed user management system on the mobile app that lets users access the data from DynamoDB through an API.
  • D. Use a single IAM user on the mobile app to access DynamoDB.
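As an illustration of the web identity federation flow in option A of question #125, the sketch below exchanges an identity provider token for temporary credentials with STS and then calls DynamoDB. The role ARN, token, table name, and key are placeholders supplied by the mobile app at runtime.

```python
import boto3

sts = boto3.client("sts")

# Exchange the provider token for temporary credentials scoped by the IAM role.
creds = sts.assume_role_with_web_identity(
    RoleArn="arn:aws:iam::123456789012:role/mobile-game-dynamodb-access",
    RoleSessionName="player-session",
    ProviderId="www.amazon.com",  # or graph.facebook.com for Facebook logins
    WebIdentityToken="<token returned by the identity provider>",
)["Credentials"]

dynamodb = boto3.client(
    "dynamodb",
    region_name="us-east-1",
    aws_access_key_id=creds["AccessKeyId"],
    aws_secret_access_key=creds["SecretAccessKey"],
    aws_session_token=creds["SessionToken"],
)
dynamodb.get_item(TableName="player-data", Key={"playerId": {"S": "player-123"}})
```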
#126 (Accuracy: 100% / 3 votes)
A database specialist is working with a company to launch a new website. The website accesses a database on an Amazon Aurora MySQL DB cluster that is configured with several Aurora Replicas. The website will replace an on-premises website that is connected to a legacy relational database. Because of stability issues in the legacy database, the company wants to test the resiliency of the Aurora cluster.

Which action can the database specialist take to test the resiliency of the Aurora cluster?
  • A. Simulate a failover test of the Aurora cluster resiliency by using the failover testing feature from the Resiliency Toolkit.
  • B. Submit a fault injection query to one of the Aurora Replica instances by connecting to the endpoint for the Aurora Replica.
  • C. Simulate a failover test of the Aurora cluster by using the PromoteReadReplica API operation to promote one of the read replica DB instances to a standalone Aurora DB instance.
  • D. Use Amazon RDS Performance Insights to capture resiliency-related metrics for the Aurora cluster during periods of high load.
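For question #126, a sketch of how a fault injection query (option B) might be submitted to an Aurora Replica from Python. The endpoint, credentials, and the exact statement are illustrative; the supported fault-injection syntax should be checked against the Aurora MySQL documentation for the engine version in use.

```python
import pymysql  # pip install pymysql

# Connect to the reader endpoint (or a specific replica's instance endpoint).
conn = pymysql.connect(
    host="mycluster.cluster-ro-abc123.us-east-1.rds.amazonaws.com",
    user="admin",
    password="<password>",
    database="mysql",
)

with conn.cursor() as cur:
    # Simulate a read replica failure for two minutes on this instance.
    cur.execute(
        "ALTER SYSTEM SIMULATE 100 PERCENT READ REPLICA FAILURE FOR INTERVAL 2 MINUTE"
    )
```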
#127 (Accuracy: 100% / 3 votes)
A company is using Amazon Redshift. A database specialist needs to allow an existing Redshift cluster to access data from other Redshift clusters, Amazon RDS for PostgreSQL databases, and AWS Glue Data Catalog tables.

Which combination of steps will meet these requirements with the MOST operational efficiency? (Choose three.)
  • A. Take a snapshot of the required tables from the other Redshift clusters. Restore the snapshot into the existing Redshift cluster.
  • B. Create external tables in the existing Redshift database to connect to the AWS Glue Data Catalog tables.
  • C. Unload the RDS tables and the tables from the other Redshift clusters into Amazon S3. Run COPY commands to load the tables into the existing Redshift cluster.
  • D. Use federated queries to access data in Amazon RDS.
  • E. Use data sharing to access data from the other Redshift clusters.
  • F. Use AWS Glue jobs to transfer the AWS Glue Data Catalog tables into Amazon S3. Create external tables in the existing Redshift database to access this data.
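For question #127, the sketch below shows what options B, D, and E can look like when issued through the Redshift Data API. The cluster identifier, database, secret ARNs, IAM role ARNs, endpoint, datashare name, and producer namespace are all placeholders; the SQL follows the documented CREATE EXTERNAL SCHEMA and CREATE DATABASE ... FROM DATASHARE forms.

```python
import boto3

rsd = boto3.client("redshift-data", region_name="us-east-1")

def run(sql):
    """Submit a statement to the existing Redshift cluster via the Data API."""
    return rsd.execute_statement(
        ClusterIdentifier="existing-cluster",
        Database="dev",
        SecretArn="arn:aws:secretsmanager:us-east-1:123456789012:secret:redshift-creds",
        Sql=sql,
    )

# B: external schema over AWS Glue Data Catalog tables (Redshift Spectrum).
run("""CREATE EXTERNAL SCHEMA glue_catalog
       FROM DATA CATALOG DATABASE 'analytics'
       IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-spectrum-role'""")

# D: federated query to an Amazon RDS for PostgreSQL database.
run("""CREATE EXTERNAL SCHEMA rds_pg
       FROM POSTGRES DATABASE 'appdb' SCHEMA 'public'
       URI 'appdb.abc123.us-east-1.rds.amazonaws.com'
       IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-federated-role'
       SECRET_ARN 'arn:aws:secretsmanager:us-east-1:123456789012:secret:rds-creds'""")

# E: data sharing - consume a datashare published by another Redshift cluster.
run("CREATE DATABASE other_cluster_db FROM DATASHARE sales_share "
    "OF NAMESPACE 'producer-namespace-guid'")
```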
#128 (Accuracy: 100% / 4 votes)
A company uses an Amazon Aurora MySQL DB cluster with the most recent version of the MySQL database engine. The company wants all data that is transferred between clients and the DB cluster to be encrypted.

What should a database specialist do to meet this requirement?
  • A. Turn on data encryption when modifying the DB cluster by using the AWS Management Console or by using the AWS CLI to call the modify-db-cluster command.
  • B. Download the key pair for the DB instance. Reference that file from the --key-name option when connecting with a MySQL client.
  • C. Turn on data encryption by using AWS Key Management Service (AWS KMS). Use the AWS KMS key to encrypt the connections between a MySQL client and the Aurora DB cluster.
  • D. Turn on the require_secure_transport parameter in the DB cluster parameter group. Download the root certificate for the DB instance. Reference that file from the --ssl-ca option when connecting with a MySQL client.
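A short sketch of option D from question #128: setting require_secure_transport in a custom DB cluster parameter group so that only TLS connections are accepted. The parameter group name is a placeholder, and the cluster must be associated with a custom (non-default) cluster parameter group.

```python
import boto3

rds = boto3.client("rds", region_name="us-east-1")

rds.modify_db_cluster_parameter_group(
    DBClusterParameterGroupName="aurora-mysql-custom",
    Parameters=[
        {
            "ParameterName": "require_secure_transport",
            "ParameterValue": "ON",
            "ApplyMethod": "immediate",  # dynamic parameter, no reboot required
        }
    ],
)

# Clients then connect over TLS with the RDS root certificate, for example:
#   mysql -h <cluster-endpoint> -u admin -p --ssl-ca=global-bundle.pem
```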
#129 (Accuracy: 100% / 5 votes)
A company runs hundreds of Microsoft SQL Server databases on Windows servers in its on-premises data center. A database specialist needs to migrate these databases to Linux on AWS.
Which combination of steps should the database specialist take to meet this requirement? (Choose three.)
  • A. Install AWS Systems Manager Agent on the on-premises servers. Use Systems Manager Run Command to install the Windows to Linux replatforming assistant for Microsoft SQL Server Databases.
  • B. Use AWS Systems Manager Run Command to install and configure the AWS Schema Conversion Tool on the on-premises servers.
  • C. On the Amazon EC2 console, launch EC2 instances and select a Linux AMI that includes SQL Server. Install and configure AWS Systems Manager Agent on the EC2 instances.
  • D. On the AWS Management Console, set up Amazon RDS for SQL Server DB instances with Linux as the operating system. Install AWS Systems Manager Agent on the DB instances by using an options group.
  • E. Open the Windows to Linux replatforming assistant tool. Enter configuration details of the source and destination databases. Start migration.
  • F. On the AWS Management Console, set up AWS Database Migration Service (AWS DMS) by entering details of the source SQL Server database and the destination SQL Server database on AWS. Start migration.
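For question #129, one way Systems Manager Run Command can drive the on-premises Windows hosts (as described in options A and E) is sketched below. The managed instance ID is a placeholder, and the actual replatforming assistant script name and parameters come from the AWS documentation for the Windows to Linux replatforming assistant; the command string here is only a stand-in.

```python
import boto3

ssm = boto3.client("ssm", region_name="us-east-1")

ssm.send_command(
    InstanceIds=["mi-0123456789abcdef0"],  # on-premises managed instance
    DocumentName="AWS-RunPowerShellScript",
    Parameters={
        "commands": [
            # Placeholder: invoke the replatforming assistant with the source
            # SQL Server details and the destination Linux host details.
            "<run the Windows to Linux replatforming assistant script here>",
        ]
    },
)
```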
#130 (Accuracy: 100% / 3 votes)
A company is creating a serverless application that uses multiple AWS services and stores data on an Amazon RDS DB instance. The database credentials must be stored securely. An AWS Lambda function must be able to access the credentials. The company also must rotate the database password monthly by using an automated solution.

What should a database specialist do to meet those requirements in the MOST secure manner?
  • A. Store the database credentials by using AWS Systems Manager Parameter Store. Enable automatic rotation of the password. Use the AWS Cloud Development Kit (AWS CDK) in the Lambda function to retrieve the credentials from Parameter Store.
  • B. Encrypt the database credentials by using AWS Key Management Service (AWS KMS). Store the credentials in Amazon S3. Use an S3 Lifecycle policy to rotate the password. Retrieve the credentials by using Python code in Lambda.
  • C. Store the database credentials by using AWS Secrets Manager. Enable automatic rotation of the password. Configure the Lambda function to use the Secrets Manager API to retrieve the credentials.
  • D. Store the database credentials in an Amazon DynamoDB table. Assign an IAM role to the Lambda function to grant the Lambda function read-only access to the DynamoDB table. Rotate the password by using another Lambda function that runs monthly.
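Finally, for question #130, a minimal sketch of the Secrets Manager retrieval in option C from inside a Lambda handler. The secret name and its JSON layout are placeholders; rotation itself is configured on the secret.

```python
import json
import boto3

secretsmanager = boto3.client("secretsmanager")

def lambda_handler(event, context):
    # Fetch the current (automatically rotated) database credentials.
    secret = secretsmanager.get_secret_value(SecretId="prod/app/rds-credentials")
    creds = json.loads(secret["SecretString"])
    # creds["username"] / creds["password"] are then used to open the DB connection.
    return {"host": creds.get("host"), "user": creds.get("username")}
```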