DBS-C01 : AWS Certified Database - Specialty Exam

Amazon DBS-C01 Questions & Answers
Full Version: 100 Q&A
killexams.com
Amazon
DBS-C01
AWS Certified Database - Specialty
https://killexams.com/pass4sure/exam-detail/DBS-C01
Question: 87
A database specialist manages a critical Amazon RDS for MySQL DB instance for a company. The amount of data stored daily can vary from 0.01% to 10% of the current database size. The database specialist needs to ensure that the DB instance storage grows as needed.
What is the MOST operationally efficient and cost-effective solution?
A. Configure RDS Storage Auto Scaling.
B. Configure RDS instance Auto Scaling.
C. Modify the DB instance allocated storage to meet the forecasted requirements.
D. Monitor the Amazon CloudWatch FreeStorageSpace metric daily and add storage as required.
Answer: A
Explanation:
If your workload is unpredictable, you can enable storage autoscaling for an Amazon RDS DB instance. With storage autoscaling enabled, when Amazon RDS detects that you are running out of free database space, it automatically scales up your storage.
https://aws.amazon.com/about-aws/whats-new/2019/06/rds-storage-auto-scaling/
https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/USER_PIOPS.StorageTypes.html#USER_PIOPS.Autoscaling
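As a quick, hedged sketch of how option A is enabled: storage autoscaling is turned on by giving the instance a maximum storage ceiling. The boto3 call below uses a hypothetical instance identifier and ceiling; only the MaxAllocatedStorage parameter itself comes from the RDS API.

```python
import boto3

rds = boto3.client("rds")

# Enable RDS Storage Auto Scaling by setting a maximum storage threshold.
# The instance identifier and the 2000 GiB ceiling are example values.
rds.modify_db_instance(
    DBInstanceIdentifier="critical-mysql-instance",  # hypothetical name
    MaxAllocatedStorage=2000,  # GiB ceiling that autoscaling may grow toward
    ApplyImmediately=True,
)
```

RDS then grows the allocated storage automatically when free space runs low, so no daily monitoring or manual resizing is needed.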
Question: 88
A company's database license is due for renewal. The company wants to migrate its 80 TB transactional database system from on premises to the AWS Cloud. The migration should incur the least possible downtime on the downstream database applications. The company's network infrastructure has limited bandwidth that is shared with other applications.
Which solution should a database specialist use for a timely migration?
A. Perform a full backup of the source database to AWS Snowball Edge appliances and ship them to be loaded to Amazon S3. Use AWS DMS to migrate change data capture (CDC) data from the source database to Amazon S3. Use a second AWS DMS task to migrate all the S3 data to the target database.
B. Perform a full backup of the source database to AWS Snowball Edge appliances and ship them to be loaded to Amazon S3. Periodically perform incremental backups of the source database to be shipped in another Snowball Edge appliance to handle syncing change data capture (CDC) data from the source to the target database.
C. Use AWS DMS to migrate the full load of the source database over a VPN tunnel using the internet for its primary connection. Allow AWS DMS to handle syncing change data capture (CDC) data from the source to the target database.
D. Use the AWS Schema Conversion Tool (AWS SCT) to migrate the full load of the source database over a VPN tunnel using the internet for its primary connection. Allow AWS SCT to handle syncing change data capture (CDC) data from the source to the target database.
Answer: A
Explanation:
Using Amazon S3 as a target for AWS Database Migration Service:
https://docs.aws.amazon.com/dms/latest/userguide/CHAP_Target.S3.html
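To make the CDC leg of option A concrete, here is a minimal boto3 sketch that creates an S3 target endpoint and a CDC-only replication task. All identifiers, ARNs, and the bucket name are placeholders, not values from the question.

```python
import json
import boto3

dms = boto3.client("dms")

# Target endpoint that writes change data to an S3 bucket (placeholder names).
endpoint = dms.create_endpoint(
    EndpointIdentifier="cdc-to-s3",  # hypothetical
    EndpointType="target",
    EngineName="s3",
    S3Settings={
        "BucketName": "example-migration-bucket",  # hypothetical
        "ServiceAccessRoleArn": "arn:aws:iam::123456789012:role/dms-s3-role",
    },
)

# CDC-only task: replicate ongoing changes while the Snowball full load ships.
dms.create_replication_task(
    ReplicationTaskIdentifier="source-to-s3-cdc",
    SourceEndpointArn="arn:aws:dms:us-east-1:123456789012:endpoint:SOURCE",  # placeholder
    TargetEndpointArn=endpoint["Endpoint"]["EndpointArn"],
    ReplicationInstanceArn="arn:aws:dms:us-east-1:123456789012:rep:INSTANCE",  # placeholder
    MigrationType="cdc",
    TableMappings=json.dumps({
        "rules": [{
            "rule-type": "selection",
            "rule-id": "1",
            "rule-name": "1",
            "object-locator": {"schema-name": "%", "table-name": "%"},
            "rule-action": "include",
        }]
    }),
)
```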
Question: 89
A database specialist is responsible for an Amazon RDS for MySQL DB instance with one read replica. The DB instance and the read replica are assigned to the default parameter group. The database team currently runs test queries against a read replica. The database team wants to create additional tables in the read replica that will only be accessible from the read replica to benefit the tests.
What should the database specialist do to allow the database team to create the test tables?
A. Contact AWS Support to disable read-only mode on the read replica. Reboot the read replica. Connect to the read replica and create the tables.
B. Change the read_only parameter to false (read_only=0) in the default parameter group of the read replica. Perform a reboot without failover. Connect to the read replica and create the tables using the local_only MySQL option.
C. Change the read_only parameter to false (read_only=0) in the default parameter group. Reboot the read replica. Connect to the read replica and create the tables.
D. Create a new DB parameter group. Change the read_only parameter to false (read_only=0). Associate the read replica with the new group. Reboot the read replica. Connect to the read replica and create the tables.
Answer: D
Explanation:
https://aws.amazon.com/premiumsupport/knowledge-center/rds-read-replica/
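A hedged boto3 sketch of answer D's steps follows; the replica identifier, the parameter group name, and the mysql8.0 family are assumptions for illustration.

```python
import boto3

rds = boto3.client("rds")

REPLICA_ID = "mysql-read-replica-1"  # hypothetical replica identifier
PG_NAME = "replica-writable-pg"      # hypothetical parameter group name

# 1. Default parameter groups cannot be modified, so create a custom group.
rds.create_db_parameter_group(
    DBParameterGroupName=PG_NAME,
    DBParameterGroupFamily="mysql8.0",  # assumption: the replica's engine family
    Description="Read replica with read_only disabled for test tables",
)

# 2. Set read_only=0; applied at reboot, matching the answer's steps.
rds.modify_db_parameter_group(
    DBParameterGroupName=PG_NAME,
    Parameters=[{
        "ParameterName": "read_only",
        "ParameterValue": "0",
        "ApplyMethod": "pending-reboot",
    }],
)

# 3. Associate the group with the replica, then reboot to apply it.
rds.modify_db_instance(DBInstanceIdentifier=REPLICA_ID, DBParameterGroupName=PG_NAME)
rds.reboot_db_instance(DBInstanceIdentifier=REPLICA_ID)
```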
Question: 90
A company has a heterogeneous six-node production Amazon Aurora DB cluster that handles online transaction processing (OLTP) for the core business and OLAP reports for the human resources department. To match compute resources to the use case, the company has decided to have the reporting workload for the human resources department be directed to two small nodes in the Aurora DB cluster, while every other workload goes to four large nodes in the same DB cluster.
Which option would ensure that the correct nodes are always available for the appropriate workload while meeting these requirements?
A. Use the writer endpoint for OLTP and the reader endpoint for the OLAP reporting workload.
B. Use automatic scaling for the Aurora Replica to have the appropriate number of replicas for the desired workload.
C. Create additional readers to cater to the different scenarios.
D. Use custom endpoints to satisfy the different workloads.
Answer: D
Explanation:
https://aws.amazon.com/about-aws/whats-new/2018/11/amazon-aurora-simplifies-workload-management-with-custom-endpoints/
You can now create custom endpoints for Amazon Aurora databases. This allows you to distribute and load balance workloads across different sets of database instances in your Aurora cluster. For example, you may provision a set of Aurora Replicas to use an instance type with higher memory capacity in order to run an analytics workload. A custom endpoint can then help you route the analytics workload to these appropriately-configured instances, while keeping other instances in your cluster isolated from this workload. As you add or remove instances from the custom endpoint to match your workload, the endpoint helps spread the load around.
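As an illustrative sketch, a custom reader endpoint pinned to the two small nodes could be created as below; the cluster and instance identifiers are hypothetical.

```python
import boto3

rds = boto3.client("rds")

# Custom endpoint that routes the HR reporting workload to the two small
# readers; all identifiers are example values.
rds.create_db_cluster_endpoint(
    DBClusterIdentifier="aurora-core-cluster",    # hypothetical cluster
    DBClusterEndpointIdentifier="hr-reporting",   # hypothetical endpoint name
    EndpointType="READER",
    StaticMembers=["aurora-small-1", "aurora-small-2"],  # the two small nodes
)
```

The reporting applications then connect to the DNS name of the hr-reporting endpoint, while OLTP traffic keeps using the cluster's writer endpoint.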
Question: 91
Developers have requested a new Amazon Redshift cluster so they can load new third-party marketing data. The new cluster is ready and the user credentials are given to the developers.
The developers indicate that their copy jobs fail with the following error message:
"Amazon Invalid operation: S3ServiceException:Access Denied,Status 403,Error AccessDenied."
The developers need to load this data soon, so a database specialist must act quickly to solve this issue.
What is the MOST secure solution?
A. Create a new IAM role with the same user name as the Amazon Redshift developer user ID. Provide the IAM role with read-only access to Amazon S3 with the assume role action.
B. Create a new IAM role with read-only access to the Amazon S3 bucket and include the assume role action. Modify the Amazon Redshift cluster to add the IAM role.
C. Create a new IAM role with read-only access to the Amazon S3 bucket with the assume role action. Add this role to the developer IAM user ID used for the copy job that ended with an error message.
D. Create a new IAM user with access keys and a new role with read-only access to the Amazon S3 bucket. Add this role to the Amazon Redshift cluster. Change the copy job to use the access keys created.
Answer: B
Explanation:
https://docs.aws.amazon.com/redshift/latest/gsg/rs-gsg-create-an-iam-role.html
"Now that you have created the new role, your next step is to attach it to your cluster. You can attach
the role when you launch a new cluster or you can attach it to an existing cluster. In the next step, you attach the role to a new cluster." https://docs.aws.amazon.com/redshift/latest/dg/copy-usage_notes-access-permissions.html
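A minimal sketch of attaching the role and using it in COPY, assuming hypothetical cluster, role, bucket, and table names:

```python
import boto3

redshift = boto3.client("redshift")

ROLE_ARN = "arn:aws:iam::123456789012:role/redshift-s3-readonly"  # placeholder

# Attach the read-only S3 role to the existing cluster; no restart is needed.
redshift.modify_cluster_iam_roles(
    ClusterIdentifier="marketing-cluster",  # hypothetical cluster name
    AddIamRoles=[ROLE_ARN],
)

# The developers' COPY jobs then authenticate with the role, not access keys:
copy_sql = f"""
COPY marketing.events
FROM 's3://example-marketing-data/events/'
IAM_ROLE '{ROLE_ARN}'
FORMAT AS CSV;
"""
```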
Question: 92
A database specialist at a large multi-national financial company is in charge of designing the disaster recovery strategy for a highly available application that is in development. The application uses an Amazon DynamoDB table as its data store. The application requires a recovery time objective (RTO) of 1 minute and a recovery point objective (RPO) of 2 minutes.
Which operationally efficient disaster recovery strategy should the database specialist recommend for the DynamoDB table?
A. Create a DynamoDB stream that is processed by an AWS Lambda function that copies the data to a DynamoDB table in another Region.
B. Use a DynamoDB global table replica in another Region. Enable point-in-time recovery for both tables.
C. Use a DynamoDB Accelerator table in another Region. Enable point-in-time recovery for the table.
D. Create an AWS Backup plan and assign the DynamoDB table as a resource.
Answer: C
Question: 93
A small startup company is looking to migrate a 4 TB on-premises MySQL database to AWS using an Amazon RDS for MySQL DB instance.
Which strategy would allow for a successful migration with the LEAST amount of downtime?
A. Deploy a new RDS for MySQL DB instance and configure it for access from the on-premises data center. Use the mysqldump utility to create an initial snapshot from the on-premises MySQL server, and copy it to an Amazon S3 bucket. Import the snapshot into the DB instance utilizing the MySQL utilities running on an Amazon EC2 instance. Immediately point the application to the DB instance.
B. Deploy a new Amazon EC2 instance, install the MySQL software on the EC2 instance, and configure networking for access from the on-premises data center. Use the mysqldump utility to create a snapshot of the on-premises MySQL server. Copy the snapshot into the EC2 instance and restore it into the EC2 MySQL instance. Use AWS DMS to migrate data into a new RDS for MySQL DB instance. Point the application to the DB instance.
C. Deploy a new Amazon EC2 instance, install the MySQL software on the EC2 instance, and configure networking for access from the on-premises data center. Use the mysqldump utility to create a snapshot of the on-premises MySQL server. Copy the snapshot into an Amazon S3 bucket and import the snapshot into a new RDS for MySQL DB instance using the MySQL utilities running on an EC2 instance. Point the application to the DB instance.
D. Deploy a new RDS for MySQL DB instance and configure it for access from the on-premises data center. Use the mysqldump utility to create an initial snapshot from the on-premises MySQL server, and copy it to an Amazon S3 bucket. Import the snapshot into the DB instance using the MySQL utilities running on an Amazon EC2 instance. Establish replication into the new DB instance using MySQL replication. Stop application access to the on-premises MySQL server and let the remaining transactions replicate over. Point the application to the DB instance.
Answer: B
Question: 94
A software development company is using Amazon Aurora MySQL DB clusters for several use cases, including development and reporting. These use cases place unpredictable and varying demands on the Aurora DB clusters, and can cause momentary spikes in latency. System users run ad-hoc queries sporadically throughout the week. Cost is a primary concern for the company, and a solution that does not require significant rework is needed.
Which solution meets these requirements?
A. Create new Aurora Serverless DB clusters for development and reporting, then migrate to these new DB clusters.
B. Upgrade one of the DB clusters to a larger size, and consolidate development and reporting activities on this larger DB cluster.
C. Use existing DB clusters and stop/start the databases on a routine basis using scheduling tools.
D. Change the DB clusters to the burstable instance family.
Answer: A
Explanation:
https://docs.aws.amazon.com/AmazonRDS/latest/AuroraUserGuide/Concepts.DBInstanceClass.html
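For illustration only, an Aurora Serverless (v1-style) cluster could be created as below; the identifier, credentials, engine choice, and capacity bounds are assumptions, and exact supported engine versions vary.

```python
import boto3

rds = boto3.client("rds")

# Aurora Serverless cluster that scales with sporadic demand and pauses when
# idle; names, credentials, and capacity bounds are example values.
rds.create_db_cluster(
    DBClusterIdentifier="dev-reporting-serverless",  # hypothetical
    Engine="aurora-mysql",
    EngineMode="serverless",
    MasterUsername="admin",
    MasterUserPassword="REPLACE_ME",  # placeholder; store real secrets securely
    ScalingConfiguration={
        "MinCapacity": 1,
        "MaxCapacity": 8,
        "AutoPause": True,             # pause compute when there is no load
        "SecondsUntilAutoPause": 300,  # after 5 idle minutes
    },
)
```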
Question: 95
A database specialist is building a system that uses a static vendor dataset of postal codes and related territory information that is less than 1 GB in size. The dataset is loaded into the application's cache at startup. The company needs to store this data in a way that provides the lowest cost with a low application startup time.
Which approach will meet these requirements?
A. Use an Amazon RDS DB instance. Shut down the instance once the data has been read.
B. Use Amazon Aurora Serverless. Allow the service to spin resources up and down, as needed.
C. Use Amazon DynamoDB in on-demand capacity mode.
D. Use Amazon S3 and load the data from flat files.
Answer: D
Explanation:
https://www.sumologic.com/insight/s3-cost-optimization/
For example, for a 1 GB file stored on S3 with 1 TB of storage provisioned, you are billed for 1 GB only. In many other services, such as Amazon EC2, Amazon Elastic Block Store (Amazon EBS), and Amazon DynamoDB, you pay for provisioned capacity. For example, with an Amazon EBS disk you pay for a 1 TB disk even if you store only a 1 GB file. This makes S3 costs easier to manage than those of many other services, including Amazon EBS and Amazon EC2. On S3 there is no risk of over-provisioning and no need to manage disk utilization.
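A minimal sketch of answer D's startup path, assuming a hypothetical bucket, key, and CSV column name:

```python
import csv
import io
import boto3

s3 = boto3.client("s3")

def load_postal_codes(bucket="example-static-data", key="postal_codes.csv"):
    """Load the vendor CSV from S3 into an in-memory cache at startup.

    The bucket, key, and postal_code column are hypothetical; a <1 GB file
    fits comfortably in memory.
    """
    body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
    rows = csv.DictReader(io.StringIO(body.decode("utf-8")))
    return {row["postal_code"]: row for row in rows}

cache = load_postal_codes()
```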
Question: 96
A database specialist needs to review and optimize an Amazon DynamoDB table that is experiencing performance issues. A thorough investigation by the database specialist reveals that the partition key is causing hot partitions, so a new partition key is created. The database specialist must effectively apply this new partition key to all existing and new data.
How can this solution be implemented?
A. Use Amazon EMR to export the data from the current DynamoDB table to Amazon S3. Then use Amazon EMR again to import the data from Amazon S3 into a new DynamoDB table with the new partition key.
B. Use AWS DMS to copy the data from the current DynamoDB table to Amazon S3. Then import the DynamoDB table to create a new DynamoDB table with the new partition key.
C. Use the AWS CLI to update the DynamoDB table and modify the partition key.
D. Use the AWS CLI to back up the DynamoDB table. Then use the restore-table-from-backup command and modify the partition key.
Answer: A
Explanation:
https://aws.amazon.com/premiumsupport/knowledge-center/back-up-dynamodb-s3/
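The answer's EMR export/import suits large tables; purely to illustrate the re-keying idea at small scale, here is a plain boto3 scan-and-rewrite sketch (not the EMR path itself). The table names, attribute names, and derived key rule are hypothetical, and the new table must already be created with the new partition key.

```python
import boto3

dynamodb = boto3.resource("dynamodb")
src = dynamodb.Table("orders-v1")  # hypothetical table with the hot key
dst = dynamodb.Table("orders-v2")  # hypothetical table keyed on the new attribute

scan_kwargs = {}
with dst.batch_writer() as batch:
    while True:
        page = src.scan(**scan_kwargs)
        for item in page["Items"]:
            # Derive a higher-cardinality partition key (example rule only).
            item["customer_date"] = f"{item['customer_id']}#{item['order_date']}"
            batch.put_item(Item=item)
        if "LastEvaluatedKey" not in page:
            break
        scan_kwargs["ExclusiveStartKey"] = page["LastEvaluatedKey"]
```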
Question: 97
A company is going through a security audit. The audit team has identified cleartext master user passwords in the AWS CloudFormation templates for Amazon RDS for MySQL DB instances. The audit team has flagged this as a security risk to the database team.
What should a database specialist do to mitigate this risk?
A. Change all the databases to use AWS IAM for authentication and remove all the cleartext passwords in CloudFormation templates.
B. Use an AWS Secrets Manager resource to generate a random password and reference the secret in the CloudFormation template.
C. Remove the passwords from the CloudFormation templates so Amazon RDS prompts for the password when the database is being created.
D. Remove the passwords from the CloudFormation template and store them in a separate file. Replace the passwords by running CloudFormation using a sed command.
Answer: B
Explanation:
https://aws.amazon.com/blogs/infrastructure-and-automation/securing-passwords-in-aws-quick-starts-using-aws-secrets-manager/
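A hedged sketch of the pattern: generate and store the password in Secrets Manager, then have the template resolve it at deploy time. The secret name, username, and excluded characters below are examples.

```python
import json
import boto3

sm = boto3.client("secretsmanager")

# Generate a random password that satisfies RDS character restrictions.
password = sm.get_random_password(
    PasswordLength=20,
    ExcludeCharacters='/@"\\',  # characters disallowed in RDS master passwords
)["RandomPassword"]

# Store it as a secret (the name and username are example values).
sm.create_secret(
    Name="prod/mysql/master",
    SecretString=json.dumps({"username": "admin", "password": password}),
)

# The CloudFormation template then references the secret with a dynamic
# reference instead of a cleartext value, for example:
#   MasterUserPassword: '{{resolve:secretsmanager:prod/mysql/master:SecretString:password}}'
```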
Question: 98
A company's database specialist disabled TLS on an Amazon DocumentDB cluster to perform benchmarking tests. A few days after this change was implemented, a database specialist trainee accidentally deleted multiple tables. The database specialist restored the database from available snapshots. An hour after restoring the cluster, the database specialist is still unable to connect to the new cluster endpoint.
What should the database specialist do to connect to the new, restored Amazon DocumentDB cluster?
A. Change the restored cluster's parameter group to the original cluster's custom parameter group.
B. Change the restored cluster's parameter group to the Amazon DocumentDB default parameter group.
C. Configure the interface VPC endpoint and associate the new Amazon DocumentDB cluster.
D. Run the syncInstances command in AWS DataSync.
Answer: A
Explanation:
You can't modify the parameter settings of a default parameter group. A DB parameter group acts as a container for engine configuration values that are applied to one or more DB instances. If you create a DB instance without specifying a DB parameter group, the instance uses a default DB parameter group, which contains database engine defaults and Amazon RDS system defaults. To change a setting such as tls, you create your own parameter group and choose your own parameter settings. Not all DB engine parameters can be changed in a parameter group that you create.
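A hedged boto3 sketch of answer A: reattach the original custom cluster parameter group to the restored cluster and reboot its instances so the tls setting takes effect. The cluster and parameter group names are hypothetical.

```python
import boto3

docdb = boto3.client("docdb")

CLUSTER_ID = "docdb-restored"        # hypothetical restored cluster
CUSTOM_PG = "docdb-tls-disabled-pg"  # the original custom parameter group (example name)

# A restored cluster comes up with the default parameter group (tls enabled),
# so clients that connect without TLS are rejected; reattach the custom group.
docdb.modify_db_cluster(
    DBClusterIdentifier=CLUSTER_ID,
    DBClusterParameterGroupName=CUSTOM_PG,
)

# Cluster-level parameters such as tls apply after the instances reboot.
cluster = docdb.describe_db_clusters(DBClusterIdentifier=CLUSTER_ID)["DBClusters"][0]
for member in cluster["DBClusterMembers"]:
    docdb.reboot_db_instance(DBInstanceIdentifier=member["DBInstanceIdentifier"])
```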
Question: 99
A company runs a customer relationship management (CRM) system that is hosted on-premises with a MySQL database as the backend. A custom stored procedure is used to send email notifications to another system when data is inserted into a table. The company has noticed that the performance of the CRM system has decreased due to database reporting applications used by various teams. The company requires an AWS solution that would reduce maintenance, improve performance, and accommodate the email notification feature.
Which AWS solution meets these requirements?
A. Use MySQL running on an Amazon EC2 instance with Auto Scaling to accommodate the reporting applications. Configure a stored procedure and an AWS Lambda function that uses Amazon SES to send email notifications to the other system.
B. Use Amazon Aurora MySQL in a multi-master cluster to accommodate the reporting applications. Configure Amazon RDS event subscriptions to publish a message to an Amazon SNS topic and subscribe the other system's email address to the topic.
C. Use MySQL running on an Amazon EC2 instance with a read replica to accommodate the reporting applications. Configure Amazon SES integration to send email notifications to the other system.
D. Use Amazon Aurora MySQL with a read replica for the reporting applications. Configure a stored procedure and an AWS Lambda function to publish a message to an Amazon SNS topic. Subscribe the other system's email address to the topic.
Answer: D
Explanation:
RDS event subscriptions do not cover events such as "data is inserted into a table" (see https://docs.aws.amazon.com/AmazonRDS/latest/AuroraUserGuide/USER_Events.Messages.html). A stored procedure can invoke a Lambda function:
https://docs.aws.amazon.com/AmazonRDS/latest/AuroraUserGuide/AuroraMySQL.Integrating.Lambda.html
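To sketch the notification half of answer D: the stored procedure calls Aurora MySQL's native lambda_sync (or lambda_async) function, and the invoked Lambda function publishes to SNS. The handler below is a minimal example; the TOPIC_ARN environment variable and the payload shape are assumptions.

```python
import json
import os
import boto3

sns = boto3.client("sns")

def handler(event, context):
    """Publish an email notification when Aurora invokes this function.

    The function is called from the stored procedure via lambda_sync();
    TOPIC_ARN and the message shape are example choices.
    """
    sns.publish(
        TopicArn=os.environ["TOPIC_ARN"],
        Subject="CRM row inserted",
        Message=json.dumps(event),
    )
    return {"status": "sent"}
```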
Question: 100
A company needs to migrate Oracle Database Standard Edition running on an Amazon EC2 instance to an Amazon RDS for Oracle DB instance with Multi-AZ. The database supports an ecommerce website that runs continuously. The company can only provide a maintenance window of up to 5 minutes.
Which solution will meet these requirements?
A. Configure Oracle Real Application Clusters (RAC) on the EC2 instance and the RDS DB instance. Update the connection string to point to the RAC cluster. Once the EC2 instance and RDS DB instance are in sync, fail over from Amazon EC2 to Amazon RDS.
B. Export the Oracle database from the EC2 instance using Oracle Data Pump and perform an import into Amazon RDS. Stop the application for the entire process. When the import is complete, change the database connection string and then restart the application.
C. Configure AWS DMS with the EC2 instance as the source and the RDS DB instance as the destination. Stop the application when the replication is in sync, change the database connection string, and then restart the application.
D. Configure AWS DataSync with the EC2 instance as the source and the RDS DB instance as the destination. Stop the application when the replication is in sync, change the database connection string, and then restart the application.
Answer: B
Explanation:
Reference: https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/USER_UpgradeDBInstance.Oracle.html
User: Maria*****
The dbs-c01 questions from Killexams.com are excellent and reflect what is covered in the actual exam. I loved the entire guidance material from Killexams.com. I passed the exam with over 80%.
User: Edward*****
Struggling with the Amazon DBS-C01 exam, I sought help from friends but found their materials confusing. Killexams.com's Questions and Answers package was a treasure trove of clarity, covering every topic comprehensively. Thanks to them, I answered every question correctly and achieved professional success.
User: Aarav*****
High-quality training materials were key to my 98% score on the DBS-C01 exam. By memorizing their questions and answers, I confidently marked the correct answers during the test. Their resources made preparation seamless, and I highly recommend them to all candidates.
User: Taniya*****
Testprep resources energized my DBS-C01 exam preparation, providing critical educational data for high rankings. Their effective materials ensured success, and I am grateful for their outstanding support.
User: Tanny*****
I want to take the opportunity to thank all the crew members of killexams.com for creating such an exquisite platform for us. With the help of their online questions and cases, I easily passed my aws certified database - specialty certification with 81% marks. It was sincerely helpful to understand the type and patterns of questions and explanations provided for answers, which made my concepts crystal clear. Thank you for all of the guides, and keep up the good work, killexams.com.

I am grateful for Killexams.com and their extraordinary efforts to provide top-quality study materials for dbs-c01 exam participants. Their commitment to ensuring candidates' success is admirable, and I was able to pass the dbs-c01 exam with the help of their materials.

I am delighted to report that I scored 84% in the dbs-c01 exam within the stipulated time, thanks to Killexams.com. Working full-time made it challenging to cover the extensive syllabus, but the concise answers provided by Killexams.com helped me prepare well, especially for elaborate topics. I plan to take further exams with the help of Killexams.com in the future to enhance my professional growth.

With just a week remaining until my dbs-c01 exam, I was not confident about passing. I decided to use killexams.com practice tests for my exam preparation, and I was amazed at how enjoyable the subject matter became. Thanks to their materials, I passed with flying colors.

I passed the dbs-c01 exam on my first attempt, all thanks to the Killexams questions and answers. The workbook style of questions helped me apply my understanding to the question and answer format. The exam simulator provided me with a complete understanding of the exam paper, and I am extremely grateful for this tool.

As a busy person, I did not have time to prepare for the dbs-c01 exam. I was worried that I would fail the exam, but Killexams.com turned out to be a lifesaver. I was able to prepare for the exam easily using my computer and the reliable and high-quality material provided by Killexams.com.

Before discovering Killexams.com, I had doubts about the capabilities of the internet. However, after creating an account, I saw a whole new world of possibilities. Their test questions and answers, along with the structured approach, helped me achieve success in my dbs-c01 exam. Although I missed more than one question, I still passed the exam with a score of 43/50. I got the questions right but did not keep in mind the answers given in the study material. My advice is to thoroughly study all the material from killexams.com Questions and Answers - this is everything I needed to pass. Killexams is 100% trustworthy, and a big portion of the questions were similar to what I got on the aws certified database - specialty exam.

I am proud to have passed my dbs-c01 exam, achieving a score of 89%, thanks to my studies with killexams.com. This was not just a simple pass but a great one, and I would proudly recommend this guide to anyone.

Passing the dbs-c01 exam was a challenging task, but killexams.com helped me gain composure by using their dbs-c01 practice tests to prepare myself for the exam. The dbs-c01 exam simulator was a useful tool that enabled me to pass the dbs-c01 exam and get promoted in my organization.

Thanks to Killexams.com's extraordinary practice test materials, I passed my dbs-c01 exam within two weeks with a score of 96%. I am now very confident that I will do better in my remaining three exams and will honestly use the practice test and recommend it to my friends. Thank you very much for your great assistance.

The training provided by killexams.com for the dbs-c01 exam was the best I have ever come across. I passed the dbs-c01 exam without any hassle or stress, thanks to killexams.com dbs-c01 Questions. The questions were valid, and I heard from my friend that their refund guarantee works too. They do provide you with the money back in case you fail, but the best part is that they make it very easy to pass.

I highly recommend using Killexams for anyone preparing for the dbs-c01 exam. Their questions and answers are precise and to the point, which saved me a lot of time and effort in my studies. Thanks to them, I can now consider pursuing other Amazon certifications.

Thanks to killexams.com, I passed the dbs-c01 exam in just weeks with 96% marks. I am very confident now that I can do better in my remaining three exams and certainly use your practice material and recommend it to my friends. The online practice engine product is extremely good, and I highly recommend it to all students.

I work for Clever Corp and was nervous about taking the dbs-c01 exam due to its difficult case memorization and other challenges. However, I applied the questions and answers guide from killexams.com, and my doubts were cleared with the explanations provided for the answers. Additionally, I received the solved cases in my email, which helped me prepare more effectively. I scored 73.75% on the exam and give the entire credit to killexams.com. I extend my congratulations and look forward to passing more tests with your help.
Features of iPass4sure DBS-C01 Exam
- Files: PDF / Test Engine
- Premium Access
- Online Test Engine
- Instant download Access
- Comprehensive Q&A
- Success Rate
- Real Questions
- Updated Regularly
- Portable Files
- Unlimited Download
- 100% Secured
- Confidentiality: 100%
- Success Guarantee: 100%
- Any Hidden Cost: $0.00
- Auto Recharge: No
- Updates Intimation: by Email
- Technical Support: Free
- PDF Compatibility: Windows, Android, iOS, Linux
- Test Engine Compatibility: Mac / Windows / Android / iOS / Linux
Premium PDF with 100 Q&A
Get Full Version