DVA-C02 : AWS Certified Developer - Associate Exam
Amazon DVA-C02 Questions & Answers
Full Version: 352 Q&A
Latest DVA-C02 Practice Tests with Actual Questions
Get the complete pool of questions with the Premium PDF and Test Engine
Exam Code : DVA-C02
Exam Name : AWS Certified Developer - Associate
Vendor Name : Amazon
https://killexams.com/pass4sure/exam-detail/DVA-C02
Question: 334
A company is migrating legacy internal applications to AWS. Leadership wants to rewrite the internal employee directory to use native AWS services. A developer needs to create a solution for storing employee contact details and high-resolution photos for use with the new application.
Which solution will enable the search and retrieval of each employee's individual details and high-resolution photos using AWS APIs?
A. Encode each employee's contact information and photos using Base64. Store the information in an Amazon DynamoDB table using a sort key.
B. Store each employee's contact information in an Amazon DynamoDB table along with the object keys for the photos stored in Amazon S3.
C. Use Amazon Cognito user pools to implement the employee directory in a fully managed software-as-a-service (SaaS) method.
D. Store employee contact information in an Amazon RDS DB instance with the photos stored in Amazon Elastic File System (Amazon EFS).
Answer: B
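For reference, a minimal boto3 sketch of the pattern in option B; the table name EmployeeDirectory, the bucket employee-photos-bucket, and the item attributes are hypothetical:

import boto3

dynamodb = boto3.resource("dynamodb")
s3 = boto3.client("s3")

table = dynamodb.Table("EmployeeDirectory")  # hypothetical table name

# Store contact details plus the S3 object key of the photo.
table.put_item(Item={
    "employee_id": "emp-1001",
    "name": "Jane Doe",
    "email": "jane.doe@example.com",
    "photo_key": "photos/emp-1001.jpg",
})

# Retrieve the record, then fetch the photo from S3 using the stored key.
item = table.get_item(Key={"employee_id": "emp-1001"})["Item"]
photo = s3.get_object(Bucket="employee-photos-bucket", Key=item["photo_key"])["Body"].read()

Keeping only the object key in DynamoDB keeps items small while Amazon S3 serves the high-resolution photos.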
Question: 335
A developer is migrating some features from a legacy monolithic application to use AWS Lambda functions instead. The application currently stores data in an Amazon Aurora DB cluster that runs in private subnets in a VPC. The AWS account has one VPC deployed.
The Lambda functions and the DB cluster are deployed in the same AWS Region in the same AWS account. The developer needs to ensure that the Lambda functions can securely access the DB cluster without crossing the public internet.
Which solution will meet these requirements?
A. Configure the DB cluster's public access setting to Yes.
B. Configure an Amazon RDS database proxy for the Lambda functions.
C. Configure a NAT gateway and a security group for the Lambda functions.
D. Configure the VPC, subnets, and a security group for the Lambda functions.
Answer: D
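For reference, a hedged sketch of option D using boto3; the function name, subnet IDs, and security group ID are placeholders:

import boto3

lambda_client = boto3.client("lambda")

lambda_client.update_function_configuration(
    FunctionName="orders-processor",  # hypothetical function name
    VpcConfig={
        "SubnetIds": ["subnet-0abc1234", "subnet-0def5678"],  # private subnets that can reach the Aurora cluster
        "SecurityGroupIds": ["sg-0123456789abcdef0"],  # allowed inbound by the DB cluster's security group
    },
)

With the function attached to the private subnets, traffic to the Aurora cluster stays inside the VPC and never crosses the public internet.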
Question: 336
A company wants to share information with a third party. The third party has an HTTP API endpoint that the company can use to share the information. The company has the required API key to access the HTTP API.
The company needs a way to manage the API key by using code. The integration of the API key with the application code cannot affect application performance.
Which solution will meet these requirements MOST securely?
A. Store the API credentials in AWS Secrets Manager. Retrieve the API credentials at runtime by using the AWS SDK. Use the credentials to make the API call.
B. Store the API credentials in a local code variable. Push the code to a secure Git repository. Use the local code variable at runtime to make the API call.
C. Store the API credentials as an object in a private Amazon S3 bucket. Restrict access to the S3 object by using IAM policies. Retrieve the API credentials at runtime by using the AWS SDK. Use the credentials to make the API call.
D. Store the API credentials in an Amazon DynamoDB table. Restrict access to the table by using resource-based policies. Retrieve the API credentials at runtime by using the AWS SDK. Use the credentials to make the API call.
Answer: A
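As an illustration of option A, a minimal sketch that retrieves the key from AWS Secrets Manager at runtime; the secret name third-party/api-key is hypothetical. Caching the value outside the handler avoids a lookup on every request, which addresses the performance concern:

import boto3

secrets = boto3.client("secretsmanager")

_api_key = None  # cached after the first retrieval (Lambda-style module scope)

def get_api_key():
    global _api_key
    if _api_key is None:
        response = secrets.get_secret_value(SecretId="third-party/api-key")  # hypothetical secret name
        _api_key = response["SecretString"]
    return _api_key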
Question: 337
An application uses Lambda functions to extract metadata from files uploaded to an S3 bucket; the metadata is stored in Amazon DynamoDB. The application starts behaving unexpectedly, and the developer wants to examine the logs of the Lambda function code for errors.
Based on this system configuration, where would the developer find the logs?
A. Amazon S3
B. AWS CloudTrail
C. Amazon CloudWatch
D. Amazon DynamoDB
Answer: C
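Lambda sends everything the function writes to stdout and stderr to Amazon CloudWatch Logs under /aws/lambda/<function-name>. As an illustration, a small boto3 sketch that pulls ERROR lines; the function name extract-metadata is hypothetical:

import boto3

logs = boto3.client("logs")

events = logs.filter_log_events(
    logGroupName="/aws/lambda/extract-metadata",  # hypothetical function name
    filterPattern="ERROR",
)
for event in events["events"]:
    print(event["timestamp"], event["message"])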
Question: 338
A developer is creating an application that includes an Amazon API Gateway REST API in the us-east-2 Region. The developer wants to use Amazon CloudFront and a custom domain name for the API. The developer has acquired an SSL/TLS certificate for the domain from a third-party provider.
How should the developer configure the custom domain for the application?
A. Import the SSL/TLS certificate into AWS Certificate Manager (ACM) in the same Region as the API. Create a DNS A record for the custom domain.
B. Import the SSL/TLS certificate into CloudFront. Create a DNS CNAME record for the custom domain.
C. Import the SSL/TLS certificate into AWS Certificate Manager (ACM) in the same Region as the API. Create a DNS CNAME record for the custom domain.
D. Import the SSL/TLS certificate into AWS Certificate Manager (ACM) in the us-east-1 Region. Create a DNS CNAME record for the custom domain.
Answer: D
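CloudFront accepts only certificates held in ACM in the us-east-1 Region (or in IAM), which is why option D applies even though the API runs in us-east-2. A hedged sketch of importing the third-party certificate; the file paths are placeholders:

import boto3

acm = boto3.client("acm", region_name="us-east-1")  # CloudFront reads certificates from us-east-1

with open("cert.pem", "rb") as cert, open("key.pem", "rb") as key, open("chain.pem", "rb") as chain:
    response = acm.import_certificate(
        Certificate=cert.read(),
        PrivateKey=key.read(),
        CertificateChain=chain.read(),
    )

print(response["CertificateArn"])  # reference this ARN in the CloudFront distribution

The DNS CNAME record for the custom domain then points at the CloudFront distribution's domain name.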
Question: 339
An application that is hosted on an Amazon EC2 instance needs access to files that are stored in an Amazon S3 bucket. The application lists the objects that are stored in the S3 bucket and displays a table to the user. During testing, a developer discovers that the application does not show any objects in the list.
What is the MOST secure way to resolve this issue?
A. Update the IAM instance profile that is attached to the EC2 instance to include the s3:* permission for the S3 bucket.
B. Update the IAM instance profile that is attached to the EC2 instance to include the s3:ListBucket permission for the S3 bucket.
C. Update the developer's user permissions to include the s3:ListBucket permission for the S3 bucket.
D. Update the S3 bucket policy by including the s3:ListBucket permission and by setting the Principal element to specify the account number of the EC2 instance.
Answer: B
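A minimal sketch of option B: adding an inline policy with s3:ListBucket to the role behind the instance profile; the role name app-ec2-role and the bucket ARN are hypothetical:

import json
import boto3

iam = boto3.client("iam")

policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": "s3:ListBucket",
        "Resource": "arn:aws:s3:::example-app-bucket",  # hypothetical bucket
    }],
}

iam.put_role_policy(
    RoleName="app-ec2-role",  # role referenced by the instance profile
    PolicyName="AllowListAppBucket",
    PolicyDocument=json.dumps(policy),
)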
Question: 340
A developer is designing a serverless application with two AWS Lambda functions to process photos. One Lambda function stores objects in an Amazon S3 bucket and stores the associated metadata in an Amazon DynamoDB table. The other Lambda function fetches the objects from the S3 bucket by using the metadata from the DynamoDB table.
Both Lambda functions use the same Python library to perform complex computations and are approaching the quota for the maximum size of zipped deployment packages.
What should the developer do to reduce the size of the Lambda deployment packages with the LEAST operational overhead?
A. Package each Python library in its own .zip file archive. Deploy each Lambda function with its own copy of the library.
B. Create a Lambda layer with the required Python library. Use the Lambda layer in both Lambda functions.
C. Combine the two Lambda functions into one Lambda function. Deploy the Lambda function as a single .zip file archive.
D. Download the Python library to an S3 bucket. Program the Lambda functions to reference the object URLs.
Answer: B
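A hedged boto3 sketch of option B: publishing the shared library as a layer and attaching it to both functions; the layer name, zip path, runtime, and function names are placeholders:

import boto3

lambda_client = boto3.client("lambda")

with open("shared-lib-layer.zip", "rb") as f:  # zip containing python/<library> per the layer layout
    layer = lambda_client.publish_layer_version(
        LayerName="photo-processing-lib",
        Content={"ZipFile": f.read()},
        CompatibleRuntimes=["python3.12"],
    )

# Attach the layer to both functions so their deployment packages no longer bundle the library.
for function_name in ["store-photo", "fetch-photo"]:
    lambda_client.update_function_configuration(
        FunctionName=function_name,
        Layers=[layer["LayerVersionArn"]],
    )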
Question: 241
A developer is creating an AWS Lambda function that needs credentials to connect to an Amazon RDS for MySQL database. An Amazon S3 bucket currently stores the credentials. The developer needs to improve the existing solution by implementing credential rotation and secure storage. The developer also needs to provide integration with the Lambda function.
Which solution should the developer use to store and retrieve the credentials with the LEAST management overhead?
A. Store the credentials in AWS Systems Manager Parameter Store. Select the database that the parameter will access. Use the default AWS Key Management Service (AWS KMS) key to encrypt the parameter. Enable automatic rotation for the parameter. Use the parameter from Parameter Store on the Lambda function to connect to the database.
B. Encrypt the credentials with the default AWS Key Management Service (AWS KMS) key. Store the credentials as environment variables for the Lambda function. Create a second Lambda function to generate new credentials and to rotate the credentials by updating the environment variables of the first Lambda function. Invoke the second Lambda function by using an Amazon EventBridge rule that runs on a schedule. Update the database to use the new credentials. On the first Lambda function, retrieve the credentials from the environment variables. Decrypt the credentials by using AWS KMS. Connect to the database.
C. Store the credentials in AWS Secrets Manager. Set the secret type to Credentials for Amazon RDS database. Select the database that the secret will access. Use the default AWS Key Management Service (AWS KMS) key to encrypt the secret. Enable automatic rotation for the secret. Use the secret from Secrets Manager on the Lambda function to connect to the database.
D. Encrypt the credentials by using AWS Key Management Service (AWS KMS). Store the credentials in an Amazon DynamoDB table. Create a second Lambda function to rotate the credentials. Invoke the second Lambda function by using an Amazon EventBridge rule that runs on a schedule. Update the DynamoDB table. Update the database to use the generated credentials. Retrieve the credentials from DynamoDB with the first Lambda function. Connect to the database.
Answer: C
Question: 341
A developer wants to insert a record into an Amazon DynamoDB table as soon as a new file is added to an Amazon S3 bucket.
Which set of steps would be necessary to achieve this?
A. Create an event with Amazon EventBridge that will monitor the S3 bucket and then insert the records into DynamoDB.
B. Configure an S3 event to invoke an AWS Lambda function that inserts records into DynamoDB.
C. Create an AWS Lambda function that will poll the S3 bucket and then insert the records into DynamoDB.
D. Create a cron job that will run at a scheduled time and insert the records into DynamoDB.
Answer: B
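A minimal handler sketch for option B, invoked by the S3 event notification; the table name FileMetadata is hypothetical:

import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("FileMetadata")  # hypothetical table name

def handler(event, context):
    # Each record describes an object that was just created in the bucket.
    for record in event["Records"]:
        table.put_item(Item={
            "object_key": record["s3"]["object"]["key"],
            "bucket": record["s3"]["bucket"]["name"],
            "size": record["s3"]["object"].get("size", 0),
        })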
Question: 342
A developer is deploying an AWS Lambda function. The developer wants the ability to return to older versions of the function quickly and seamlessly.
How can the developer achieve this goal with the LEAST operational overhead?
A. Use AWS OpsWorks to perform blue/green deployments.
B. Use a function alias with different versions.
C. Maintain deployment packages for older versions in Amazon S3.
D. Use AWS CodePipeline for deployments and rollbacks.
Answer: B
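A small sketch of option B, assuming an alias named live already exists for a hypothetical function orders-api; callers invoke the alias, so rolling back is a single alias update:

import boto3

lambda_client = boto3.client("lambda")

# Publish the current code as an immutable version and point the alias at it.
version = lambda_client.publish_version(FunctionName="orders-api")["Version"]
lambda_client.update_alias(FunctionName="orders-api", Name="live", FunctionVersion=version)

# Roll back by repointing the alias at a previously published version.
lambda_client.update_alias(FunctionName="orders-api", Name="live", FunctionVersion="3")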
Question: 243
A development team maintains a web application by using a single AWS CloudFormation template. The template defines web servers and an Amazon RDS database. The team uses the CloudFormation template to deploy the CloudFormation stack to different environments.
During a recent application deployment, a developer caused the primary development database to be dropped and recreated. The result of this incident was a loss of data. The team needs to avoid accidental database deletion in the future.
Which solutions will meet these requirements? (Choose two.)
A. Add a CloudFormation DeletionPolicy attribute with the Retain value to the database resource.
B. Update the CloudFormation stack policy to prevent updates to the database.
C. Modify the database to use a Multi-AZ deployment.
D. Create a CloudFormation stack set for the web application and database deployments.
E. Add a CloudFormation DeletionPolicy attribute with the Retain value to the stack.
Answer: A, B
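For option B, a hedged sketch that applies a stack policy denying updates to the database resource; the stack name and the logical ID MyDatabase are hypothetical. Option A lives in the template itself, where the RDS resource would also carry DeletionPolicy: Retain so the underlying database survives even if the resource is removed from the stack:

import json
import boto3

cfn = boto3.client("cloudformation")

stack_policy = {
    "Statement": [
        {"Effect": "Allow", "Action": "Update:*", "Principal": "*", "Resource": "*"},
        {"Effect": "Deny", "Action": "Update:*", "Principal": "*",
         "Resource": "LogicalResourceId/MyDatabase"},  # hypothetical logical ID of the RDS resource
    ]
}

cfn.set_stack_policy(StackName="web-app-dev", StackPolicyBody=json.dumps(stack_policy))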
Question: 344
A company hosts a client-side web application for one of its subsidiaries on Amazon S3. The web application can be accessed through Amazon CloudFront from https://www.example.com. After a successful rollout, the company wants to host three more client-side web applications for its remaining subsidiaries on three separate S3 buckets.
To achieve this goal, a developer moves all the common JavaScript files and web fonts to a central S3 bucket that serves the web applications. However, during testing, the developer notices that the browser blocks the JavaScript files and web fonts.
What should the developer do to prevent the browser from blocking the JavaScript files and web fonts?
A. Create four access points that allow access to the central S3 bucket. Assign an access point to each web application bucket.
B. Create a bucket policy that allows access to the central S3 bucket. Attach the bucket policy to the central S3 bucket.
C. Create a cross-origin resource sharing (CORS) configuration that allows access to the central S3 bucket. Add the CORS configuration to the central S3 bucket.
D. Create a Content-MD5 header that provides a message integrity check for the central S3 bucket. Insert the Content-MD5 header for each web application request.
Answer: C
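A minimal sketch of option C: adding a CORS configuration to the central bucket; the bucket name and subsidiary origins are placeholders:

import boto3

s3 = boto3.client("s3")

s3.put_bucket_cors(
    Bucket="central-shared-assets",  # hypothetical central bucket
    CORSConfiguration={
        "CORSRules": [{
            "AllowedOrigins": [
                "https://www.example.com",
                "https://sub1.example.com",
                "https://sub2.example.com",
                "https://sub3.example.com",
            ],
            "AllowedMethods": ["GET"],
            "AllowedHeaders": ["*"],
            "MaxAgeSeconds": 3600,
        }]
    },
)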
Question: 345
A company wants to deploy and maintain static websites on AWS. Each website's source code is hosted in one of several version control systems, including AWS CodeCommit, Bitbucket, and GitHub.
The company wants to implement phased releases by using development, staging, user acceptance testing, and production environments in the AWS Cloud. Deployments to each environment must be started by code merges on the relevant Git branch. The company wants to use HTTPS for all data exchange. The company needs a solution that does not require servers to run continuously.
Which solution will meet these requirements with the LEAST operational overhead?
A. Host each website by using AWS Amplify with a serverless backend. Connect the repository branches that correspond to each of the desired environments. Start deployments by merging code changes to a desired branch.
B. Host each website in AWS Elastic Beanstalk with multiple environments. Use the EB CLI to link each repository branch. Integrate AWS CodePipeline to automate deployments from version control code merges.
C. Host each website in different Amazon S3 buckets for each environment. Configure AWS CodePipeline to pull source code from version control. Add an AWS CodeBuild stage to copy source code to Amazon S3.
D. Host each website on its own Amazon EC2 instance. Write a custom deployment script to bundle each website's static assets. Copy the assets to Amazon EC2. Set up a workflow to run the script when code is merged.
Answer: A
Question: 346
For a deployment using AWS CodeDeploy, what is the run order of the hooks for in-place deployments?
A. BeforeInstall -> ApplicationStop -> ApplicationStart -> AfterInstall
B. ApplicationStop -> BeforeInstall -> AfterInstall -> ApplicationStart
C. BeforeInstall -> ApplicationStop -> ValidateService -> ApplicationStart
D. ApplicationStop -> BeforeInstall -> ValidateService -> ApplicationStart
Answer: B
Question: 347
A company is implementing an application on Amazon EC2 instances. The application needs to process incoming transactions. When the application detects a transaction that is not valid, the application must send a chat message to the company's support team. To send the message, the application needs to retrieve the access token to authenticate by using the chat API.
A developer needs to implement a solution to store the access token. The access token must be encrypted at rest and in transit. The access token must also be accessible from other AWS accounts.
Which solution will meet these requirements with the LEAST management overhead?
A. Use an AWS Systems Manager Parameter Store SecureString parameter that uses an AWS Key Management Service (AWS KMS) AWS managed key to store the access token. Add a resource-based policy to the parameter to allow access from other accounts. Update the IAM role of the EC2 instances with permissions to access Parameter Store. Retrieve the token from Parameter Store with the decrypt flag enabled. Use the decrypted access token to send the message to the chat.
B. Encrypt the access token by using an AWS Key Management Service (AWS KMS) customer managed key. Store the access token in an Amazon DynamoDB table. Update the IAM role of the EC2 instances with permissions to access DynamoDB and AWS KMS. Retrieve the token from DynamoDB. Decrypt the token by using AWS KMS on the EC2 instances. Use the decrypted access token to send the message to the chat.
C. Use AWS Secrets Manager with an AWS Key Management Service (AWS KMS) customer managed key to store the access token. Add a resource-based policy to the secret to allow access from other accounts. Update the IAM role of the EC2 instances with permissions to access Secrets Manager. Retrieve the token from Secrets Manager. Use the decrypted access token to send the message to the chat.
D. Encrypt the access token by using an AWS Key Management Service (AWS KMS) AWS managed key. Store the access token in an Amazon S3 bucket. Add a bucket policy to the S3 bucket to allow access from other accounts. Update the IAM role of the EC2 instances with permissions to access Amazon S3 and AWS KMS. Retrieve the token from the S3 bucket. Decrypt the token by using AWS KMS on the EC2 instances. Use the decrypted access token to send the message to the chat.
Answer: C
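For option C, a hedged sketch of attaching a resource-based policy to the secret so another account can read it; the secret name and account ID are placeholders, and it assumes the customer managed KMS key's policy also grants that account kms:Decrypt:

import json
import boto3

secrets = boto3.client("secretsmanager")

policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"AWS": "arn:aws:iam::111122223333:root"},  # hypothetical external account
        "Action": "secretsmanager:GetSecretValue",
        "Resource": "*",
    }],
}

secrets.put_resource_policy(
    SecretId="chat/access-token",  # hypothetical secret name
    ResourcePolicy=json.dumps(policy),
)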
Question: 348
A company is building a scalable data management solution by using AWS services to improve the speed and agility of development. The solution will ingest large volumes of data from various sources and will process this data through multiple business rules and transformations.
The solution requires business rules to run in sequence and to handle reprocessing of data if errors occur when the business rules run. The company needs the solution to be scalable and to require the least possible maintenance.
Which AWS service should the company use to manage and automate the orchestration of the data flows to meet these requirements?
A. AWS Batch
B. AWS Step Functions
C. AWS Glue
D. AWS Lambda
Answer: B
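As an illustration of option B, a minimal Step Functions sketch with two business rules running in sequence and a retry on errors; the Lambda ARNs, role ARN, and state machine name are hypothetical:

import json
import boto3

sfn = boto3.client("stepfunctions")

definition = {
    "StartAt": "RuleOne",
    "States": {
        "RuleOne": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:us-east-1:111122223333:function:rule-one",
            "Retry": [{"ErrorEquals": ["States.ALL"], "MaxAttempts": 3}],
            "Next": "RuleTwo",
        },
        "RuleTwo": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:us-east-1:111122223333:function:rule-two",
            "Retry": [{"ErrorEquals": ["States.ALL"], "MaxAttempts": 3}],
            "End": True,
        },
    },
}

sfn.create_state_machine(
    name="data-rules-orchestration",
    definition=json.dumps(definition),
    roleArn="arn:aws:iam::111122223333:role/step-functions-exec",
)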
Question: 349
A company is building a serverless application on AWS. The application uses an AWS Lambda function to process customer orders 24 hours a day, 7 days a week. The Lambda function calls an external vendor's HTTP API to process payments.
During load tests, a developer discovers that the external vendor payment processing API occasionally times out and returns errors. The company expects that some payment processing API calls will return errors.
The company wants the support team to receive notifications in near real time only when the payment processing external API error rate exceeds 5% of the total number of transactions in an hour. Developers need to use an existing Amazon Simple Notification Service (Amazon SNS) topic that is configured to notify the support team.
Which solution will meet these requirements?
A. Write the results of payment processing API calls to Amazon CloudWatch. Use Amazon CloudWatch Logs Insights to query the CloudWatch logs. Schedule the Lambda function to check the CloudWatch logs and notify the existing SNS topic.
B. Publish custom metrics to Amazon CloudWatch that record the failures of the external payment processing API calls. Configure a CloudWatch alarm to notify the existing SNS topic when the error rate exceeds the specified rate.
C. Publish the results of the external payment processing API calls to a new Amazon SNS topic. Subscribe the support team members to the new SNS topic.
D. Write the results of the external payment processing API calls to Amazon S3. Schedule an Amazon Athena query to run at regular intervals. Configure Athena to send notifications to the existing SNS topic when the error rate exceeds the specified rate.
Answer: B
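A hedged sketch of option B: the function publishes custom metrics for total calls and failures, and a metric-math alarm notifies the existing SNS topic when the hourly error rate exceeds 5%; the namespace, metric names, and ARNs are placeholders:

import boto3

cloudwatch = boto3.client("cloudwatch")

# Inside the Lambda function: count every vendor API call and every failure.
def record_call(failed):
    data = [{"MetricName": "PaymentApiCalls", "Value": 1, "Unit": "Count"}]
    if failed:
        data.append({"MetricName": "PaymentApiErrors", "Value": 1, "Unit": "Count"})
    cloudwatch.put_metric_data(Namespace="PaymentsApp", MetricData=data)

# One-time setup: alarm on the hourly error rate, notifying the existing SNS topic.
cloudwatch.put_metric_alarm(
    AlarmName="payment-api-error-rate",
    Metrics=[
        {"Id": "errors",
         "MetricStat": {"Metric": {"Namespace": "PaymentsApp", "MetricName": "PaymentApiErrors"},
                        "Period": 3600, "Stat": "Sum"},
         "ReturnData": False},
        {"Id": "total",
         "MetricStat": {"Metric": {"Namespace": "PaymentsApp", "MetricName": "PaymentApiCalls"},
                        "Period": 3600, "Stat": "Sum"},
         "ReturnData": False},
        {"Id": "rate", "Expression": "100 * errors / total", "ReturnData": True},
    ],
    ComparisonOperator="GreaterThanThreshold",
    Threshold=5,
    EvaluationPeriods=1,
    AlarmActions=["arn:aws:sns:us-east-1:111122223333:support-notifications"],  # existing SNS topic
)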
Question: 350
A developer is creating an application that will be deployed on IoT devices. The application will send data to a RESTful API that is deployed as an AWS Lambda function. The application will assign each API request a unique identifier. The volume of API requests from the application can randomly increase at any given time of day.
During periods of request throttling, the application might need to retry requests. The API must be able to handle duplicate requests without inconsistencies or data loss.
Which solution will meet these requirements?
A. Create an Amazon RDS for MySQL DB instance. Store the unique identifier for each request in a database table. Modify the Lambda function to check the table for the identifier before processing the request.
B. Create an Amazon DynamoDB table. Store the unique identifier for each request in the table. Modify the Lambda function to check the table for the identifier before processing the request.
C. Create an Amazon DynamoDB table. Store the unique identifier for each request in the table. Modify the Lambda function to return a client error response when the function receives a duplicate request.
D. Create an Amazon ElastiCache for Memcached instance. Store the unique identifier for each request in the cache. Modify the Lambda function to check the cache for the identifier before processing the request.
Answer: B
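A minimal sketch of option B using a conditional write, so a retried (duplicate) request is detected atomically and skipped; the table name ProcessedRequests and the request_id field are hypothetical:

import boto3
from botocore.exceptions import ClientError

table = boto3.resource("dynamodb").Table("ProcessedRequests")  # hypothetical table name

def handler(event, context):
    request_id = event["request_id"]
    try:
        # Succeeds only the first time this identifier is seen.
        table.put_item(
            Item={"request_id": request_id},
            ConditionExpression="attribute_not_exists(request_id)",
        )
    except ClientError as err:
        if err.response["Error"]["Code"] == "ConditionalCheckFailedException":
            return {"status": "duplicate ignored"}
        raise
    # ... process the request exactly once ...
    return {"status": "processed"}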
Question: 351
A company is running a custom application on a set of on-premises Linux servers that are accessed using Amazon API Gateway. AWS X-Ray tracing has been enabled on the API test stage.
How can a developer enable X-Ray tracing on the on-premises servers with the LEAST amount of configuration?
A. Install and run the X-Ray SDK on the on-premises servers to capture and relay the data to the X-Ray service.
B. Install and run the X-Ray daemon on the on-premises servers to capture and relay the data to the X-Ray service.
C. Capture incoming requests on-premises and configure an AWS Lambda function to pull, process, and relay relevant data to X-Ray using the PutTraceSegments API call.
D. Capture incoming requests on-premises and configure an AWS Lambda function to pull, process, and relay relevant data to X-Ray using the PutTelemetryRecords API call.
Answer: B
Question: 352
A developer has created an AWS Lambda function that is written in Python. The Lambda function reads data from objects in Amazon S3 and writes data to an Amazon DynamoDB table. The function is successfully invoked from an S3 event notification when an object is created. However, the function fails when it attempts to write to the DynamoDB table.
What is the MOST likely cause of this issue?
A. The Lambda function's concurrency limit has been exceeded.
B. The DynamoDB table requires a global secondary index (GSI) to support writes.
C. The Lambda function does not have IAM permissions to write to DynamoDB.
D. The DynamoDB table is not running in the same Availability Zone as the Lambda function.
Answer: C
User: Nataliya***** Prior to enrolling in the Killexams.com software, I had attempted the DVA-C02 practice questions without success. After joining the program, I realized that the problem was with the practice books I was using. The Killexams.com designed books are more effective and cover the entire syllabus for DVA-C02 exam preparation.
User: Massey***** When the DVA-C02 exam approached, going through Killexams.com question and answer resources became an addiction for me. With the exam only six days away, it became critical for me to study the material. However, I needed a few reference guides to gain better help with the subject. Thanks to Killexams.com questions and answers, I was able to easily understand the subject, which would have been otherwise difficult. I managed to score 980, which was the highest mark in my class, all thanks to Killexams.com products.
User: Nadya***** Thanks to Killexams.com, I am now DVA-C02 certified. Their website offers an extraordinary series of practice tests and exam preparation resources. I used them extensively to prepare for my DVA-C02 certification exam, and their material was just as appropriate. The questions are authentic, and the exam simulator works great. I had no issues during the exam. I ordered the material, practiced for a week, and passed the DVA-C02 exam with flying colors. Killexams.com offers the perfect exam preparation that everyone should endorse.
User: Nadine***** I have been using killexams.com for all my tests, and last week, I passed the DVA-C02 exam with an excellent mark using their Questions and Answers test resources. I had some doubts about some subjects, but the material cleared all my doubts. Thanks for providing me with the robust and reliable practice test. It is a great product.
User: Benicio***** I owe my success in passing the DVA-C02 exam to Killexams.com exam prep materials. I had failed the exam on my first attempt, but their questions were so similar to the real ones that I passed with ease the second time around. Their format is easy to understand, and the information you learn sticks with you even after the exam.
Features of iPass4sure DVA-C02 Exam
- Files: PDF / Test Engine
- Premium Access
- Online Test Engine
- Instant Download Access
- Comprehensive Q&A
- Success Rate
- Real Questions
- Updated Regularly
- Portable Files
- Unlimited Download
- 100% Secured
- Confidentiality: 100%
- Success Guarantee: 100%
- Any Hidden Cost: $0.00
- Auto Recharge: No
- Update Notifications: by Email
- Technical Support: Free
- PDF Compatibility: Windows, Android, iOS, Linux
- Test Engine Compatibility: Mac / Windows / Android / iOS / Linux
Premium PDF with 352 Q&A