DVA-C02 : AWS Certified Developer - Associate Exam
Amazon DVA-C02 Questions & Answers
Full Version: 352 Q&A
https://killexams.com/pass4sure/exam-detail/DVA-C02
Question: 334
A company is migrating legacy internal applications to AWS. Leadership wants to rewrite the internal employee
directory to use native AWS services. A developer needs to create a solution for storing employee contact details and
high-resolution photos for use with the new application.
Which solution will enable the search and retrieval of each employee's individual details and high-resolution photos
using AWS APIs?
A. Encode each employee's contact information and photos using Base64. Store the information in an Amazon
DynamoDB table using a sort key.
B. Store each employee's contact information in an Amazon DynamoDB table along with the object keys for the
photos stored in Amazon S3.
C. Use Amazon Cognito user pools to implement the employee directory in a fully managed software-as-a-service
(SaaS) method.
D. Store employee contact information in an Amazon RDS DB instance with the photos stored in Amazon Elastic File
System (Amazon EFS).
Answer: B
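The pattern in option B (contact details in DynamoDB, large binaries in S3, linked by the object key) can be sketched as follows. The table name, attribute names, and sample values are illustrative assumptions, not part of the question.

```python
# Sketch of option B: contact details live in DynamoDB, while each
# high-resolution photo is an S3 object referenced by its key.
# Table/bucket names and attribute names are illustrative assumptions.

def build_employee_item(employee_id, name, email, photo_bucket, photo_key):
    """Return the DynamoDB item that pairs contact details with an S3 object key."""
    return {
        "employee_id": employee_id,    # partition key
        "name": name,
        "email": email,
        "photo_bucket": photo_bucket,  # where the binary actually lives
        "photo_key": photo_key,        # e.g. "photos/e-1001.jpg"
    }

item = build_employee_item("e-1001", "Ana Ruiz", "ana@example.com",
                           "employee-photos", "photos/e-1001.jpg")
# In production the item would be written with boto3, roughly:
#   boto3.resource("dynamodb").Table("employees").put_item(Item=item)
# and the photo fetched separately via s3.get_object using photo_key.
```

Keeping only the key in DynamoDB avoids the 400 KB item-size limit that would make Base64-encoded photos (option A) impractical.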
Question: 335
A developer is migrating some features from a legacy monolithic application to use AWS Lambda functions instead.
The application currently stores data in an Amazon Aurora DB cluster that runs in private subnets in a VPC. The AWS
account has one VPC deployed.
The Lambda functions and the DB cluster are deployed in the same AWS Region in the same AWS account.
The developer needs to ensure that the Lambda functions can securely access the DB cluster without crossing the
public internet.
Which solution will meet these requirements?
A. Configure the DB cluster's public access setting to Yes.
B. Configure an Amazon RDS database proxy for the Lambda functions.
C. Configure a NAT gateway and a security group for the Lambda functions.
D. Configure the VPC, subnets, and a security group for the Lambda functions.
Answer: D
Question: 336
A company wants to share information with a third party. The third party has an HTTP API endpoint that the company
can use to share the information. The company has the required API key to access the HTTP API.
The company needs a way to manage the API key by using code. The integration of the API key with the application
code cannot affect application performance.
Which solution will meet these requirements MOST securely?
A. Store the API credentials in AWS Secrets Manager. Retrieve the API credentials at runtime by using the AWS SDK. Use the credentials to make the API call.
B. Store the API credentials in a local code variable. Push the code to a secure Git repository. Use the local code variable at runtime to make the API call.
C. Store the API credentials as an object in a private Amazon S3 bucket. Restrict access to the S3 object by using IAM policies. Retrieve the API credentials at runtime by using the AWS SDK. Use the credentials to make the API call.
D. Store the API credentials in an Amazon DynamoDB table. Restrict access to the table by using resource-based policies. Retrieve the API credentials at runtime by using the AWS SDK. Use the credentials to make the API call.
Answer: A
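The Secrets Manager approach can satisfy the performance constraint by fetching the key once and caching it in process. A minimal sketch, where `fetch` stands in for the real boto3 call (`secretsmanager.get_secret_value`) and the TTL value is an assumption:

```python
import time

# Cache an API key fetched from Secrets Manager so the network call
# happens only on a cache miss or expiry, not on every request.
# `fetch` is injected here so the logic can run without AWS access.

class CachedSecret:
    def __init__(self, fetch, ttl_seconds=300):
        self._fetch = fetch
        self._ttl = ttl_seconds
        self._value = None
        self._expires = 0.0

    def get(self):
        now = time.time()
        if self._value is None or now >= self._expires:
            self._value = self._fetch()  # network call only on miss/expiry
            self._expires = now + self._ttl
        return self._value

calls = []
secret = CachedSecret(lambda: calls.append(1) or "api-key-123")
secret.get()
secret.get()
# The fetcher ran once; the second call was served from the cache.
```

The AWS-provided Lambda extension and client-side caching libraries implement the same idea; this sketch just makes the trade-off explicit.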
Question: 337
An application uses Lambda functions to extract metadata from files uploaded to an S3 bucket; the metadata is stored
in Amazon DynamoDB. The application starts behaving unexpectedly, and the developer wants to examine the logs of
the Lambda function code for errors.
Based on this system configuration, where would the developer find the logs?
A. Amazon S3
B. AWS CloudTrail
C. Amazon CloudWatch
D. Amazon DynamoDB
Answer: C
Question: 338
A developer is creating an application that includes an Amazon API Gateway REST API in the us-east-2 Region. The
developer wants to use Amazon CloudFront and a custom domain name for the API. The developer has acquired an
SSL/TLS certificate for the domain from a third-party provider.
How should the developer configure the custom domain for the application?
A. Import the SSL/TLS certificate into AWS Certificate Manager (ACM) in the same Region as the API. Create a DNS A record for the custom domain.
B. Import the SSL/TLS certificate into CloudFront. Create a DNS CNAME record for the custom domain.
C. Import the SSL/TLS certificate into AWS Certificate Manager (ACM) in the same Region as the API. Create a DNS CNAME record for the custom domain.
D. Import the SSL/TLS certificate into AWS Certificate Manager (ACM) in the us-east-1 Region. Create a DNS CNAME record for the custom domain.
Answer: D
Question: 339
An application that is hosted on an Amazon EC2 instance needs access to files that are stored in an Amazon S3 bucket.
The application lists the objects that are stored in the S3 bucket and displays a table to the user. During testing, a
developer discovers that the application does not show any objects in the list.
What is the MOST secure way to resolve this issue?
A. Update the IAM instance profile that is attached to the EC2 instance to include the s3:* permission for the S3 bucket.
B. Update the IAM instance profile that is attached to the EC2 instance to include the s3:ListBucket permission for the S3 bucket.
C. Update the developer's user permissions to include the s3:ListBucket permission for the S3 bucket.
D. Update the S3 bucket policy by including the s3:ListBucket permission and by setting the Principal element to specify the account number of the EC2 instance.
Answer: B
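Option B's least-privilege grant can be sketched as an IAM policy document. The bucket name is an illustrative assumption; note that s3:ListBucket must be granted on the bucket ARN itself, not on `arn:aws:s3:::bucket/*`.

```python
import json

# Least-privilege sketch of option B: the instance-profile role gets
# only s3:ListBucket on the one bucket the application reads.
# The bucket name is an illustrative assumption.

BUCKET = "app-object-listing"

policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": "s3:ListBucket",             # list only, not s3:*
            "Resource": f"arn:aws:s3:::{BUCKET}",  # bucket ARN, not objects
        }
    ],
}

# This JSON document is what would be attached to the role in the
# instance profile via the IAM console or API.
policy_json = json.dumps(policy, indent=2)
```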
Question: 340
A developer is designing a serverless application with two AWS Lambda functions to process photos. One Lambda
function stores objects in an Amazon S3 bucket and stores the associated metadata in an Amazon DynamoDB table.
The other Lambda function fetches the objects from the S3 bucket by using the metadata from the DynamoDB table.
Both Lambda functions use the same Python library to perform complex computations and are approaching the quota
for the maximum size of zipped deployment packages.
What should the developer do to reduce the size of the Lambda deployment packages with the LEAST operational
overhead?
A. Package each Python library in its own .zip file archive. Deploy each Lambda function with its own copy of the
library.
B. Create a Lambda layer with the required Python library. Use the Lambda layer in both Lambda functions.
C. Combine the two Lambda functions into one Lambda function. Deploy the Lambda function as a single .zip file
archive.
D. Download the Python library to an S3 bucket. Program the Lambda functions to reference the object URLs.
Answer: B
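Option B works because a Python Lambda layer ships its libraries under a top-level `python/` directory inside the layer archive, which the runtime adds to `sys.path`. A minimal packaging sketch; the module name and contents are stand-ins for the real shared library:

```python
import io
import zipfile

# Build an in-memory layer archive with the required "python/" prefix.
# The module content here is an assumption for illustration only.

def build_layer_zip(module_name, module_source):
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        # e.g. python/shared_math.py -> importable as `import shared_math`
        zf.writestr(f"python/{module_name}.py", module_source)
    buf.seek(0)
    return buf

layer = build_layer_zip("shared_math", "def heavy_compute(x):\n    return x * x\n")
names = zipfile.ZipFile(layer).namelist()
# The archive would then be published once, roughly:
#   lambda_client.publish_layer_version(LayerName="shared-math",
#                                       Content={"ZipFile": layer.read()})
# and referenced from both functions, shrinking each deployment package.
```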
Question: 241
A developer is creating an AWS Lambda function that needs credentials to connect to an Amazon RDS for MySQL
database. An Amazon S3 bucket currently stores the credentials. The developer needs to improve the existing solution
by implementing credential rotation and secure storage. The developer also needs to provide integration with the
Lambda function.
Which solution should the developer use to store and retrieve the credentials with the LEAST management overhead?
A. Store the credentials in AWS Systems Manager Parameter Store. Select the database that the parameter will access.
Use the default AWS Key Management Service (AWS KMS) key to encrypt the parameter. Enable automatic rotation
for the parameter. Use the parameter from Parameter Store on the Lambda function to connect to the database.
B. Encrypt the credentials with the default AWS Key Management Service (AWS KMS) key. Store the credentials as
environment variables for the Lambda function. Create a second Lambda function to generate new credentials and to
rotate the credentials by updating the environment variables of the first Lambda function. Invoke the second Lambda
function by using an Amazon EventBridge rule that runs on a schedule. Update the database to use the new credentials.
On the first Lambda function, retrieve the credentials from the environment variables. Decrypt the credentials by using
AWS KMS. Connect to the database.
C. Store the credentials in AWS Secrets Manager. Set the secret type to Credentials for Amazon RDS database. Select
the database that the secret will access. Use the default AWS Key Management Service (AWS KMS) key to encrypt
the secret. Enable automatic rotation for the secret. Use the secret from Secrets Manager on the Lambda function to
connect to the database.
D. Encrypt the credentials by using AWS Key Management Service (AWS KMS). Store the credentials in an Amazon
DynamoDB table. Create a second Lambda function to rotate the credentials. Invoke the second Lambda function by
using an Amazon EventBridge rule that runs on a schedule. Update the DynamoDB table. Update the database to use
the generated credentials. Retrieve the credentials from DynamoDB with the first Lambda function. Connect to the
database.
Answer: C
Question: 341
A developer wants to insert a record into an Amazon DynamoDB table as soon as a new file is added to an Amazon
S3 bucket.
Which set of steps would be necessary to achieve this?
A. Create an event with Amazon EventBridge that will monitor the S3 bucket and then insert the records into DynamoDB.
B. Configure an S3 event to invoke an AWS Lambda function that inserts records into DynamoDB.
C. Create an AWS Lambda function that will poll the S3 bucket and then insert the records into DynamoDB.
D. Create a cron job that will run at a scheduled time and insert the records into DynamoDB.
Answer: B
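The S3-event-to-Lambda path can be sketched as a handler that turns each notification record into a DynamoDB item. The item attribute names are illustrative assumptions; the actual write call is left as a comment.

```python
# Sketch of the Lambda handler invoked by the S3 event notification.
# The event's Records array carries bucket/key pairs; each becomes a
# DynamoDB item. Attribute names are illustrative assumptions.

def records_to_items(event):
    items = []
    for record in event.get("Records", []):
        s3 = record["s3"]
        items.append({
            "object_key": s3["object"]["key"],  # table partition key
            "bucket": s3["bucket"]["name"],
        })
    return items

sample_event = {
    "Records": [
        {"s3": {"bucket": {"name": "uploads"},
                "object": {"key": "reports/2024-01.csv"}}}
    ]
}
items = records_to_items(sample_event)
# In the real handler each item would be written with
#   table.put_item(Item=item)
```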
Question: 342
A developer is deploying an AWS Lambda function. The developer wants the ability to return to older versions of the
function quickly and seamlessly.
How can the developer achieve this goal with the LEAST operational overhead?
A. Use AWS OpsWorks to perform blue/green deployments.
B. Use a function alias with different versions.
C. Maintain deployment packages for older versions in Amazon S3.
D. Use AWS CodePipeline for deployments and rollbacks.
Answer: B
Question: 343
A development team maintains a web application by using a single AWS CloudFormation template. The template defines web servers and an Amazon RDS database. The team uses the CloudFormation template to deploy the CloudFormation stack to different environments.
During a recent application deployment, a developer caused the primary development database to be dropped and
recreated. The result of this incident was a loss of data. The team needs to avoid accidental database deletion in the
future.
Which solutions will meet these requirements? (Choose two.)
A. Add a CloudFormation DeletionPolicy attribute with the Retain value to the database resource.
B. Update the CloudFormation stack policy to prevent updates to the database.
C. Modify the database to use a Multi-AZ deployment.
D. Create a CloudFormation stack set for the web application and database deployments.
E. Add a CloudFormation DeletionPolicy attribute with the Retain value to the stack.
Answer: A, B
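The two protections discussed above, a DeletionPolicy on the database resource and a stack policy that denies destructive updates to it, can be sketched as template and policy fragments. The logical ID "AppDatabase" is an illustrative assumption.

```python
# Minimal sketch, not a complete template: the database resource carries
# DeletionPolicy: Retain, and a stack policy blocks Update:Replace and
# Update:Delete on that one logical resource.

template_fragment = {
    "Resources": {
        "AppDatabase": {
            "Type": "AWS::RDS::DBInstance",
            "DeletionPolicy": "Retain",    # keep the DB if the stack is deleted
            "Properties": {},              # elided
        }
    }
}

stack_policy = {
    "Statement": [
        {   # deny destructive updates to the database resource only
            "Effect": "Deny",
            "Action": ["Update:Replace", "Update:Delete"],
            "Principal": "*",
            "Resource": "LogicalResourceId/AppDatabase",
        },
        {   # everything else stays updatable
            "Effect": "Allow",
            "Action": "Update:*",
            "Principal": "*",
            "Resource": "*",
        },
    ]
}
```

In production the stack policy would be applied with the SetStackPolicy API or at stack creation; DeletionPolicy alone does not stop update-time replacement, which is why the stack policy complements it.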
Question: 344
A company hosts a client-side web application for one of its subsidiaries on Amazon S3. The web application can be
accessed through Amazon CloudFront from https://www.example.com. After a successful rollout, the company wants
to host three more client-side web applications for its remaining subsidiaries on three separate S3 buckets.
To achieve this goal, a developer moves all the common JavaScript files and web fonts to a central S3 bucket that
serves the web applications. However, during testing, the developer notices that the browser blocks the JavaScript files
and web fonts.
What should the developer do to prevent the browser from blocking the JavaScript files and web fonts?
A. Create four access points that allow access to the central S3 bucket. Assign an access point to each web application
bucket.
B. Create a bucket policy that allows access to the central S3 bucket. Attach the bucket policy to the central S3 bucket.
C. Create a cross-origin resource sharing (CORS) configuration that allows access to the central S3 bucket. Add the
CORS configuration to the central S3 bucket.
D. Create a Content-MD5 header that provides a message integrity check for the central S3 bucket. Insert the Content-
MD5 header for each web application request.
Answer: C
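Option C's CORS configuration on the central assets bucket can be sketched as the structure passed to `s3.put_bucket_cors`. The origin list is an illustrative assumption (one entry per subsidiary site).

```python
# Sketch of the CORS configuration for the central S3 bucket that
# serves shared JavaScript and web fonts. The shape below matches
# s3.put_bucket_cors(Bucket=..., CORSConfiguration=...).

cors_configuration = {
    "CORSRules": [
        {
            "AllowedOrigins": [
                "https://www.example.com",
                # ...one entry per subsidiary site (assumed, elided here)
            ],
            "AllowedMethods": ["GET", "HEAD"],  # static assets are read-only
            "AllowedHeaders": ["*"],
            "MaxAgeSeconds": 3600,              # cache the preflight response
        }
    ]
}
```

Without this configuration the browser's same-origin policy blocks cross-origin font and script loads, which is exactly the symptom in the question.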
Question: 345
A company wants to deploy and maintain static websites on AWS. Each website's source code is hosted in one of
several version control systems, including AWS CodeCommit, Bitbucket, and GitHub.
The company wants to implement phased releases by using development, staging, user acceptance testing, and
production environments in the AWS Cloud. Deployments to each environment must be started by code merges on the
relevant Git branch. The company wants to use HTTPS for all data exchange. The company needs a solution that does
not require servers to run continuously.
Which solution will meet these requirements with the LEAST operational overhead?
A. Host each website by using AWS Amplify with a serverless backend. Connect the repository branches that
correspond to each of the desired environments. Start deployments by merging code changes to a desired branch.
B. Host each website in AWS Elastic Beanstalk with multiple environments. Use the EB CLI to link each repository
branch. Integrate AWS CodePipeline to automate deployments from version control code merges.
C. Host each website in different Amazon S3 buckets for each environment. Configure AWS CodePipeline to pull
source code from version control. Add an AWS CodeBuild stage to copy source code to Amazon S3.
D. Host each website on its own Amazon EC2 instance. Write a custom deployment script to bundle each website's
static assets. Copy the assets to Amazon EC2. Set up a workflow to run the script when code is merged.
Answer: A
Question: 346
For a deployment using AWS CodeDeploy, what is the run order of the hooks for in-place deployments?
A. BeforeInstall -> ApplicationStop -> ApplicationStart -> AfterInstall
B. ApplicationStop -> BeforeInstall -> AfterInstall -> ApplicationStart
C. BeforeInstall -> ApplicationStop -> ValidateService -> ApplicationStart
D. ApplicationStop -> BeforeInstall -> ValidateService -> ApplicationStart
Answer: B
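The documented in-place lifecycle (without a load balancer) runs ApplicationStop first, so the old revision is stopped before the new bundle is installed. Encoding the order as a list makes the constraints easy to assert:

```python
# In-place deployment lifecycle events, in run order. DownloadBundle
# and Install are run by the CodeDeploy agent and cannot be scripted;
# the others can carry hook scripts in the AppSpec file.

IN_PLACE_HOOK_ORDER = [
    "ApplicationStop",   # stop the old revision first
    "DownloadBundle",    # agent-run
    "BeforeInstall",
    "Install",           # agent-run
    "AfterInstall",
    "ApplicationStart",
    "ValidateService",   # final health check
]

def runs_before(a, b, order=IN_PLACE_HOOK_ORDER):
    return order.index(a) < order.index(b)
```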
Question: 347
A company is implementing an application on Amazon EC2 instances. The application needs to process incoming
transactions. When the application detects a transaction that is not valid, the application must send a chat message to
the company's support team. To send the message, the application needs to retrieve the access token to authenticate by
using the chat API.
A developer needs to implement a solution to store the access token. The access token must be encrypted at rest and in
transit. The access token must also be accessible from other AWS accounts.
Which solution will meet these requirements with the LEAST management overhead?
A. Use an AWS Systems Manager Parameter Store SecureString parameter that uses an AWS Key Management Service (AWS KMS) AWS managed key to store the access token. Add a resource-based policy to the parameter to allow access from other accounts. Update the IAM role of the EC2 instances with permissions to access Parameter Store. Retrieve the token from Parameter Store with the decrypt flag enabled. Use the decrypted access token to send the message to the chat.
B. Encrypt the access token by using an AWS Key Management Service (AWS KMS) customer managed key. Store the access token in an Amazon DynamoDB table. Update the IAM role of the EC2 instances with permissions to access DynamoDB and AWS KMS. Retrieve the token from DynamoDB. Decrypt the token by using AWS KMS on the EC2 instances. Use the decrypted access token to send the message to the chat.
C. Use AWS Secrets Manager with an AWS Key Management Service (AWS KMS) customer managed key to store the access token. Add a resource-based policy to the secret to allow access from other accounts. Update the IAM role of the EC2 instances with permissions to access Secrets Manager. Retrieve the token from Secrets Manager. Use the decrypted access token to send the message to the chat.
D. Encrypt the access token by using an AWS Key Management Service (AWS KMS) AWS managed key. Store the access token in an Amazon S3 bucket. Add a bucket policy to the S3 bucket to allow access from other accounts. Update the IAM role of the EC2 instances with permissions to access Amazon S3 and AWS KMS. Retrieve the token from the S3 bucket. Decrypt the token by using AWS KMS on the EC2 instances. Use the decrypted access token to send the message to the chat.
Answer: C
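The cross-account piece of the Secrets Manager approach is a resource-based policy on the secret. The consumer account ID and secret name are illustrative assumptions. A customer managed KMS key is required because the key policy must also grant the other account kms:Decrypt, which is not possible with an AWS managed key.

```python
# Sketch of a resource-based policy that lets a principal in another
# account read the secret. Account ID and secret name are assumptions.

OTHER_ACCOUNT = "222233334444"  # assumed consumer account

secret_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {"AWS": f"arn:aws:iam::{OTHER_ACCOUNT}:root"},
            "Action": "secretsmanager:GetSecretValue",
            "Resource": "*",
        }
    ],
}

# Attached in production roughly as:
#   secretsmanager.put_resource_policy(
#       SecretId="chat/api-token",
#       ResourcePolicy=json.dumps(secret_policy))
```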
Question: 348
A company is building a scalable data management solution by using AWS services to improve the speed and agility
of development. The solution will ingest large volumes of data from various sources and will process this data through
multiple business rules and transformations.
The solution requires business rules to run in sequence and to handle reprocessing of data if errors occur when the
business rules run. The company needs the solution to be scalable and to require the least possible maintenance.
Which AWS service should the company use to manage and automate the orchestration of the data flows to meet these
requirements?
A. AWS Batch
B. AWS Step Functions
C. AWS Glue
D. AWS Lambda
Answer: B
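Step Functions fits because each business rule becomes a state, `Next` enforces the sequence, and `Retry` handles reprocessing on error. A minimal Amazon States Language sketch; the state names and Lambda ARNs are illustrative assumptions:

```python
# Two sequential business rules with automatic retry on failure,
# expressed as an ASL definition dict. Names/ARNs are assumptions.

state_machine_definition = {
    "StartAt": "RuleOne",
    "States": {
        "RuleOne": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:us-east-1:111122223333:function:rule-one",
            "Retry": [{                 # reprocess the data on error
                "ErrorEquals": ["States.TaskFailed"],
                "IntervalSeconds": 5,
                "MaxAttempts": 3,
                "BackoffRate": 2.0,
            }],
            "Next": "RuleTwo",          # rules run in sequence
        },
        "RuleTwo": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:us-east-1:111122223333:function:rule-two",
            "End": True,
        },
    },
}
```

The definition would be serialized to JSON and passed to `create_state_machine`; the service itself is serverless, which keeps maintenance minimal.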
Question: 349
A company is building a serverless application on AWS. The application uses an AWS Lambda function to process
customer orders 24 hours a day, 7 days a week. The Lambda function calls an external vendor's HTTP API to process
payments.
During load tests, a developer discovers that the external vendor payment processing API occasionally times out and
returns errors. The company expects that some payment processing API calls will return errors.
The company wants the support team to receive notifications in near real time only when the payment processing
external API error rate exceeds 5% of the total number of transactions in an hour. Developers need to use an existing
Amazon Simple Notification Service (Amazon SNS) topic that is configured to notify the support team.
Which solution will meet these requirements?
A. Write the results of payment processing API calls to Amazon CloudWatch. Use Amazon CloudWatch Logs Insights
to query the CloudWatch logs. Schedule the Lambda function to check the CloudWatch logs and notify the existing
SNS topic.
B. Publish custom metrics to CloudWatch that record the failures of the external payment processing API calls.
Configure a CloudWatch alarm to notify the existing SNS topic when the error rate exceeds the specified rate.
C. Publish the results of the external payment processing API calls to a new Amazon SNS topic. Subscribe the support
team members to the new SNS topic.
D. Write the results of the external payment processing API calls to Amazon S3. Schedule an Amazon Athena query
to run at regular intervals. Configure Athena to send notifications to the existing SNS topic when the error rate exceeds
the specified rate.
Answer: B
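The alarm condition in option B is just a ratio check. The metric names are illustrative assumptions; the rate logic below mirrors what a metric-math alarm would evaluate, so it can be tested locally.

```python
# The 5% hourly error-rate condition from the requirement, mirrored
# as plain functions. Metric/namespace names are assumptions.

ERROR_RATE_THRESHOLD = 0.05  # 5% per the requirement

def error_rate(failed_calls, total_calls):
    return failed_calls / total_calls if total_calls else 0.0

def should_alarm(failed_calls, total_calls):
    return error_rate(failed_calls, total_calls) > ERROR_RATE_THRESHOLD

# In the Lambda function, each failure would be published roughly as:
#   cloudwatch.put_metric_data(
#       Namespace="PaymentAPI",
#       MetricData=[{"MetricName": "Failures", "Value": 1, "Unit": "Count"}])
# with a metric-math alarm (failures / invocations over 1 hour)
# notifying the existing SNS topic.
```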
Question: 350
A developer is creating an application that will be deployed on IoT devices. The application will send data to a
RESTful API that is deployed as an AWS Lambda function. The application will assign each API request a unique
identifier. The volume of API requests from the application can randomly increase at any given time of day.
During periods of request throttling, the application might need to retry requests. The API must be able to handle
duplicate requests without inconsistencies or data loss.
Which solution will meet these requirements?
A. Create an Amazon RDS for MySQL DB instance. Store the unique identifier for each request in a database table.
Modify the Lambda function to check the table for the identifier before processing the request.
B. Create an Amazon DynamoDB table. Store the unique identifier for each request in the table. Modify the Lambda
function to check the table for the identifier before processing the request.
C. Create an Amazon DynamoDB table. Store the unique identifier for each request in the table. Modify the Lambda
function to return a client error response when the function receives a duplicate request.
D. Create an Amazon ElastiCache for Memcached instance. Store the unique identifier for each request in the cache.
Modify the Lambda function to check the cache for the identifier before processing the request.
Answer: B
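Option B's duplicate check maps naturally onto a DynamoDB conditional write (`ConditionExpression="attribute_not_exists(request_id)"`). The in-memory store below stands in for the table so the idempotency logic is testable without AWS; the attribute and variable names are assumptions.

```python
# Idempotent request handling: the first write of a request id wins,
# and a retried duplicate is detected instead of processed twice.

class DuplicateRequest(Exception):
    pass

class RequestStore:
    """Stand-in for a DynamoDB table keyed on the unique request id."""
    def __init__(self):
        self._seen = {}

    def put_if_absent(self, request_id, payload):
        # Mirrors put_item(..., ConditionExpression=
        #                  "attribute_not_exists(request_id)")
        if request_id in self._seen:
            raise DuplicateRequest(request_id)
        self._seen[request_id] = payload

def handle(store, request_id, payload):
    try:
        store.put_if_absent(request_id, payload)
    except DuplicateRequest:
        return "already-processed"  # retry arrives safely, no double work
    return "processed"

store = RequestStore()
first = handle(store, "req-42", {"amount": 10})
second = handle(store, "req-42", {"amount": 10})  # retried duplicate
```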
Question: 351
A company is running a custom application on a set of on-premises Linux servers that are accessed using Amazon API
Gateway. AWS X-Ray tracing has been enabled on the API test stage.
How can a developer enable X-Ray tracing on the on-premises servers with the LEAST amount of configuration?
A. Install and run the X-Ray SDK on the on-premises servers to capture and relay the data to the X-Ray service.
B. Install and run the X-Ray daemon on the on-premises servers to capture and relay the data to the X-Ray service.
C. Capture incoming requests on-premises and configure an AWS Lambda function to pull, process, and relay relevant
data to X-Ray using the PutTraceSegments API call.
D. Capture incoming requests on-premises and configure an AWS Lambda function to pull, process, and relay relevant
data to X-Ray using the PutTelemetryRecords API call.
Answer: B
Question: 352
A developer has created an AWS Lambda function that is written in Python. The Lambda function reads data from
objects in Amazon S3 and writes data to an Amazon DynamoDB table. The function is successfully invoked from an
S3 event notification when an object is created. However, the function fails when it attempts to write to the
DynamoDB table.
What is the MOST likely cause of this issue?
A. The Lambda function's concurrency limit has been exceeded.
B. The DynamoDB table requires a global secondary index (GSI) to support writes.
C. The Lambda function does not have IAM permissions to write to DynamoDB.
D. The DynamoDB table is not running in the same Availability Zone as the Lambda function.
Answer: C
User: Shasha***** I dedicated enough time to study with Killexams.com materials and successfully passed the dva-c02 exam. The material is of good quality, and even though they are practice tests, constructed using real exam questions, I do not understand why people try to complain about the questions. In my case, not all questions were 100% identical, but the topics and general approach were accurate. So, if you study hard enough, you will do just fine.
User: Delfina***** For those with fears related to DVA-C02 certification, I highly recommend coming to this platform. Killexams.com offers quality products for your preparations, and thanks to them, I was able to succeed in my studies. The DVA-C02 exam materials increased my self-confidence, and now I feel satisfied with this invaluable assistance.
User: Samuel***** I am grateful to killexams.com for providing contemporary test materials for the dva-c02 exam. All the individuals at killexams.com are doing an extraordinary job and ensuring the success of candidates in dva-c02 exams. I passed the dva-c02 exam just because I used killexams.com material.
User: Nikolai***** I cannot believe that I passed the DVA-C02 exam with such an excellent score. I owe it to killexams.com for their exceptional assistance. Their exam preparation material helped me perform beyond my expectations.
User: Santiago***** I have used Killexams for my DVA-C02 exam several times, and I have never failed. I truly depend on this guidance. This time, I had some technical troubles with my laptop, so I had to contact their customer service. They were remarkable and helped me sort things out, even though the issue was on my end.
Features of iPass4sure DVA-C02 Exam
- Files: PDF / Test Engine
- Premium Access
- Online Test Engine
- Instant download Access
- Comprehensive Q&A
- Success Rate
- Real Questions
- Updated Regularly
- Portable Files
- Unlimited Download
- 100% Secured
- Confidentiality: 100%
- Success Guarantee: 100%
- Any Hidden Cost: $0.00
- Auto Recharge: No
- Updates Intimation: by Email
- Technical Support: Free
- PDF Compatibility: Windows, Android, iOS, Linux
- Test Engine Compatibility: Mac / Windows / Android / iOS / Linux