Business-Objects DMDI301 Questions & Answers

Full Version: 33 Q&A


Latest DMDI301 Exam Questions and Practice Tests 2024 - Killexams.com



Get Complete pool of questions with Premium PDF and Test Engine


Exam Code : DMDI301
Exam Name : BusinessObjects Data Integrator XI - Level Two
Vendor Name : Business-Objects












https://killexams.com/pass4sure/exam-detail/DMDI301





Question: 25

You have a production job that retrieves data from your Oracle 10g operational source system and loads it into your data warehouse. The operators of the source system are complaining that the Data Integrator load process is causing the system to perform poorly. Which two actions can you take to reduce the impact Data Integrator jobs have on the source system? (Choose two.)


  1. Implement a CDC datastore for the source system to reduce the number of rows extracted.

  2. Increase the value of the "array fetch size" parameter on the source table.

  3. Perform intensive operations such as "group by" and "joins" in a staging area instead of on the source system.

  4. Use "linked data stores" to connect the source and target data stores.




Answer: A, B



Question: 26

Your Data Integrator environment interprets year values greater than 15 as 1915 instead of 2015. You must ensure that Data Integrator interprets any two-digit year from 00 to 90 as 2000 to 2090, without making direct modifications to the underlying data flow. Which method should you use to accomplish this task?


  1. Log in to the Designer, select Tools > Options > Data > General, and set the "Century change year" to 90.

  2. Open the Server Manager, select Edit Job Server Config, and set the "Century change year" to 90.

  3. Open the Web Administration tool, select Management > Repositories, edit the production repository, and set the "Century change year" to 90.

  4. On the job server, open Windows Control Panel > Regional Settings > Customize Date and set the two-digit year interpretation to 90.

  5. Configure the source database to interpret two-digit dates appropriately.




Answer: A
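For intuition, the "Century change year" pivot behavior described in this question can be sketched in plain Python. This is an illustrative stand-in, not Data Integrator code; the function name `expand_two_digit_year` is invented for the sketch. A two-digit year at or below the pivot falls in the 2000s, and a year above it falls in the 1900s.

```python
def expand_two_digit_year(yy, pivot=90):
    """Expand a two-digit year using a century pivot.

    With pivot=90, years 00-90 map to 2000-2090 and 91-99 map to
    1991-1999. With pivot=15 (the problem's starting state), any
    year greater than 15 maps to the 1900s.
    """
    return 2000 + yy if yy <= pivot else 1900 + yy

# Pivot of 15 reproduces the complaint: 16 becomes 1916, not 2016.
# Pivot of 90 gives the desired 00-90 -> 2000-2090 behavior.
print(expand_two_digit_year(16, pivot=15))  # 1916
print(expand_two_digit_year(15, pivot=90))  # 2015
print(expand_two_digit_year(90, pivot=90))  # 2090
```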



Question: 27


You load over 10,000,000 records from the "customer" source table into a staging area. You need to remove duplicate customers while loading the source table. You do not need to record or audit the duplicates. Which two de-duplication techniques will ensure the best performance? (Choose two.)


  1. Use a Query transform to order the incoming data set and use the previous_row_value function in the WHERE clause to filter out duplicate rows.

  2. Use a Query transform to order the incoming data set, then a Table_Comparison transform with the "Input contains duplicates" and "Sorted input" options selected.

  3. Use the Table_Comparison transform with the "Input contains duplicates" and "Cached comparison table" options selected.

  4. Use the lookup_ext function with the "PRE_LOAD_CACHE" option selected to test each row for duplicates.




Answer: A, B
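The sorted-input technique behind both correct options can be illustrated generically: once the stream is ordered on the key, each row only needs to be compared with the previous row to detect a duplicate, so no large comparison cache is required. This Python sketch is a stand-in for the idea, not Data Integrator syntax; the function name `dedupe_sorted` is invented here.

```python
def dedupe_sorted(rows, key):
    """Drop duplicates from a stream already sorted on the key,
    by comparing each row's key to the previous row's key -- the
    same idea as ordering the data set and filtering with a
    previous-row-value check."""
    prev = object()  # sentinel that never equals a real key
    for row in rows:
        k = key(row)
        if k != prev:
            yield row
        prev = k

# Order first, then de-duplicate in a single streaming pass.
customers = sorted([{"id": 2}, {"id": 1}, {"id": 2}],
                   key=lambda r: r["id"])
unique = list(dedupe_sorted(customers, key=lambda r: r["id"]))
print(unique)  # [{'id': 1}, {'id': 2}]
```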



Question: 28

You want to join the "sales" and "customer" tables, which reside in different datastores. The "sales" table contains approximately five million rows; the "customer" table contains approximately five thousand rows. The join occurs in memory. How would you set the source table options to maximize the performance of this operation?


  1. Set the sales table join rank to 10 and cache to "No"; set the customer table join rank to 5 and cache to "Yes".

  2. Set the sales table join rank to 10 and cache to "Yes"; set the customer table join rank to 5 and cache to "Yes".

  3. Set the sales table join rank to 5 and cache to "Yes"; set the customer table join rank to 10 and cache to "No".

  4. Set the sales table join rank to 5 and cache to "No"; set the customer table join rank to 10 and cache to "No".




Answer: A
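The rationale for caching the small table and streaming the large one is the classic in-memory hash join: build a lookup map from the 5,000-row customer table, then pass the 5,000,000-row sales table through it once. This is a generic Python sketch of the technique, not Data Integrator syntax; `hash_join` and the sample column names are invented for illustration.

```python
def hash_join(large, small, key):
    """Join two row streams by caching the small side in a hash map
    and streaming the large side past it -- why the small customer
    table, not the huge sales table, should be cached."""
    cache = {key(r): r for r in small}  # small table held in memory
    for row in large:
        match = cache.get(key(row))
        if match is not None:
            yield {**row, **match}  # merge matching rows

sales = [{"cust_id": 1, "amount": 10}, {"cust_id": 2, "amount": 5}]
customers = [{"cust_id": 1, "name": "Acme"}]
joined = list(hash_join(sales, customers, key=lambda r: r["cust_id"]))
print(joined)  # [{'cust_id': 1, 'amount': 10, 'name': 'Acme'}]
```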



Question: 29

Where can the XML_Pipeline transform be used within a data flow? (Choose two.)


  1. Immediately after an XML source file.

  2. Immediately after an XML source message.

  3. Immediately after a Query containing nested data.

  4. Immediately after an XML template.




Answer: A, B



Question: 30

You create a two-stage process for transferring data from a source system to a target data warehouse via a staging area. The job you create runs both stages on an overnight schedule. The job fails at the point of transferring the data from the staging area to the target data warehouse. During the work day you want to rerun the job without impacting the source system, and therefore want to run only the second stage of the process, transferring the data from the staging area to the data warehouse. How would you design this job?


  1. Create two data flows: the first extracts the data from the source system; the second transfers the data to the target data warehouse.

  2. Create one data flow that extracts the data from the source system and uses a Data_Transfer transform to stage the data in the staging area before continuing to transfer the data to the target data warehouse.

  3. Create two data flows: the first extracts the data from the source system and uses a Data_Transfer transform to write the data to the staging area; the second extracts the data from the staging area and transfers it to the target data warehouse.

  4. Create one data flow that extracts from the source system and populates both the staging area and the target data warehouse.




Answer: A



Question: 31

Which two Data Integrator objects/operations support load balancing in a server group based architecture? (Choose two.)


  1. Job

  2. Lookup_ext

  3. Script

  4. While loop




Answer: A, B



Question: 32

You have a data flow that reads multiple XML files from a directory by specifying a wildcard in the file name. Which method can you use to link the XML file name to the records being read?


  1. Select "Include file name column" in the XML source file editor.

  2. Use the function get_xml_file_name in the query mapping.

  3. Use the column "XML_FILENAME" listed at the top of the XML file structure.

  4. Use the variable $current_XML_file in the query mapping.




Answer: A
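The effect of an "include file name column" option can be illustrated generically: read every file matching a wildcard and tag each record with the file it came from. This Python sketch is a stand-in for the concept, not Data Integrator behavior; `read_with_filename` and the `source_file` field are invented names for the illustration.

```python
import tempfile
import xml.etree.ElementTree as ET
from pathlib import Path

def read_with_filename(directory, pattern):
    """Read every XML file matching a wildcard and attach the
    originating file name to each record it yields."""
    for path in sorted(Path(directory).glob(pattern)):
        for elem in ET.parse(path).getroot():
            yield {"value": elem.text, "source_file": path.name}

# Demo: two small XML files in a temporary directory.
with tempfile.TemporaryDirectory() as d:
    Path(d, "a.xml").write_text("<rows><r>x</r></rows>")
    Path(d, "b.xml").write_text("<rows><r>y</r></rows>")
    records = list(read_with_filename(d, "*.xml"))

print(records)
# [{'value': 'x', 'source_file': 'a.xml'},
#  {'value': 'y', 'source_file': 'b.xml'}]
```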



Question: 33

You are trying to improve the performance of a simple data flow that loads data from a source table into a staging area and applies only some simple remapping using a Query transform. The source database is located on the WAN. The network administrator has told you that you can improve performance by reducing the number of round trips between the Data Integrator job server and the source database. What can you do in your data flow to achieve this?


  1. Increase the array fetch size parameter in the source table editor.

  2. Increase the commit size in the target table editor.

  3. Increase the commit size in the source table editor.

  4. Replace the source table with a SQL transform.




Answer: A
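The array-fetch-size idea has a direct analogue in Python's DB-API: `cursor.arraysize` controls how many rows `fetchmany()` retrieves per call, so a larger value means fewer fetch calls (and, against a remote database, fewer network round trips). The sketch below uses `sqlite3` purely as an in-process stand-in, so no real network is involved; it only demonstrates the batching semantics.

```python
import sqlite3

# In-memory stand-in for the remote source database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE src (id INTEGER)")
conn.executemany("INSERT INTO src VALUES (?)",
                 [(i,) for i in range(10)])

cur = conn.cursor()
cur.arraysize = 4              # rows retrieved per fetchmany() call
cur.execute("SELECT id FROM src ORDER BY id")

batch_sizes = []
while True:
    batch = cur.fetchmany()    # honours cur.arraysize
    if not batch:
        break
    batch_sizes.append(len(batch))

# 10 rows at 4 per fetch -> 3 fetches instead of 10 single-row ones.
print(batch_sizes)  # [4, 4, 2]
```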


User: Leonardo*****

I passed the DMDI301 exam with the Killexams.com question set. I did not have much time to prepare, but purchasing these DMDI301 questions, answers, and exam simulator was the best professional decision I ever made. I got through the exam easily, even though it is not an easy one. Yet this included all recent questions, and I got many of them on the DMDI301 exam and was able to figure out the rest, based on my experience. I guess it was as close to a no-brainer as an IT exam can get. So yes, Killexams.com is just as good as they say it is.
User: Elaine*****

Killexams.com helped make passing the DMDI301 exam possible for me, even with only 10 days to prepare. The topics were presented well, and I was able to score a 959 on the exam. Thank you, Killexams, for giving me hope when I thought it was impossible.
User: Kate*****

Killexams.com provided me with valid questions and answers for the dmdi301 exam. Everything was accurate and real, and I had no trouble passing the exam, even though I did not spend much time studying. Even if you have only a basic knowledge of dmdi301, you can pass the exam with this bundle. I was a little overwhelmed with the amount of information at first, but as I kept going through the questions, everything started to make sense.
User: Agustina*****

I am writing this to express my gratitude to you. I have successfully passed the dmdi301 exam with a score of 96%, thanks to the test questions and answers collection made by your team. It provided me with an experience similar to an online exam, with each question accompanied by a detailed explanation in a simple and easy-to-understand language. I am delighted that I made the right decision by purchasing your test collection.
User: Senya*****

Working in an IT firm means that I have limited time to prepare for the DMDI301 exam. So, I opted for Killexams.com Questions and Answers practice tests, and to my surprise, it worked wonders for me. I had to answer all the questions in the least possible time, and the questions were quite easy to understand with a fantastic reference guide. I scored 939 marks, which was a great surprise for me. Thanks to Killexams!

Features of iPass4sure DMDI301 Exam

  • Files: PDF / Test Engine
  • Premium Access
  • Online Test Engine
  • Instant download Access
  • Comprehensive Q&A
  • Success Rate
  • Real Questions
  • Updated Regularly
  • Portable Files
  • Unlimited Download
  • 100% Secured
  • Confidentiality: 100%
  • Success Guarantee: 100%
  • Any Hidden Cost: $0.00
  • Auto Recharge: No
  • Update Notifications: by Email
  • Technical Support: Free
  • PDF Compatibility: Windows, Android, iOS, Linux
  • Test Engine Compatibility: Mac / Windows / Android / iOS / Linux

Premium PDF with 33 Q&A

Get Full Version

All Business-Objects Exams

Business-Objects Exams

Certification and Entry Test Exams

Complete exam list