Informatica Test Data Management (Version 10.1.0) Release Guide

Informatica Test Data Management Release Guide Version 10.1.0 December 2016 Copyright Informatica LLC 2003, 2017 This software and documentation are provided only under a separate license agreement containing restrictions on use and disclosure. No part of this document may be reproduced or transmitted in any form, by any means (electronic, photocopying, recording or otherwise) without prior consent of Informatica LLC. Informatica, the Informatica logo, and PowerCenter are trademarks or registered trademarks of Informatica LLC in the United States and many jurisdictions throughout the world. A current list of Informatica trademarks is available on the web at https://www.informatica.com/trademarks.html. Other company and product names may be trade names or trademarks of their respective owners. Portions of this software and/or documentation are subject to copyright held by third parties, including without limitation: Copyright DataDirect Technologies. All rights reserved. Copyright Sun Microsystems. All rights reserved. Copyright RSA Security Inc. All Rights Reserved. Copyright Ordinal Technology Corp. All rights reserved. Copyright Aandacht c.v. All rights reserved. Copyright Genivia, Inc. All rights reserved. Copyright Isomorphic Software. All rights reserved. Copyright Meta Integration Technology, Inc. All rights reserved. Copyright Intalio. All rights reserved. Copyright Oracle. All rights reserved. Copyright Adobe Systems Incorporated. All rights reserved. Copyright DataArt, Inc. All rights reserved. Copyright ComponentSource. All rights reserved. Copyright Microsoft Corporation. All rights reserved. Copyright Rogue Wave Software, Inc. All rights reserved. Copyright Teradata Corporation. All rights reserved. Copyright Yahoo! Inc. All rights reserved. Copyright Glyph & Cog, LLC. All rights reserved. Copyright Thinkmap, Inc. All rights reserved. Copyright Clearpace Software Limited. All rights reserved. Copyright Information Builders, Inc. All rights reserved. 
Copyright OSS Nokalva, Inc. All rights reserved. Copyright Edifecs, Inc. All rights reserved. Copyright Cleo Communications, Inc. All rights reserved. Copyright International Organization for Standardization 1986. All rights reserved. Copyright ejtechnologies GmbH. All rights reserved. Copyright Jaspersoft Corporation. All rights reserved. Copyright International Business Machines Corporation. All rights reserved. Copyright yworks GmbH. All rights reserved. Copyright Lucent Technologies. All rights reserved. Copyright University of Toronto. All rights reserved. Copyright Daniel Veillard. All rights reserved. Copyright Unicode, Inc. Copyright IBM Corp. All rights reserved. Copyright MicroQuill Software Publishing, Inc. All rights reserved. Copyright PassMark Software Pty Ltd. All rights reserved. Copyright LogiXML, Inc. All rights reserved. Copyright 2003-2010 Lorenzi Davide, All rights reserved. Copyright Red Hat, Inc. All rights reserved. Copyright The Board of Trustees of the Leland Stanford Junior University. All rights reserved. Copyright EMC Corporation. All rights reserved. Copyright Flexera Software. All rights reserved. Copyright Jinfonet Software. All rights reserved. Copyright Apple Inc. All rights reserved. Copyright Telerik Inc. All rights reserved. Copyright BEA Systems. All rights reserved. Copyright PDFlib GmbH. All rights reserved. Copyright Orientation in Objects GmbH. All rights reserved. Copyright Tanuki Software, Ltd. All rights reserved. Copyright Ricebridge. All rights reserved. Copyright Sencha, Inc. All rights reserved. Copyright Scalable Systems, Inc. All rights reserved. Copyright jqwidgets. All rights reserved. Copyright Tableau Software, Inc. All rights reserved. Copyright MaxMind, Inc. All Rights Reserved. Copyright TMate Software s.r.o. All rights reserved. Copyright MapR Technologies Inc. All rights reserved. Copyright Amazon Corporate LLC. All rights reserved. Copyright Highsoft. All rights reserved. 
Copyright Python Software Foundation. All rights reserved. Copyright BeOpen.com. All rights reserved. Copyright CNRI. All rights reserved. This product includes software developed by the Apache Software Foundation (http://www.apache.org/), and/or other software which is licensed under various versions of the Apache License (the "License"). You may obtain a copy of these Licenses at http://www.apache.org/licenses/. Unless required by applicable law or agreed to in writing, software distributed under these Licenses is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the Licenses for the specific language governing permissions and limitations under the Licenses. This product includes software which was developed by Mozilla (http://www.mozilla.org/), software copyright The JBoss Group, LLC, all rights reserved; software copyright 1999-2006 by Bruno Lowagie and Paulo Soares and other software which is licensed under various versions of the GNU Lesser General Public License Agreement, which may be found at http:// www.gnu.org/licenses/lgpl.html. The materials are provided free of charge by Informatica, "as-is", without warranty of any kind, either express or implied, including but not limited to the implied warranties of merchantability and fitness for a particular purpose. The product includes ACE(TM) and TAO(TM) software copyrighted by Douglas C. Schmidt and his research group at Washington University, University of California, Irvine, and Vanderbilt University, Copyright ( ) 1993-2006, all rights reserved. This product includes software developed by the OpenSSL Project for use in the OpenSSL Toolkit (copyright The OpenSSL Project. All Rights Reserved) and redistribution of this software is subject to terms available at http://www.openssl.org and http://www.openssl.org/source/license.html. This product includes Curl software which is Copyright 1996-2013, Daniel Stenberg, <daniel@haxx.se>. All Rights Reserved. 
Permissions and limitations regarding this software are subject to terms available at http://curl.haxx.se/docs/copyright.html. Permission to use, copy, modify, and distribute this software for any purpose with or without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies. The product includes software copyright 2001-2005 ( ) MetaStuff, Ltd. All Rights Reserved. Permissions and limitations regarding this software are subject to terms available at http://www.dom4j.org/ license.html. The product includes software copyright 2004-2007, The Dojo Foundation. All Rights Reserved. Permissions and limitations regarding this software are subject to terms available at http://dojotoolkit.org/license. This product includes ICU software which is copyright International Business Machines Corporation and others. All rights reserved. Permissions and limitations regarding this software are subject to terms available at http://source.icu-project.org/repos/icu/icu/trunk/license.html. This product includes software copyright 1996-2006 Per Bothner. All rights reserved. Your right to use such materials is set forth in the license which may be found at http:// www.gnu.org/software/ kawa/software-license.html. This product includes OSSP UUID software which is Copyright 2002 Ralf S. Engelschall, Copyright 2002 The OSSP Project Copyright 2002 Cable & Wireless Deutschland. Permissions and limitations regarding this software are subject to terms available at http://www.opensource.org/licenses/mit-license.php. This product includes software developed by Boost (http://www.boost.org/) or under the Boost software license. Permissions and limitations regarding this software are subject to terms available at http:/ /www.boost.org/license_1_0.txt. This product includes software copyright 1997-2007 University of Cambridge. Permissions and limitations regarding this software are subject to terms available at http:// www.pcre.org/license.txt. 
This product includes software copyright 2007 The Eclipse Foundation. All Rights Reserved. Permissions and limitations regarding this software are subject to terms available at http:// www.eclipse.org/org/documents/epl-v10.php and at http://www.eclipse.org/org/documents/edl-v10.php. This product includes software licensed under the terms at http://www.tcl.tk/software/tcltk/license.html, http://www.bosrup.com/web/overlib/?license, http:// www.stlport.org/doc/ license.html, http://asm.ow2.org/license.html, http://www.cryptix.org/license.txt, http://hsqldb.org/web/hsqllicense.html, http:// httpunit.sourceforge.net/doc/ license.html, http://jung.sourceforge.net/license.txt, http://www.gzip.org/zlib/zlib_license.html, http://www.openldap.org/software/release/ license.html, http://www.libssh2.org, http://slf4j.org/license.html, http://www.sente.ch/software/opensourcelicense.html, http://fusesource.com/downloads/licenseagreements/fuse-message-broker-v-5-3- license-agreement; http://antlr.org/license.html; http://aopalliance.sourceforge.net/; http://www.bouncycastle.org/licence.html; http://www.jgraph.com/jgraphdownload.html; http://www.jcraft.com/jsch/license.txt; http://jotm.objectweb.org/bsd_license.html;. http://www.w3.org/consortium/legal/ 2002/copyright-software-20021231; http://www.slf4j.org/license.html; http://nanoxml.sourceforge.net/orig/copyright.html; http://www.json.org/license.html; http:// forge.ow2.org/projects/javaservice/, http://www.postgresql.org/about/licence.html, http://www.sqlite.org/copyright.html, http://www.tcl.tk/software/tcltk/license.html, http:// www.jaxen.org/faq.html, http://www.jdom.org/docs/faq.html, http://www.slf4j.org/license.html; http://www.iodbc.org/dataspace/iodbc/wiki/iodbc/license; http:// www.keplerproject.org/md5/license.html; http://www.toedter.com/en/jcalendar/license.html; http://www.edankert.com/bounce/index.html; http://www.net-snmp.org/about/

license.html; http://www.openmdx.org/#faq; http://www.php.net/license/3_01.txt; http://srp.stanford.edu/license.txt; http://www.schneier.com/blowfish.html; http:// www.jmock.org/license.html; http://xsom.java.net; http://benalman.com/about/license/; https://github.com/createjs/easeljs/blob/master/src/easeljs/display/bitmap.js; http://www.h2database.com/html/license.html#summary; http://jsoncpp.sourceforge.net/license; http://jdbc.postgresql.org/license.html; http:// protobuf.googlecode.com/svn/trunk/src/google/protobuf/descriptor.proto; https://github.com/rantav/hector/blob/master/license; http://web.mit.edu/kerberos/krb5- current/doc/mitk5license.html; http://jibx.sourceforge.net/jibx-license.html; https://github.com/lyokato/libgeohash/blob/master/license; https://github.com/hjiang/jsonxx/ blob/master/license; https://code.google.com/p/lz4/; https://github.com/jedisct1/libsodium/blob/master/license; http://one-jar.sourceforge.net/index.php? page=documents&file=license; https://github.com/esotericsoftware/kryo/blob/master/license.txt; http://www.scala-lang.org/license.html; https://github.com/tinkerpop/ blueprints/blob/master/license.txt; http://gee.cs.oswego.edu/dl/classes/edu/oswego/cs/dl/util/concurrent/intro.html; https://aws.amazon.com/asl/; https://github.com/ twbs/bootstrap/blob/master/license; https://sourceforge.net/p/xmlunit/code/head/tree/trunk/license.txt; https://github.com/documentcloud/underscore-contrib/blob/ master/license, and https://github.com/apache/hbase/blob/master/license.txt. 
This product includes software licensed under the Academic Free License (http://www.opensource.org/licenses/afl-3.0.php), the Common Development and Distribution License (http://www.opensource.org/licenses/cddl1.php) the Common Public License (http://www.opensource.org/licenses/cpl1.0.php), the Sun Binary Code License Agreement Supplemental License Terms, the BSD License (http:// www.opensource.org/licenses/bsd-license.php), the new BSD License (http://opensource.org/ licenses/bsd-3-clause), the MIT License (http://www.opensource.org/licenses/mit-license.php), the Artistic License (http://www.opensource.org/licenses/artisticlicense-1.0) and the Initial Developer s Public License Version 1.0 (http://www.firebirdsql.org/en/initial-developer-s-public-license-version-1-0/). This product includes software copyright 2003-2006 Joe WaInes, 2006-2007 XStream Committers. All rights reserved. Permissions and limitations regarding this software are subject to terms available at http://xstream.codehaus.org/license.html. This product includes software developed by the Indiana University Extreme! Lab. For further information please visit http://www.extreme.indiana.edu/. This product includes software Copyright (c) 2013 Frank Balluffi and Markus Moeller. All rights reserved. Permissions and limitations regarding this software are subject to terms of the MIT license. See patents at https://www.informatica.com/legal/patents.html. DISCLAIMER: Informatica LLC provides this documentation "as is" without warranty of any kind, either express or implied, including, but not limited to, the implied warranties of noninfringement, merchantability, or use for a particular purpose. Informatica LLC does not warrant that this software or documentation is error free. The information provided in this software or documentation may include technical inaccuracies or typographical errors. The information in this software and documentation is subject to change at any time without notice. 
NOTICES This Informatica product (the "Software") includes certain drivers (the "DataDirect Drivers") from DataDirect Technologies, an operating company of Progress Software Corporation ("DataDirect") which are subject to the following terms and conditions: 1. THE DATADIRECT DRIVERS ARE PROVIDED "AS IS" WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NON-INFRINGEMENT. 2. IN NO EVENT WILL DATADIRECT OR ITS THIRD PARTY SUPPLIERS BE LIABLE TO THE END-USER CUSTOMER FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, CONSEQUENTIAL OR OTHER DAMAGES ARISING OUT OF THE USE OF THE ODBC DRIVERS, WHETHER OR NOT INFORMED OF THE POSSIBILITIES OF DAMAGES IN ADVANCE. THESE LIMITATIONS APPLY TO ALL CAUSES OF ACTION, INCLUDING, WITHOUT LIMITATION, BREACH OF CONTRACT, BREACH OF WARRANTY, NEGLIGENCE, STRICT LIABILITY, MISREPRESENTATION AND OTHER TORTS. The information in this documentation is subject to change without notice. If you find any problems in this documentation, please report them to us in writing at Informatica LLC 2100 Seaport Blvd. Redwood City, CA 94063. INFORMATICA LLC PROVIDES THE INFORMATION IN THIS DOCUMENT "AS IS" WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING WITHOUT ANY WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND ANY WARRANTY OR CONDITION OF NON- INFRINGEMENT. Publication Date: 2017-01-09

Table of Contents

Preface
    Informatica Resources
        Informatica Network
        Informatica Knowledge Base
        Informatica Documentation
        Informatica Product Availability Matrixes
        Informatica Velocity
        Informatica Marketplace
        Informatica Global Customer Support
Part I: Test Data Management 10.1.0
    Chapter 1: New Features and Enhancements (10.1.0)
        New Features and Enhancements for Version 10.1.0
            Data Generation Rules
            Blaze Execution Engine
            Avro and Parquet Data Sources
            Single Masking Rule Assignment
            Advanced Search Text Filter
            Oracle Data Source Import
            Data Coverage Analysis
            View and Edit Data in a Data Set
            Related Data Sets
            Subset Data Set Commands
            Pre Workflow and Post Workflow Plan Settings
    Chapter 2: Changes (10.1.0)
        Changes for Version 10.1.0
            Configure Parent Child Ratio
            TDM Repository Configuration
            Test Data Warehouse Service
            Roles and Privileges for Test Data Warehouse
            Repository and Data Mart Tab
            infacmd Command Line Program
            Data Set Reset Configuration
            tdwquery
            tdwcmd
            Templates for Data Subset Objects

Part II: Test Data Management 9.7.1 HotFix 1
    Chapter 3: New Features and Enhancements (9.7.1 HotFix 1)
        New Features and Enhancements for Version 9.7.1 HotFix 1
            Support for Cassandra and MongoDB Connections
            Bulk Load in Netezza Connections
    Chapter 4: Changes (9.7.1 HotFix 1)
        Changes for Version 9.7.1 HotFix 1
            Change to Support in Version 9.7.1 HotFix 1
            Test Data Warehouse
            tdwcmd
            tdwquery
Part III: Test Data Management 9.7.1
    Chapter 5: New Features and Enhancements (9.7.1)
        New Features and Enhancements for Version 9.7.1
            XML Sources
            Session and Workflow Logs
            Naming Convention for Workflows, Mappings, and Sessions
            Import Data Domains From the Model Repository
            Seed Parameterization
            Change Propagation
            JDBC Connection
            Hadoop Data Sources
            Flat Files
            Data Generation Rules
            Add Data Generation Rules to a Data Domain
            Column Profile
            Unique Key Constraints
            Default Settings
    Chapter 6: Changes (9.7.1)
        Changes for Version 9.7.1
            Import Profiles from the Model Repository
            Rule Simulation With Pre- and Post-Processing of Rules
            Disable and Enable Constraints in Plans That Use Connection Parameters
            Default Settings
            Create a Primary Key Constraint

Part IV: Test Data Management 9.7.0
    Chapter 7: New Features and Enhancements (9.7.0)
        New Features and Enhancements for Version 9.7.0
            Test Data Warehouse
            tdwquery
            TDM Support for Hadoop
            Connection Permissions
    Chapter 8: Changes (9.7.0)
        Changes for Version 9.7.0
            Default Value in Reference Lookup Generation
            Override Entity Relations at Run Time

Preface

The Test Data Management Release Guide lists new features and enhancements, behavior changes between versions, and tasks that you might need to complete after you upgrade from a previous version.

Informatica Resources

Informatica Network

Informatica Network hosts Informatica Global Customer Support, the Informatica Knowledge Base, and other product resources. To access Informatica Network, visit https://network.informatica.com. As a member, you can:

- Access all of your Informatica resources in one place.
- Search the Knowledge Base for product resources, including documentation, FAQs, and best practices.
- View product availability information.
- Review your support cases.
- Find your local Informatica User Group Network and collaborate with your peers.

Informatica Knowledge Base

Use the Informatica Knowledge Base to search Informatica Network for product resources such as documentation, how-to articles, best practices, and PAMs. To access the Knowledge Base, visit https://kb.informatica.com. If you have questions, comments, or ideas about the Knowledge Base, contact the Informatica Knowledge Base team at KB_Feedback@informatica.com.

Informatica Documentation

To get the latest documentation for your product, browse the Informatica Knowledge Base at https://kb.informatica.com/_layouts/productdocumentation/page/productdocumentsearch.aspx. If you have questions, comments, or ideas about this documentation, contact the Informatica Documentation team through email at infa_documentation@informatica.com.

Informatica Product Availability Matrixes

Product Availability Matrixes (PAMs) indicate the versions of operating systems, databases, and other types of data sources and targets that a product release supports. If you are an Informatica Network member, you can access PAMs at https://network.informatica.com/community/informatica-network/product-availability-matrices.

Informatica Velocity

Informatica Velocity is a collection of tips and best practices developed by Informatica Professional Services. Developed from the real-world experience of hundreds of data management projects, Informatica Velocity represents the collective knowledge of our consultants who have worked with organizations from around the world to plan, develop, deploy, and maintain successful data management solutions. If you are an Informatica Network member, you can access Informatica Velocity resources at http://velocity.informatica.com. If you have questions, comments, or ideas about Informatica Velocity, contact Informatica Professional Services at ips@informatica.com.

Informatica Marketplace

The Informatica Marketplace is a forum where you can find solutions that augment, extend, or enhance your Informatica implementations. By leveraging any of the hundreds of solutions from Informatica developers and partners, you can improve your productivity and speed up time to implementation on your projects. You can access Informatica Marketplace at https://marketplace.informatica.com.

Informatica Global Customer Support

You can contact a Global Support Center by telephone or through Online Support on Informatica Network. To find your local Informatica Global Customer Support telephone number, visit the Informatica website at the following link: http://www.informatica.com/us/services-and-training/support-services/global-support-centers. If you are an Informatica Network member, you can use Online Support at http://network.informatica.com.

Part I: Test Data Management 10.1.0

This part contains the following chapters:

- New Features and Enhancements (10.1.0)
- Changes (10.1.0)

Chapter 1: New Features and Enhancements (10.1.0)

This chapter includes the following topic:

- New Features and Enhancements for Version 10.1.0

New Features and Enhancements for Version 10.1.0

This section describes the new features and enhancements for version 10.1.0.

Data Generation Rules

To generate test data, you can create the following types of generation rules:

Advanced Generation Rule

An advanced generation rule combines generation techniques to generate test data in target columns based on the configuration of one or more input ports, variable ports, and output ports. You can generate test data for string, numeric, and date data types. When you create an advanced rule, create the input ports, variable ports, and output ports that you need to generate the required data.

Credit Card Generation Rule

Use a credit card generation rule to generate credit card numbers based on the issuing credit card network. You can select the credit card issuer type, specify the starting numbers for the card, and set the distribution percentage. You can generate test data for the following issuing networks: American Express, Discover, JCB, Mastercard, and Visa. You can generate credit card numbers for string data types. You cannot generate credit card numbers for numeric, date, or binary data types.

Conditional Generation Rule

A conditional generation rule is an ad hoc rule in which you specify a conditional expression and a generation rule to generate test data. You can use string, numeric, and date data types in a conditional generation rule. You cannot apply conditions to effective dates or sequence generation rules.

For information about how to create and use data generation rules, see the Test Data Management 10.1.0 User Guide.
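Credit card numbers for a given issuing network conventionally consist of an issuer prefix, random digits, and a trailing Luhn check digit. The guide does not describe Informatica's internal algorithm; the following is an independent sketch of that general technique, with illustrative prefixes and lengths that are assumptions, not TDM configuration.

```python
import random

# Illustrative issuer prefixes and lengths (assumptions for this sketch,
# not Informatica's configuration).
NETWORKS = {
    "Visa": ("4", 16),
    "Mastercard": ("51", 16),
    "American Express": ("37", 15),
}

def luhn_check_digit(partial: str) -> str:
    """Compute the Luhn check digit to append to the digits so far."""
    total = 0
    # The digit immediately left of the check digit is doubled, then every
    # second digit moving left; doubled values above 9 have 9 subtracted.
    for i, ch in enumerate(reversed(partial)):
        d = int(ch)
        if i % 2 == 0:
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return str((10 - total % 10) % 10)

def generate_card_number(network: str) -> str:
    """Generate a Luhn-valid number: prefix + random body + check digit."""
    prefix, length = NETWORKS[network]
    body = prefix + "".join(random.choice("0123456789")
                            for _ in range(length - len(prefix) - 1))
    return body + luhn_check_digit(body)
```

Generated values look structurally valid to downstream validators without being real account numbers, which is the point of masking and generation in test data.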

Blaze Execution Engine

You can select an execution engine to run the mappings for a Hadoop connection. Select the Blaze execution engine at the plan level or from the Administrator Preferences view. TDM prioritizes the plan-level settings when you run the plan in the Hadoop environment. If you select the Blaze execution engine, Blaze uses an internal workflow compiler to run the mapping. Use the Blaze engine to improve the speed and performance of the task.

For information about the Blaze execution engine, see the Test Data Management 10.1.0 User Guide.

Avro and Parquet Data Sources

Avro and Parquet are semi-structured data sources. When you select an HDFS target connection, use the Avro or Parquet resource format to mask data and to move data in groups. You can select the resource format at the plan level.

For information about Avro and Parquet data sources, see the Test Data Management 10.1.0 User Guide.

Single Masking Rule Assignment

You can assign a single masking rule, a policy, or both to multiple columns of a similar data type. You can search for column names with a similar data type, and then assign a single rule or policy to all the selected columns at once instead of assigning it to each column individually.

For information about how to assign a single rule to multiple columns, see the Test Data Management 10.1.0 User Guide.

Advanced Search Text Filter

In Test Data Manager, you can enter text instead of wildcard characters in the filter criteria to search for columns. Based on the type of filter operator, TDM searches and displays the results. You can use the following types of filters: exact match, starts with, ends with, and contains.

For information about how to use the advanced search text filter, see the Test Data Management 10.1.0 User Guide.

Oracle Data Source Import

If you import data sources from an Oracle database, you can choose to ignore the tables that do not contain any data. Test Data Manager excludes all the empty tables in the Oracle database and lists the tables that contain data.

For information about how to import an Oracle data source without empty tables, see the Test Data Management 10.1.0 User Guide.
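The four filter operators in the advanced search text filter map to simple string predicates. The sketch below is illustrative, not TDM's implementation; it assumes a case-insensitive match, which the text does not specify.

```python
def match_column(name, term, operator):
    """Apply one of the advanced-search filter operators to a column name.
    Assumes case-insensitive matching (an illustrative assumption)."""
    name, term = name.lower(), term.lower()
    if operator == "exact match":
        return name == term
    if operator == "starts with":
        return name.startswith(term)
    if operator == "ends with":
        return name.endswith(term)
    if operator == "contains":
        return term in name
    raise ValueError(f"unknown operator: {operator}")

columns = ["CUST_ID", "CUST_NAME", "ACCOUNT_ID"]
matches = [c for c in columns if match_column(c, "cust", "starts with")]
```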

Data Coverage Analysis

You can create data coverage tasks to analyze test data in a data set. Create a data coverage task to analyze values in a combination of two columns. You can add up to two filter columns to configure the scope of the analysis. Change the combinations of data to verify that a data set contains the data required for specific test cases. You can update data across combinations of columns to add data where the data density might not be sufficient.

For information about how to perform data coverage analysis, see the Test Data Management 10.1.0 User Guide.

View and Edit Data in a Data Set

You can view and edit the data in a data set from Test Data Manager. View and manage data from the data set page. You can add, delete, or update data in the data set. You can use SQL queries to perform these tasks or directly update the data in the grid. Add row-level tags to the data set to track changes or mark specific rows. You can perform a keyword search based on row-level tags.

For information about how to view and edit data in a data set, see the Test Data Management 10.1.0 User Guide.

Related Data Sets

You can create a related data set from an existing data set. The related data set contains a subset of the data in the original data set. Create a related data set when you want to separately store and use a specific portion of the data in a data set.

For information about how to create a related data set, see the Test Data Management 10.1.0 User Guide.

Subset Data Set Commands

To create a subset of a data set, you can use the following ilmcmd commands:

ListPlans. Lists all the plans within a data set version.
TDWPlanGenerate. Generates a workflow to create a subset of a data set.
TDWPlanExecute. Runs a plan to create a subset of a data set.
TDWPlanGenExe. Generates and runs a data set subset plan from a data set version.

For information about how to run the commands, see the Test Data Management 10.1.0 User Guide.
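The data coverage analysis described above checks whether a data set contains the two-column value combinations that specific test cases require. A minimal sketch of the idea follows; the function, data, and required-value lists are illustrative, not TDM's implementation.

```python
from itertools import product

def coverage_report(rows, col_a, col_b, required_a, required_b):
    """Report which (col_a, col_b) value combinations are present in a data set
    and which required combinations are missing. Illustrative sketch only."""
    present = {(r[col_a], r[col_b]) for r in rows}
    required = set(product(required_a, required_b))
    return {"covered": present & required, "missing": required - present}

rows = [
    {"country": "US", "plan": "gold"},
    {"country": "US", "plan": "silver"},
    {"country": "DE", "plan": "gold"},
]
report = coverage_report(rows, "country", "plan", ["US", "DE"], ["gold", "silver"])
```

Here the report flags ("DE", "silver") as a combination with no data, which is the kind of gap you would fill by updating data across column combinations.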
Pre-Workflow and Post-Workflow Plan Settings

In a TDM plan, you can select workflows that you want to run before and after the current workflow. Configure the pre-workflow parameter if you want to run a specific workflow before you run the workflow from the current plan. Configure the post-workflow parameter if you want to run a specific workflow after you run the workflow from the current plan.

For information about how to configure and use pre-workflow and post-workflow parameters, see the Test Data Management 10.1.0 User Guide.

Chapter 2: Changes (10.1.0)

This chapter includes the following topic:

Changes for Version 10.1.0

Changes for Version 10.1.0

This section describes the changes for version 10.1.0.

Configure Parent-Child Ratio

Effective in version 10.1.0, you can configure the number of child records individually for entities and for XML elements to generate test data. You can select a relationship in an entity and specify the minimum and maximum number of child records to generate test data. To generate test data from XSD metadata, select a link between the XML elements and attributes and specify the minimum and maximum number of child records for each parent. When you run a data generation plan, TDM prioritizes the parent-child ratio configured at the individual relationship level over the plan-level configuration.

Previously, you could specify the minimum and maximum number of child records only at the plan level and at the global preferences level to generate test data.

For information about how to configure the parent-child ratio, see the Test Data Management 10.1.0 User Guide.

TDM Repository Configuration

Effective in version 10.1.0, you cannot use the Custom database type to configure the TDM repository. You can use the Oracle, Microsoft SQL Server, or IBM DB2 database types to configure the TDM repository.

Previously, you could use the Custom database type to configure the TDM repository with custom database drivers. If you upgrade from a previous version with a TDM repository that uses custom JDBC drivers, you must delete the Test Data Manager Service and create the TDM repository with native drivers after you upgrade.

For information about how to configure the TDM repository, see the Test Data Management 10.1.0 Installation Guide.
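The parent-child ratio precedence described above — relationship-level minimum and maximum override the plan-level values — can be sketched as follows. The data structures are illustrative, not TDM's.

```python
import random

def child_count(relationship_cfg, plan_cfg, rng=random):
    """Pick the number of child records to generate for one parent row.
    Relationship-level settings take priority over plan-level settings,
    mirroring the precedence described in the text."""
    cfg = relationship_cfg if relationship_cfg is not None else plan_cfg
    return rng.randint(cfg["min"], cfg["max"])

plan = {"min": 1, "max": 5}
per_relationship = {"min": 2, "max": 3}

# The relationship-level range wins when it is configured.
counts = [child_count(per_relationship, plan) for _ in range(100)]
```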

Test Data Warehouse Service

Effective in version 10.1.0, the test data warehouse is available as a managed service in the Informatica domain. Create and administer the Test Data Warehouse Service from the Administrator tool to create the test data warehouse repository and repository content. Alternatively, use the infacmd tdw command line program to create and manage the Test Data Warehouse Service.

Previously, the Test Data Manager Service managed the test data warehouse. If you upgrade from a previous version, you must use the same test data warehouse repository and test data warehouse information when you create the Test Data Warehouse Service.

For information about how to create a Test Data Warehouse Service, see the Test Data Management 10.1.0 Installation Guide.

Roles and Privileges for Test Data Warehouse

Effective in version 10.1.0, you require Test Data Warehouse Service privileges to perform tasks in the test data warehouse. You can assign a Test Engineer role specific to the test data warehouse. Test Data Warehouse Service privileges include the following data set privileges:

View data set
Manage data set
Reset data set
View data in data set
Manage data in data set

The Test Engineer role includes all the data set privileges.

Previously, you required specific Test Data Manager Service privileges to perform test data warehouse tasks.

Repository and Data Mart Tab

Effective in version 10.1.0, the Administrator Repository and Data Mart tab is not available in Test Data Manager. You configure a test data warehouse repository and test data warehouse when you create the Test Data Warehouse Service in Informatica Administrator.

Previously, you created a test data warehouse repository and test data warehouse from the Administrator Repository and Data Mart tab in Test Data Manager.
infacmd Command Line Program

Effective in version 10.1.0, the infacmd command line program contains the following changes:

You run the infacmd tdw command line program to create and administer the Test Data Warehouse Service. Previously, you used the tdwcmd command line program to create the test data warehouse repository and the repository content.

The infacmd tdm CreateService parameters are changed. The following parameters are removed:

customdriver
httpprotocoltype
EnableTDWService

The following parameters are added:

TDWServiceName
hadoopdistdir
hadoopkerbspn
hadoopkerbkeytab
SSLProtocol

For information about how to run the infacmd command line program, see the Test Data Management 10.1.0 Installation Guide.

Data Set Reset Configuration

Effective in version 10.1.0, you can configure a reset operation to perform an insert in master tables instead of an upsert. You can use this configuration if all tables in the data set are master tables. The reset operation might take less time to run because the reset directly inserts all records from the data set instead of first checking whether any records exist in the target.

Previously, you could not change the default reset configuration, which is to perform an upsert on master tables.

For information about how to configure a reset operation, see the Test Data Management 10.1.0 User Guide.

tdwquery

Effective in version 10.1.0, the tdwquery properties file contains the properties TDWSERVICENAME and TDW Service URL. These are the name of the Test Data Warehouse Service that manages the test data warehouse and the URL of the Test Data Warehouse Service.

Previously, you entered the name of the Test Data Manager Service and the Test Data Manager Service URL. If you upgrade from a previous version, update the userconfig.ilm properties file with the name and URL of the Test Data Warehouse Service before you use tdwquery in 10.1.0.

For information about how to configure the userconfig.ilm file and use tdwquery, see the Test Data Management 10.1.0 User Guide.

tdwcmd

Effective in version 10.1.0, the CreateTDWREPO and CreateTDWMART commands are removed.
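The data set reset change described above trades an upsert for a plain insert when every table in the data set is a master table. The sketch below contrasts the two behaviors; it is illustrative only, and it assumes an insert-only reset writes into an emptied target, which the text does not confirm.

```python
def reset_data_set(target, records, insert_only=False):
    """Reset a target table (a dict keyed by primary key) from a data set
    version. Illustrative sketch of the behavior described in the text."""
    if insert_only:
        # Insert-only reset: assume the target is emptied first, so every
        # record is written without checking for existing rows.
        target.clear()
        target.update(records)
    else:
        # Default upsert: update rows that exist, insert rows that do not,
        # and leave other existing rows untouched.
        for key, row in records.items():
            target[key] = row
    return target

target = {1: {"name": "old"}, 2: {"name": "stale"}}
data_set = {1: {"name": "new"}, 3: {"name": "added"}}
upserted = reset_data_set(dict(target), data_set)
```

The insert-only path skips the per-row existence check, which is why the text says it might run faster.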
The following table lists the tdwcmd commands and parameters that are changed from TDM 9.7.0 to TDM 10.1.0:

TDM 9.7.0 and TDM 9.7.1    TDM 9.7.1 HotFix 1             TDM 10.1.0
TDMServiceName (tsn)       TDMServiceName (tsn)           TDWServiceName (tsn)
MartName (mn)              TestDataWarehouseName (twn)    TestDataWarehouseName (twn)
DataMart                   TestDataWarehouse              TestDataWarehouse
MartName (mn)              TestDataWarehouseName (twn)    TestDataWarehouseName (twn)
VersionMartName (vmn)      TestDataWarehouseName (twn)    TestDataWarehouseName (twn)
TableMartName (tmn)        TestDataWarehouseName (twn)    TestDataWarehouseName (twn)

Effective in version 10.1.0, the HostName, HTTPPort, HTTPSPort, and DomainName parameters connect to Test Data Warehouse hosts, ports, and domains respectively. Previously, these parameters connected to Test Data Manager hosts, ports, and domains.

For information on tdwcmd commands and parameters, see the Test Data Management 10.1.0 User Guide.

Templates for Data Subset Objects

Effective in version 10.1.0, you cannot create a template to add entities and groups. You can use and edit existing templates, but you cannot create new templates.

Previously, you could create a template to add multiple entities and groups and use the template in a plan.

Part II: Test Data Management 9.7.1 HotFix 1

This part contains the following chapters:

New Features and Enhancements (9.7.1 HotFix 1)
Changes (9.7.1 HotFix 1)

Chapter 3: New Features and Enhancements (9.7.1 HotFix 1)

This chapter includes the following topic:

New Features and Enhancements for Version 9.7.1 HotFix 1

New Features and Enhancements for Version 9.7.1 HotFix 1

This section describes the new features and enhancements for version 9.7.1 HotFix 1.

Support for Cassandra and MongoDB Connections

You can use Cassandra and MongoDB connections to perform data domain discovery and data masking operations. In Test Data Manager, use ODBC connections to connect to Cassandra and MongoDB. You can use value cascades for Cassandra and MongoDB data sources. You cannot perform data subset and data generation operations, and you cannot use auto cascades for Cassandra and MongoDB data sources.

For information about Cassandra and MongoDB data types, see the Test Data Management 9.7.1 HotFix 1 User Guide.

Bulk Load in Netezza Connections

You can use the bulk reader and bulk writer to read and write data from a Netezza source and target.

For information about how to configure the bulk reader and bulk writer properties for a Netezza connection, see the Test Data Management 9.7.1 HotFix 1 Administrator Guide.

Chapter 4: Changes (9.7.1 HotFix 1)

This chapter includes the following topic:

Changes for Version 9.7.1 HotFix 1

Changes for Version 9.7.1 HotFix 1

This section describes the changes for version 9.7.1 HotFix 1.

Change to Support in Version 9.7.1 HotFix 1

Effective in version 9.7.1 HotFix 1, TDM deferred support for Hadoop connections. Support will be resumed with version 10.1.0.

Test Data Warehouse

Effective in version 9.7.1 HotFix 1, the naming convention used for the test data warehouse databases is changed. The database that stores the source table metadata and the project metadata generated when you create a data set is called the test data warehouse repository. Previously, it was called the test data repository. The database that stores the source data that you include in a data set is called the test data warehouse. Previously, it was called the test data mart.

tdwcmd

Effective in version 9.7.1 HotFix 1, you must use the command line scripts with the updated commands and parameters. The following tdwcmd commands and parameters are changed:

TDM 9.7.1                  TDM 9.7.1 HotFix 1
CreateTDWMART              CreateTestDataWarehouse
MartName (mn)              TestDataWarehouseName (twn)
DataMart                   TestDataWarehouse
MartName (mn)              TestDataWarehouseName (twn)
VersionMartName (vmn)      TestDataWarehouseName (twn)
TableMartName (tmn)        TestDataWarehouseName (twn)

For information on tdwcmd commands and parameters, see the Test Data Management 9.7.1 HotFix 1 Administrator Guide and Test Data Management 9.7.1 HotFix 1 User Guide.

tdwquery

Effective in version 9.7.1 HotFix 1, the tdwquery properties file contains a property TESTDATAWAREHOUSENAME. This is the name of the test data warehouse where you store the data set. Previously, the property was called MARTNAME. Update the userconfig.ilm properties file before you use tdwquery in 9.7.1 HotFix 1.

Part III: Test Data Management 9.7.1

This part contains the following chapters:

New Features and Enhancements (9.7.1)
Changes (9.7.1)

Chapter 5: New Features and Enhancements (9.7.1)

This chapter includes the following topic:

New Features and Enhancements for Version 9.7.1

New Features and Enhancements for Version 9.7.1

This section describes the new features and enhancements for version 9.7.1.

XML Sources

You can perform data subset, data masking, and data generation operations on XML sources. Import the source definition from the PowerCenter repository. Apply subset criteria, data masking rules, or data generation rules to perform the required operation.

For information about how to perform TDM operations on XML sources, see the Test Data Management 9.7.1 User Guide.

Session and Workflow Logs

You can configure the workflow and session log properties when you create a plan. Edit the Log Settings section when you configure a plan. Choose to create workflow logs by runs or by time stamp. Choose to create a log file in addition to binary logs for log events. Configure properties for session and partial session log files.

For information about how to configure session and workflow logs in a plan, see the Test Data Management 9.7.1 User Guide.

Naming Convention for Workflows, Mappings, and Sessions

You can customize the naming convention for workflows, mappings, and sessions that you create in TDM. Create a naming format to standardize the names of workflows, mappings, and sessions. When you customize the naming format, all workflows, mappings, and sessions that TDM creates use the same format.

For information about how to customize the naming format, see the Test Data Management 9.7.1 Administrator Guide.
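A naming format of the kind described above is essentially a template with placeholders. The sketch below uses invented placeholder names (type, project, timestamp); TDM's actual format tokens are not documented here.

```python
def build_name(fmt, **parts):
    """Build a workflow, mapping, or session name from a naming format.
    The placeholder names are illustrative assumptions."""
    return fmt.format(**parts)

# One format, applied consistently, standardizes every generated name.
fmt = "{type}_{project}_{timestamp}"
wf_name = build_name(fmt, type="wf", project="HR_Subset", timestamp="20160401")
s_name = build_name(fmt, type="s", project="HR_Subset", timestamp="20160401")
```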

Import Data Domains From the Model Repository

You can import data domains from the Model repository into the TDM repository. The TDM repository must contain the data domains listed in the profiles that you import from the Model repository. Import the data domains from the Model repository before you import the profile into TDM.

For information about how to import a data domain from Test Data Manager, see the Test Data Management 9.7.1 User Guide.

Seed Parameterization

You can enter the seed value as a parameter when you create a masking rule that requires repeatable output. Add a variable for the seed value to the parameter file and enter the variable when you create the masking rule. If you require a different seed value, you can edit the seed value in the parameter file and then run the plan again. You do not need to edit the masking rule.

For information about how to use seed values in data masking rules, see the Test Data Management 9.7.1 User Guide.

Change Propagation

When you update or delete policies, data domains, masking rules, generation rules, and data subset components, a warning message appears with the list of affected objects that contain assignments. You can either cancel the updates or continue with the changes. You can download the list of affected objects and save the .csv file. If you update a rule, you change the properties within the rule. The changes do not take effect in the plans that contain the rule. You must fix the changes in the plan, and then generate and run the plan again. If you change the name or type of a field of a custom rule, the assignments do not contain the changes. You must import the mapplet again.

For information about change propagation, see the Test Data Management 9.7.1 User Guide.

JDBC Connection

You can use the JDBC connection type to create a JDBC target connection. Use this connection when your target is a MySQL database and you want to disable and enable constraints in a plan.

Hadoop Data Sources

You can enable Kerberos authentication for Hadoop data sources. You can use auto cascades and value cascades for Hadoop data sources. You can apply shuffle, substitution, mapplet, and advanced masking rules to mask Hadoop data sources. You cannot use advanced email masking and unique substitution rules to mask Hadoop data.

For information about the data masking techniques, see the Test Data Management 9.7.1 User Guide.
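The seed parameterization described earlier depends on repeatable masking: the same input value and seed always produce the same masked output, and changing the seed in the parameter file changes every output without touching the rule. The sketch below illustrates that property only; it is not TDM's masking algorithm.

```python
import hashlib

def mask_value(value, seed, alphabet="ABCDEFGHIJKLMNOPQRSTUVWXYZ"):
    """Repeatable substitution driven by a seed. The same (value, seed) pair
    always yields the same output. Illustrative sketch, not TDM's algorithm."""
    digest = hashlib.sha256(f"{seed}:{value}".encode()).digest()
    return "".join(alphabet[b % len(alphabet)] for b in digest[:len(value)])

masked_a = mask_value("SMITH", seed=42)
masked_b = mask_value("SMITH", seed=42)   # same seed: identical output
masked_c = mask_value("SMITH", seed=99)   # new seed: different output, no rule edit
```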

Flat Files

You can configure flat file properties such as file encoding, file format, optional quotes, and headers. If the source is relational, you can specify the file encoding type for the target flat file. If the source is a flat file, the default file encoding type of the target flat file is the same as that of the source flat file. You can choose the fixed-width or delimited format for a flat file. You can select an optional quote character to separate columns in a .csv file based on the data: single quotes, double quotes, or no quotes. You can choose to include headers in the target flat file.

For information about the flat file properties, see the Test Data Management 9.7.1 User Guide.

Data Generation Rules

The data generation rules include the following enhancements:

Decimal Values in Percentages
In a data generation rule, you can enter decimal values up to two places in distribution percentages for data patterns, lists of values, null values, and values that are not valid.

Generate Numbers in a String Column
In an ad hoc generation rule, you can generate numbers in a column with the string data type. To generate numbers in the column, select Override Data Type to Numeric when you create an ad hoc generation rule.

Start Sequence from Last Value
In a sequence generation rule, you can choose to start the data sequence from the last output value. When you run a plan again, TDM generates the sequence of data from the previous output value. You can choose to start the sequence from the last value when you want to generate primary and unique keys.

For information about the data generation rules, see the Test Data Management 9.7.1 User Guide.

Add Data Generation Rules to a Data Domain

Assign one or more data generation rules to a data domain. When you assign a generation rule to a data domain, the columns in the domain receive the data generation rule when you configure data generation. When you add data generation rules to a data domain, you can enable one of the rules to be the default rule. The default rule applies to all columns in the data domain.

For information about how to add data generation rules to a data domain, see the Test Data Management 9.7.1 User Guide.

Column Profile

Create and run a column profile to determine the data source column characteristics, such as a list of values, data patterns, and value ranges. When you create a column profile, you can select the columns on which you want to run the profile, and configure data sampling options. After you configure and run a column profile, TDM analyzes the column data and infers rules based on the data type. Column data types include all the inferred data types for each column in the profile results. Use the column profile results to generate test data.

Auto rules are the inferred rules that you assign after you run a column profile. When the status of an inferred rule is Yes, you can assign the auto rule to target columns to generate test data.

For information about how to create a column profile, see the Test Data Management 9.7.1 User Guide.

Unique Key Constraints

Add a column or a set of columns as a constraint to define a unique key in a table. A unique key column can contain null values. You can create more than one unique key in a table. You can perform multiple data generation assignments when you filter the columns based on unique keys.

For information about how to create a unique key constraint, see the Test Data Management 9.7.1 User Guide.

Default Settings

Use default settings to generate test data for the columns to which you have not assigned specific generation rules. When you create a project and import target metadata, the project contains the default generation rules for all data types that a TDM administrator configures. In a project, you can view the default data generation rules, default primary key generation rules, and default unique key generation rules for all data types.

For information about how to configure default settings to generate test data, see the Test Data Management 9.7.1 User Guide.
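One reading of the unique key behavior above — uniqueness enforced only over rows whose key columns are fully populated — can be sketched as follows. This is illustrative; how duplicate nulls are treated is a database-specific detail the text does not settle.

```python
def violates_unique_key(rows, key_columns):
    """Check a unique key constraint over a list of row dicts. Rows where any
    key column is NULL (None) are exempt from the check, an assumption
    modeled on the note that unique key columns can contain null values."""
    seen = set()
    for row in rows:
        key = tuple(row.get(c) for c in key_columns)
        if any(v is None for v in key):
            continue  # null-bearing keys do not participate in uniqueness
        if key in seen:
            return True
        seen.add(key)
    return False

rows = [
    {"email": "a@x.com"},
    {"email": None},
    {"email": None},  # duplicate NULLs are tolerated under this reading
]
ok = not violates_unique_key(rows, ["email"])
```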

Chapter 6: Changes (9.7.1)

This chapter includes the following topic:

Changes for Version 9.7.1

Changes for Version 9.7.1

This section describes the changes for version 9.7.1.

Import Profiles from the Model Repository

Effective in version 9.7.1, you can import profiles with curation information directly from the Model repository in Test Data Manager.

Previously, you used Informatica Developer to export the profile as an XML file and then imported the XML file in Test Data Manager. You could not import the curation information.

For information about how to import a profile in Test Data Manager, see the Test Data Management 9.7.1 User Guide.

Rule Simulation With Pre- and Post-Processing of Rules

Effective in version 9.7.1, you can also perform rule simulation with pre- and post-processing of rules for columns with the Double and Date data types.

Previously, you could perform rule simulation with pre- and post-processing of rules only for columns with the String data type.

Disable and Enable Constraints in Plans That Use Connection Parameters

Effective in version 9.7.1, you can choose to disable constraints when you run a plan that uses connection parameters.

Previously, you could not disable constraints and indexes when you used connection parameters in a plan.

Default Settings

Effective in version 9.7.1, you can view the default data generation rules, default primary key generation rules, and default unique key generation rules for all data types in the Administrator view.

Previously, you could view only the default data generation rules in the Administrator view.

Create a Primary Key Constraint

Effective in version 9.7.1, you can create a primary key constraint from the Tables Constraints view.

Previously, you created a logical primary key from the Tables Columns view.

Part IV: Test Data Management 9.7.0

This part contains the following chapters:

New Features and Enhancements (9.7.0)
Changes (9.7.0)

Chapter 7: New Features and Enhancements (9.7.0)

This chapter includes the following topic:

New Features and Enhancements for Version 9.7.0

New Features and Enhancements for Version 9.7.0

This section describes the new features and enhancements for version 9.7.0.

Test Data Warehouse

Configure a test data warehouse to store and manage multiple versions of test data in a central location. Store versions of test data from relational sources as versions of a data set. A test data warehouse consists of a test data repository and a test data mart. The test data repository stores the metadata related to the data sets. The test data mart stores the data. You can use a single test data repository and a single test data mart for multiple machines in the domain that host a Test Data Manager Service. You can share and reuse test data across teams and projects. To return a test environment to a specific state, you can reset a data set version from the test data mart to a target. To search the test data repository, you can add tags to the data set versions.

For information about how to configure a test data repository and test data mart, see the Test Data Management 9.7.0 Administrator Guide. For information about how to create a data set in the test data mart, see the Test Data Management 9.7.0 User Guide.

tdwquery

tdwquery is a command line program that you can use to analyze data in a data set version. Store multiple versions of test data in the test data mart and then use tdwquery to run queries to analyze the data. You can use the Select and Select Distinct clauses with multiple functions and joins to filter and search for data when you run queries.

For information on how to configure and use tdwquery, see the Test Data Management 9.7.0 User Guide.
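The store, tag, and reset workflow described above can be reduced to a small sketch: keep every version of a data set, find versions by tag, and hand back a version's rows to restore a target. This is an in-memory stand-in, not TDM's implementation.

```python
class TestDataWarehouse:
    """Minimal sketch of versioned test data storage with tags and reset."""

    def __init__(self):
        self.versions = []  # list of (tags, rows) tuples

    def store(self, rows, tags=()):
        """Store a new version of the data set; returns a 1-based version number."""
        self.versions.append((set(tags), list(rows)))
        return len(self.versions)

    def find_by_tag(self, tag):
        """Return the version numbers that carry the given tag."""
        return [i + 1 for i, (tags, _) in enumerate(self.versions) if tag in tags]

    def reset(self, version):
        """Return the rows of a version, to be written back to a target,
        returning the test environment to that state."""
        return list(self.versions[version - 1][1])

tdw = TestDataWarehouse()
v1 = tdw.store([{"id": 1}], tags=["baseline"])
v2 = tdw.store([{"id": 1}, {"id": 2}], tags=["regression"])
restored = tdw.reset(v1)
```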