Informatica (Version 10.1) Live Data Map Administrator Guide





2 Informatica Live Data Map Administrator Guide Version 10.1 June 2016 Copyright (c) Informatica LLC. All rights reserved. This software and documentation contain proprietary information of Informatica LLC and are provided under a license agreement containing restrictions on use and disclosure and are also protected by copyright law. Reverse engineering of the software is prohibited. No part of this document may be reproduced or transmitted in any form, by any means (electronic, photocopying, recording or otherwise) without prior consent of Informatica LLC. This Software may be protected by U.S. and/or international Patents and other Patents Pending. Use, duplication, or disclosure of the Software by the U.S. Government is subject to the restrictions set forth in the applicable software license agreement and as provided in DFARS (a) and (a) (1995), DFARS (1)(ii) (OCT 1988), FAR (a) (1995), FAR , or FAR (ALT III), as applicable. The information in this product or documentation is subject to change without notice. If you find any problems in this product or documentation, please report them to us in writing. Informatica, Informatica Platform, Informatica Data Services, PowerCenter, PowerCenterRT, PowerCenter Connect, PowerCenter Data Analyzer, PowerExchange, PowerMart, Metadata Manager, Informatica Data Quality, Informatica Data Explorer, Informatica B2B Data Transformation, Informatica B2B Data Exchange Informatica On Demand, Informatica Identity Resolution, Informatica Application Information Lifecycle Management, Informatica Complex Event Processing, Ultra Messaging, Informatica Master Data Management, and Live Data Map are trademarks or registered trademarks of Informatica LLC in the United States and in jurisdictions throughout the world. All other company and product names may be trade names or trademarks of their respective owners. 
Portions of this software and/or documentation are subject to copyright held by third parties, including without limitation: Copyright DataDirect Technologies. All rights reserved. Copyright Sun Microsystems. All rights reserved. Copyright RSA Security Inc. All Rights Reserved. Copyright Ordinal Technology Corp. All rights reserved. Copyright Aandacht c.v. All rights reserved. Copyright Genivia, Inc. All rights reserved. Copyright Isomorphic Software. All rights reserved. Copyright Meta Integration Technology, Inc. All rights reserved. Copyright Intalio. All rights reserved. Copyright Oracle. All rights reserved. Copyright Adobe Systems Incorporated. All rights reserved. Copyright DataArt, Inc. All rights reserved. Copyright ComponentSource. All rights reserved. Copyright Microsoft Corporation. All rights reserved. Copyright Rogue Wave Software, Inc. All rights reserved. Copyright Teradata Corporation. All rights reserved. Copyright Yahoo! Inc. All rights reserved. Copyright Glyph & Cog, LLC. All rights reserved. Copyright Thinkmap, Inc. All rights reserved. Copyright Clearpace Software Limited. All rights reserved. Copyright Information Builders, Inc. All rights reserved. Copyright OSS Nokalva, Inc. All rights reserved. Copyright Edifecs, Inc. All rights reserved. Copyright Cleo Communications, Inc. All rights reserved. Copyright International Organization for Standardization All rights reserved. Copyright ejtechnologies GmbH. All rights reserved. Copyright Jaspersoft Corporation. All rights reserved. Copyright International Business Machines Corporation. All rights reserved. Copyright yworks GmbH. All rights reserved. Copyright Lucent Technologies. All rights reserved. Copyright (c) University of Toronto. All rights reserved. Copyright Daniel Veillard. All rights reserved. Copyright Unicode, Inc. Copyright IBM Corp. All rights reserved. Copyright MicroQuill Software Publishing, Inc. All rights reserved. Copyright PassMark Software Pty Ltd. All rights reserved. 
Copyright LogiXML, Inc. All rights reserved. Copyright Lorenzi Davide, All rights reserved. Copyright Red Hat, Inc. All rights reserved. Copyright The Board of Trustees of the Leland Stanford Junior University. All rights reserved. Copyright EMC Corporation. All rights reserved. Copyright Flexera Software. All rights reserved. Copyright Jinfonet Software. All rights reserved. Copyright Apple Inc. All rights reserved. Copyright Telerik Inc. All rights reserved. Copyright BEA Systems. All rights reserved. Copyright PDFlib GmbH. All rights reserved. Copyright Orientation in Objects GmbH. All rights reserved. Copyright Tanuki Software, Ltd. All rights reserved. Copyright Ricebridge. All rights reserved. Copyright Sencha, Inc. All rights reserved. Copyright Scalable Systems, Inc. All rights reserved. Copyright jqwidgets. All rights reserved. Copyright Tableau Software, Inc. All rights reserved. Copyright MaxMind, Inc. All Rights Reserved. Copyright TMate Software s.r.o. All rights reserved. Copyright MapR Technologies Inc. All rights reserved. Copyright Amazon Corporate LLC. All rights reserved. Copyright Highsoft. All rights reserved. Copyright Python Software Foundation. All rights reserved. Copyright BeOpen.com. All rights reserved. Copyright CNRI. All rights reserved. This product includes software developed by the Apache Software Foundation ( and/or other software which is licensed under various versions of the Apache License (the "License"). You may obtain a copy of these Licenses at Unless required by applicable law or agreed to in writing, software distributed under these Licenses is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the Licenses for the specific language governing permissions and limitations under the Licenses. 
This product includes software which was developed by Mozilla ( software copyright The JBoss Group, LLC, all rights reserved; software copyright by Bruno Lowagie and Paulo Soares and other software which is licensed under various versions of the GNU Lesser General Public License Agreement, which may be found at The materials are provided free of charge by Informatica, "as-is", without warranty of any kind, either express or implied, including but not limited to the implied warranties of merchantability and fitness for a particular purpose. The product includes ACE(TM) and TAO(TM) software copyrighted by Douglas C. Schmidt and his research group at Washington University, University of California, Irvine, and Vanderbilt University, Copyright ( ) , all rights reserved. This product includes software developed by the OpenSSL Project for use in the OpenSSL Toolkit (copyright The OpenSSL Project. All Rights Reserved) and redistribution of this software is subject to terms available at and This product includes Curl software which is Copyright , Daniel Stenberg, <daniel@haxx.se>. All Rights Reserved. Permissions and limitations regarding this software are subject to terms available at Permission to use, copy, modify, and distribute this software for any purpose with or without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies. The product includes software copyright ( ) MetaStuff, Ltd. All Rights Reserved. Permissions and limitations regarding this software are subject to terms available at license.html. The product includes software copyright , The Dojo Foundation. All Rights Reserved. Permissions and limitations regarding this software are subject to terms available at This product includes ICU software which is copyright International Business Machines Corporation and others. All rights reserved. 
Permissions and limitations regarding this software are subject to terms available at This product includes software copyright Per Bothner. All rights reserved. Your right to use such materials is set forth in the license which may be found at kawa/software-license.html. This product includes OSSP UUID software which is Copyright 2002 Ralf S. Engelschall, Copyright 2002 The OSSP Project Copyright 2002 Cable & Wireless Deutschland. Permissions and limitations regarding this software are subject to terms available at This product includes software developed by Boost ( or under the Boost software license. Permissions and limitations regarding this software are subject to terms available at / This product includes software copyright University of Cambridge. Permissions and limitations regarding this software are subject to terms available at This product includes software copyright 2007 The Eclipse Foundation. All Rights Reserved. Permissions and limitations regarding this software are subject to terms available at and at

3 This product includes software licensed under the terms at license.html, httpunit.sourceforge.net/doc/ license.html, license.html, license-agreement; /copyright-software ; forge.ow2.org/projects/javaservice/, license.html; protobuf.googlecode.com/svn/trunk/src/google/protobuf/descriptor.proto; current/doc/mitk5license.html; blob/master/license; page=documents&file=license; blueprints/blob/master/license.txt; twbs/bootstrap/blob/master/license; master/license, and This product includes software licensed under the Academic Free License ( the Common Development and Distribution License ( the Common Public License ( the Sun Binary Code License Agreement Supplemental License Terms, the BSD License ( the new BSD License ( licenses/bsd-3-clause), the MIT License ( the Artistic License ( and the Initial Developer s Public License Version 1.0 ( This product includes software copyright Joe WaInes, XStream Committers. All rights reserved. Permissions and limitations regarding this software are subject to terms available at This product includes software developed by the Indiana University Extreme! Lab. For further information please visit This product includes software Copyright (c) 2013 Frank Balluffi and Markus Moeller. All rights reserved. Permissions and limitations regarding this software are subject to terms of the MIT license. See patents at DISCLAIMER: Informatica LLC provides this documentation "as is" without warranty of any kind, either express or implied, including, but not limited to, the implied warranties of noninfringement, merchantability, or use for a particular purpose. Informatica LLC does not warrant that this software or documentation is error free. The information provided in this software or documentation may include technical inaccuracies or typographical errors. The information in this software and documentation is subject to change at any time without notice. 
NOTICES This Informatica product (the "Software") includes certain drivers (the "DataDirect Drivers") from DataDirect Technologies, an operating company of Progress Software Corporation ("DataDirect") which are subject to the following terms and conditions: 1. THE DATADIRECT DRIVERS ARE PROVIDED "AS IS" WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NON-INFRINGEMENT. 2. IN NO EVENT WILL DATADIRECT OR ITS THIRD PARTY SUPPLIERS BE LIABLE TO THE END-USER CUSTOMER FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, CONSEQUENTIAL OR OTHER DAMAGES ARISING OUT OF THE USE OF THE ODBC DRIVERS, WHETHER OR NOT INFORMED OF THE POSSIBILITIES OF DAMAGES IN ADVANCE. THESE LIMITATIONS APPLY TO ALL CAUSES OF ACTION, INCLUDING, WITHOUT LIMITATION, BREACH OF CONTRACT, BREACH OF WARRANTY, NEGLIGENCE, STRICT LIABILITY, MISREPRESENTATION AND OTHER TORTS. Part Number: IN-LDMAG

Table of Contents

Preface
  Informatica Resources
    Informatica Network
    Informatica Knowledge Base
    Informatica Documentation
    Informatica Product Availability Matrixes
    Informatica Velocity
    Informatica Marketplace
    Informatica Global Customer Support

Chapter 1: Introduction to Live Data Map Administration
  Live Data Map Overview
  Live Data Map Architecture
  Live Data Map Administration Overview
  Live Data Map Administration Process
  Accessing Live Data Map Administrator
    Prerequisites
    Log In To Live Data Map Administrator
    Changing the Password

Chapter 2: Live Data Map Concepts
  Live Data Map Concepts Overview
  Catalog
  Resource Type
  Resource
  Scanner
  Schedule
  Business Example

Chapter 3: Using Live Data Map Administrator
  Live Data Map Administrator Overview
  Start Workspace
  Resource Workspace
  Monitoring Workspace
  Library Workspace

Chapter 4: Managing Resources
  Managing Resources Overview
    Resources and Scanners
    Resources and Schedules
    Resources and Attributes
  Resource Type
    Informatica Cloud Service Resource Connection Properties
    Custom Lineage Resource Properties
    Amazon S3 Resource Properties
    HDFS Resource Connection Properties
    Amazon Redshift Resource Connection Properties
    MicroStrategy Resource Connection Properties
    Business Glossary Classification Resource Type Properties
    IBM Cognos Connection Properties
    Tableau Server Properties
    Cloudera Navigator Connection Properties
    Hive Connection Properties
    Informatica Platform Resource Type Properties
    PowerCenter Resource Type Properties
    IBM DB2 Resource Type Properties
    IBM DB2 for z/OS Resource Type Properties
    IBM Netezza Resource Type Properties
    JDBC Resource Type Properties
    SQL Server Resource Type Properties
    Oracle Resource Type Properties
    Sybase Resource Type Properties
    Teradata Resource Type Properties
    SAP Business Objects Resource Type Properties
    Salesforce Resource Type Properties
  Creating a Resource
  Editing a Resource
  Running a Scan on a Resource
  Viewing a Resource

Chapter 5: Managing Schedules
  Managing Schedules Overview
    Schedule Types
    Reusable Schedules
    Custom Schedules
  Creating a Schedule
  Viewing the List of Schedules

Chapter 6: Managing Attributes
  Managing Attributes Overview
    System Attributes
    Custom Attributes
  General Attribute Properties
    Search Configuration Properties
  Editing a System Attribute
  Creating a Custom Attribute

Chapter 7: Assigning Connections
  Assigning Connections Overview
    Auto-assigned Connections
    User-assigned Connections
  Managing Connections

Chapter 8: Configuring Reusable Settings
  Reusable Configuration Overview
  General Configuration Properties
  Data Integration Service Connection Properties
  Setting Up a Reusable Data Integration Service Configuration

Chapter 9: Monitoring Live Data Map
  Monitoring Live Data Map Overview
  Task Status
  Task Distribution
  Monitoring by Resource
  Monitoring by Task
  Applying Filters to Monitor Tasks

Index

Preface

The Informatica Live Data Map Administrator Guide is written for administrators who manage and monitor Live Data Map.

Informatica Resources

Informatica Network

Informatica Network hosts Informatica Global Customer Support, the Informatica Knowledge Base, and other product resources. As a member, you can:
- Access all of your Informatica resources in one place.
- Search the Knowledge Base for product resources, including documentation, FAQs, and best practices.
- View product availability information.
- Review your support cases.
- Find your local Informatica User Group Network and collaborate with your peers.

Informatica Knowledge Base

Use the Informatica Knowledge Base to search Informatica Network for product resources such as documentation, how-to articles, best practices, and PAMs. If you have questions, comments, or ideas about the Knowledge Base, contact the Informatica Knowledge Base team at KB_Feedback@informatica.com.

Informatica Documentation

To get the latest documentation for your product, browse the Informatica Knowledge Base. If you have questions, comments, or ideas about this documentation, contact the Informatica Documentation team.

Informatica Product Availability Matrixes

Product Availability Matrixes (PAMs) indicate the versions of operating systems, databases, and other types of data sources and targets that a product release supports. If you are an Informatica Network member, you can access PAMs on Informatica Network.

Informatica Velocity

Informatica Velocity is a collection of tips and best practices developed by Informatica Professional Services. Developed from the real-world experience of hundreds of data management projects, Informatica Velocity represents the collective knowledge of our consultants who have worked with organizations from around the world to plan, develop, deploy, and maintain successful data management solutions. If you are an Informatica Network member, you can access Informatica Velocity resources on Informatica Network. If you have questions, comments, or ideas about Informatica Velocity, contact Informatica Professional Services at ips@informatica.com.

Informatica Marketplace

The Informatica Marketplace is a forum where you can find solutions that augment, extend, or enhance your Informatica implementations. By leveraging any of the hundreds of solutions from Informatica developers and partners, you can improve your productivity and speed up time to implementation on your projects.

Informatica Global Customer Support

You can contact a Global Support Center by telephone or through Online Support on Informatica Network. To find your local Informatica Global Customer Support telephone number, visit the Informatica website. If you are an Informatica Network member, you can use Online Support on Informatica Network.

Chapter 1: Introduction to Live Data Map Administration

This chapter includes the following topics:
- Live Data Map Overview
- Live Data Map Architecture
- Live Data Map Administration Overview
- Live Data Map Administration Process
- Accessing Live Data Map Administrator

Live Data Map Overview

Live Data Map brings together all data assets in an enterprise and presents a comprehensive view of the data assets and data asset relationships. A data asset is a type of data object, such as a physical data source, HDFS, or a big-data repository. The data assets in the enterprise might exist in relational databases, purpose-built applications, reporting tools, HDFS, and other big-data repositories.

Live Data Map captures the physical and operational metadata for a large number of data assets that you use to determine the effectiveness of enterprise data. Metadata is data about data. Metadata contains details about the structure of data sources. Metadata also includes information such as data patterns, data types, relationships between columns, and relationships between multiple data sources.

Live Data Map gathers information related to metadata across the enterprise. The metadata includes column data statistics, data domains, data object relationships, and data lineage information. A comprehensive view of enterprise metadata can help you make critical decisions on data integration, data quality, and data governance in the enterprise.

Live Data Map addresses the following key questions related to metadata in the enterprise:
- What content does a data asset contain?
- What does the content in a data asset mean?
- Who is responsible for a specific data asset in the enterprise?
- What is the source of the data in a data asset?
- What sensitive data does the enterprise have? Where is the sensitive data located?

Live Data Map Architecture

The Live Data Map architecture consists of applications, services, and databases. The applications layer consists of client applications, such as Enterprise Information Catalog. The services layer has application services, such as the Catalog Service, Data Integration Service, and Model Repository Service. Live Data Map requires the Catalog Service to extract metadata from data sources and manage the administrative tasks. The databases layer consists of the Model repository and an internal or external Hadoop cluster for metadata storage and analysis. Data sources and metadata sources include source data repositories, such as Oracle, Microsoft SQL Server, the PowerCenter repository, and SAP Business Objects.

The following image shows the architecture components of Live Data Map.

The following table describes the architecture components:

External Application
  An application, such as Enterprise Information Catalog, that you use to discover, explore, and relate different types of metadata from disparate sources in the enterprise.

Scanner Framework
  A framework that runs scanners and manages a registry of available scanners. A scanner is a pluggable component of Live Data Map that extracts specific metadata from external data sources.

Model Repository Service
  An application service that manages the Model repository.

Content Management Service
  An application service that manages reference data. It provides reference data information to the Data Integration Service and Informatica Developer. You can use Informatica Developer to import data domains into the Model repository.

Data Integration Service
  An application service that performs data integration tasks for Live Data Map and external applications.

Catalog Service
  An application service that runs Live Data Map and manages connections between service components and external applications.

Model repository
  A relational database that stores the resource configuration and data domain information.

Internal Hadoop Cluster
  An HDFS-based cluster based on HortonWorks that stores large amounts of metadata.

External Hadoop Cluster
  An HDFS-based cluster based on Cloudera or HortonWorks that stores large amounts of metadata.

Metadata Persistence Store
  A staging database that stores extracted metadata for further analysis.

Search Index
  Apache Solr-based search index information. The search index is based on the Model repository assets and assets in the metadata persistence store. Live Data Map uses the indexed information to display search results based on the appropriate asset metadata and relationships.

Graph Database
  An Apache HBase distributed database that uses graph structures to represent and store large amounts of metadata.

Data sources and metadata sources
  The source databases or metadata sources that Live Data Map scans to extract relevant metadata for further use.

Live Data Map Administration Overview

Live Data Map Administrator is the administration tool that you can use to manage and monitor resources, schedules, attributes, and connections.

You can use Live Data Map Administrator to perform the following tasks:
- Resource management. Create, edit, and remove resources.
- Schedule management. Create, edit, and remove schedules.
- Attribute management. View system-defined attributes for metadata object types. Create custom attributes and assign them to metadata object types, such as tables, views, and columns.
- Connection management. View automatically assigned connections and schemas. Assign schemas and connections to resources. Unassign user-assigned connections.
- Profile configuration management. Create and edit reusable profile-definition settings.
- Resource monitoring. Monitor resources and tasks.
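The way metadata moves through the architecture components described earlier in this chapter can be sketched in a few lines of code. This is a hypothetical illustration only: the function names, record fields, and in-memory stores below stand in for the scanner framework, metadata persistence store, search index, and graph database, and do not reflect Live Data Map's actual internal APIs.

```python
# Hypothetical sketch of the metadata flow: a scanner extracts asset
# records, the persistence store stages them, and the search index and
# graph database are then derived from the staged data.

def scan(source_assets):
    """Stand-in for a scanner: extract (name, type, parent) records."""
    return [dict(a) for a in source_assets]  # staged copies, not live objects

def build_search_index(staged):
    """Stand-in for the Solr-based index: asset name -> asset record."""
    return {a["name"]: a for a in staged}

def build_graph(staged):
    """Stand-in for the HBase graph store: parent -> [children] edges."""
    edges = {}
    for a in staged:
        if a.get("parent"):
            edges.setdefault(a["parent"], []).append(a["name"])
    return edges

source = [
    {"name": "SALES", "type": "schema", "parent": None},
    {"name": "ORDERS", "type": "table", "parent": "SALES"},
    {"name": "ORDER_ID", "type": "column", "parent": "ORDERS"},
]
staged = scan(source)               # metadata persistence store
index = build_search_index(staged)  # search index
graph = build_graph(staged)         # graph database
```

The point of the separation is that the index and the graph are both derived views over the same staged metadata, which matches the description above of the search index being built from persisted assets.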

Live Data Map Administration Process

The administration tasks include configuring resources, assigning schedules, and assigning custom attributes. You also need to monitor the tasks that extract metadata using the resources.

You can perform the following tasks as part of the administration process:
1. Create resources for each resource type based on the type of sources that you need to extract metadata from.
2. Choose whether you want to extract the source metadata, profiling metadata, or both.
3. Choose whether you want to run the resources one time or multiple times based on a common or custom schedule.
4. Optionally, assign a common schedule or custom schedule to the resources.
5. Monitor the tasks that extract metadata from different sources.
6. Troubleshoot tasks that do not perform as expected.

Accessing Live Data Map Administrator

Use Live Data Map Administrator to consolidate the administrative tasks for resources, attributes, and schedules. You launch Live Data Map Administrator from Informatica Administrator. You must know the host name of the gateway node and the Informatica Administrator port number to log in to Informatica Administrator.

Perform the following steps before you log in:
1. Launch Informatica Administrator using the gateway node and the Informatica Administrator port number in the Informatica Administrator URL.
2. In Informatica Administrator, configure an Informatica domain, user accounts, database connections, and services if you have not created them as part of the installation. The services include the Data Integration Service, Model Repository Service, Content Management Service, Informatica Cluster Service, and Catalog Service.
3. On the Services and Nodes tab of Informatica Administrator, select the Catalog Service, and then click the service URL to launch Live Data Map Administrator.
4. Use your login credentials to log in to Live Data Map Administrator.
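The administration process described earlier on this page pairs each resource with extraction options and an optional schedule. The sketch below models that pairing. The class and field names are hypothetical, chosen only to mirror the steps of the process; they are not Live Data Map's actual object model.

```python
# Hypothetical model of the administration process: each resource declares
# what to extract (step 2) and, optionally, a schedule (steps 3-4).
# Names are illustrative only, not Live Data Map's actual object model.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Resource:
    name: str
    resource_type: str              # e.g. "Oracle", "Hive", "PowerCenter"
    extract_source: bool = True     # step 2: source metadata
    extract_profile: bool = False   # step 2: profiling metadata
    schedule: Optional[str] = None  # step 4: common or custom schedule

def tasks_for(resource):
    """Step 5: the scan tasks that a single run of this resource produces."""
    tasks = []
    if resource.extract_source:
        tasks.append(("source-metadata", resource.name))
    if resource.extract_profile:
        tasks.append(("profiling-metadata", resource.name))
    return tasks

hr = Resource("HR_Oracle", "Oracle", schedule="nightly")
dw = Resource("Warehouse_Hive", "Hive", extract_profile=True)
```

Monitoring (steps 5 and 6) then amounts to tracking the status of each task that `tasks_for` would emit for every scheduled run.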
Prerequisites

The prerequisites to launch Live Data Map Administrator include an Informatica domain, domain connectivity information, and an administrator user account.

Verify the following prerequisites before you log in to Live Data Map Administrator:
1. The Informatica domain is running.
2. The Informatica domain has the Data Integration Service, Model Repository Service, and Catalog Service enabled.
3. You have the domain connectivity information and an administrator user account in Informatica Administrator.

Log In To Live Data Map Administrator

Use either Microsoft Internet Explorer or Google Chrome to log in to Live Data Map Administrator.
1. Start Microsoft Internet Explorer or Google Chrome.
2. In the Address field, enter the URL for the Live Data Map Administrator login page. The host is the gateway node host name. The port is the Informatica Administrator port number.
3. In the Live Data Map Administrator login page, enter the user name and password.
4. Verify that the default domain option Native is selected. You can also select an LDAP domain. The Domain field appears when the Informatica domain contains an LDAP security domain.
5. Click Log In.

Changing the Password

Use the Administrator menu to change the password.
1. In the Live Data Map Administrator tool header area, click Administrator > Change Password. The Change Password page appears.
2. Enter the current password in the Password field and the new password in the New Password and Confirm New Password fields.
3. Click Update.

Chapter 2: Live Data Map Concepts

This chapter includes the following topics:
- Live Data Map Concepts Overview
- Catalog
- Resource Type
- Resource
- Scanner
- Schedule
- Business Example

Live Data Map Concepts Overview

Live Data Map helps you analyze and understand large volumes of metadata in the enterprise. You can extract physical and operational metadata for a large number of objects, organize the metadata based on business concepts, and view the data lineage and relationship information for each object.

The key concepts in Live Data Map include the catalog, resource, resource type, scanner, and schedule. The catalog stores all the metadata extracted from sources. A resource type represents a kind of metadata source system, such as Oracle, SQL Server, or PowerCenter. A resource is an instance of a resource type. Scanners fetch the metadata and save it in the catalog. Schedules determine the intervals at which scanners extract metadata from the source systems and save the metadata in the catalog.

Catalog

The catalog represents an indexed inventory of all the data assets in the enterprise that you configure in Live Data Map Administrator. Live Data Map organizes all the enterprise metadata in the catalog and enables the users of external applications to discover and understand the data.

The catalog stores all the metadata extracted from external data sources. You can find metadata and statistical information, such as profile results, data domains, and data asset relationships, in the catalog. Data domains represent inferred business semantics based on source column data. Examples include Social Security number, phone number, and credit card number.

Resource Type

A resource type is a type of external data source or metadata repository from which scanners extract metadata. Examples include relational sources, business intelligence sources, and PowerCenter sources.

Resource

A resource is a catalog object that represents an external data source or metadata repository from which scanners extract metadata. A resource is an instance of a specific resource type. The basic metadata operations, such as extraction and storage of metadata, are performed at the resource level. A resource can be of a resource type such as relational database, Business Glossary classification, or business intelligence source. A resource might have an associated schedule. Each resource can extract both source metadata and profile metadata from the external data sources.

Scanner

A scanner is a pluggable component of Live Data Map that extracts specific metadata, such as source metadata or profile metadata, from external data sources and stores the metadata in the catalog. A scanner typically maps to a single resource type. However, there can be more than one scanner for a resource type. Examples are the profiling scanner and the lineage analyzer. A scanner performs a scan job on the metadata sources to fetch metadata into the catalog. When scanners for newer resource types are ready, you can plug those scanners into Live Data Map without having to upgrade Live Data Map.

Schedule

A schedule determines when scanners extract metadata from sources. You can have recurring daily, weekly, and monthly schedules to extract metadata at regular intervals.

You can create the following types of schedules:

Global schedule
  A reusable schedule that you can attach to more than one resource.

Custom schedule
  A customized schedule assigned to a single resource.
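The pluggable-scanner idea described above — a registry of scanners keyed by resource type, with possibly more than one scanner per type — can be sketched as a simple registry pattern. This is a hypothetical illustration of the concept, not Live Data Map's actual scanner framework; all names below are invented.

```python
# Hypothetical sketch of a scanner registry: scanners are registered per
# resource type, so a new scanner can be plugged in without changing the
# framework itself. Names are illustrative only.
SCANNER_REGISTRY = {}

def register_scanner(resource_type):
    """Decorator that plugs a scanner into the registry."""
    def wrap(fn):
        SCANNER_REGISTRY.setdefault(resource_type, []).append(fn)
        return fn
    return wrap

@register_scanner("Oracle")
def oracle_source_scanner(resource_name):
    return f"source metadata for {resource_name}"

@register_scanner("Oracle")
def oracle_profiling_scanner(resource_name):
    # More than one scanner can serve the same resource type.
    return f"profiling metadata for {resource_name}"

def run_scan(resource_type, resource_name):
    """Run every scanner registered for a resource type."""
    return [scanner(resource_name) for scanner in SCANNER_REGISTRY[resource_type]]
```

Registering a scanner for a new resource type is then a matter of adding one decorated function, which mirrors the statement above that new scanners can be plugged in without upgrading the framework.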

16 Business Example You are a catalog administrator in a multinational retail organization. The data analysts in your department need to view the metadata from different database schemas and database tables in multiple sources to perform an advanced data analysis. You also need to make sure that the data analysts understand and trust the data that they use. The organization might plan regular security audits to find sensitive data in the data sources and mask or protect them as required. The retail organization that you work for has the following configured systems: Human resources management system set up on an Oracle database. Order management system set up on the same Oracle database. Data warehouse hosted on a Hadoop repository. The data warehouse has integrated information from multiple data sources. PowerCenter to perform data integration tasks across databases and schemas. Reporting system set up on an SAP Business Object source. The administrator in the organization can perform the following tasks in Live Data Map Administrator to effectively meet the data governance needs in this example: Use Live Data Map Administrator to create an Oracle resource for the human resource management system and another Oracle resource for the order management system. You can configure the source metadata settings to extract the metadata into the catalog. You might not need to configure the profiling metadata settings for these resources. The resources provide the required database table and source column objects into catalog for analysis. Create a Hive resource for the Hadoop warehouse. The Hive resource fetches tables and columns to the catalog. In addition to the source metadata extraction, you can configure the profiling metadata settings so that you have information related to data quality for further analysis. Create a PowerCenter resource that maps to the data integration requirements. 
The resource configuration provides the links between the Oracle data objects and Hive objects.
- Create an SAP Business Objects resource and configure the resource to extract reporting metadata. The resource provides reporting metadata based on the links between the Business Objects and the Oracle and Hive objects.
- Set up a recurring schedule for each resource so that the scanners extract source and profiling metadata from the source systems at regular intervals.
- Periodically monitor the tasks and jobs in Live Data Map Administrator that extract metadata. Monitor the tasks and jobs to get a functional view of Live Data Map. Monitoring also helps you to analyze and estimate the type of content the scanners fetch into the catalog.

Chapter 3: Using Live Data Map Administrator

This chapter includes the following topics:

- Live Data Map Administrator Overview, 17
- Start Workspace, 19
- Resource Workspace, 19
- Monitoring Workspace, 19
- Library Workspace, 20

Live Data Map Administrator Overview

Live Data Map Administrator is the administration tool that you can use to perform administrative tasks, such as the management of resources, schedules, and attributes.

Use Live Data Map Administrator to complete the following types of tasks:

Manage Resources. Create, configure, edit, and remove resources. A resource is an object that represents an external data source or metadata repository from which scanners extract metadata. Live Data Map performs all basic operations, such as extracting metadata, storing metadata in the Hadoop cluster, and managing metadata, at the resource level.

Manage Schedules. Create schedules that you can attach to resources. You can create global, recurring schedules that you can assign to multiple resources.

Manage Attributes. Assign predefined system attributes to specific metadata object types, such as table, column, report, and resource. You can create custom attributes and assign them to metadata object types based on the business requirements. Attributes assigned to resources can help Enterprise Information Catalog users to quickly find the data assets and related information. You can configure the system and custom attributes so that Enterprise Information Catalog displays the attributes as search filters.

Manage PowerCenter, SAP Business Objects, and Big Data Object Connections. You can view the details of connections that are automatically assigned to each resource. You can also view assigned and unassigned PowerCenter, SAP Business Objects, Cloudera, and Hive object

connections that are user-defined and the schemas for each connection. You can assign specific schemas to the appropriate resources. Unassign the connections and schemas as required.

The following figure shows the Live Data Map Administrator interface:

Live Data Map Administrator has the following tabs:

- Start. View the monitoring statistics for resources and tasks. You can view the task distribution by the task status and running time. You can also view the resource distribution and predictive job load for the current week.
- Resource. Create resources. You can also open the recently configured objects.
- Monitoring. View monitoring statistics by the task type and task status. Apply filters to shortlist tasks and resources that meet specific conditions.
- Library. View the list of resources and schedules. Open a resource or schedule for further analysis.

Live Data Map Administrator has the following header items:

- New. Create resources and schedules.
- Open. View the Library workspace.
- Manage. Manage system and custom attributes, connection assignments, and reusable configuration settings.
- Administrator. Change the password for Live Data Map Administrator, and log out of Live Data Map Administrator.
- Help. Access specific help topics for the current workspace or page, launch the online help, and view the Informatica version.

Start Workspace

The Start workspace displays visual charts that represent monitoring statistics of tasks, resources, and system load. Click the appropriate sections of the interactive charts to view the details.

You can view the following visual monitoring statistics on the Start workspace:

- Total number of assets in the catalog.
- Total number of resources and the number of resources by resource type. These numbers can help you monitor the resource load in Live Data Map Administrator.
- Total number of configured resources that you have not yet used to extract metadata. You can also view the number of such resources by resource type.
- Total number of tasks and their statuses. The task statuses include Running, Failed, Queued, and Paused. You can focus on any unexpected job discrepancy based on the task status details. You can view all the tasks in the last 24 hours, last week, or last month.
- Number of failed tasks and running tasks that need your attention. Click the task link to view the details in the Monitoring workspace.
- Number of running tasks based on the task run time. For example, you can view the details of the tasks that are running for more than a day or from 4 through 12 hours.
- Graphical representation of the predictive job load in terms of the number of jobs at different time slots in the current week or day.
- Total number of jobs scheduled for the day.
- Total number of unassigned connections.

Resource Workspace

You can create resources from the Resource workspace. You can also launch the recently opened data assets. To create a resource, click the link under the New Assets panel. To open a recently opened data asset, click the asset under the Recently Opened panel.

Monitoring Workspace

You can monitor the status of Live Data Map tasks on the Monitoring workspace. The workspace displays a visual representation of tasks and their distribution. You can view the status of tasks, such as Running, Failed, and Paused.
You can view the following details of tasks in the bottom pane of the Monitoring workspace:

- Type. Indicates the type of task, such as metadata load, profile executor, and profile result fetcher.
- Resource name. Displays the name of the resource.
- Schedule. Displays the name of the schedule associated with the resource.
- Triggered by. Indicates whether the task was triggered manually or as part of a configured schedule.

- Status. Displays the task status, such as Running, Failed, Complete, and Paused.
- Start time of the task.
- End time of the task.
- Run time of the task.
- Next schedule. Displays the date and time when the scanner next runs the job.
- Log URL. Displays a link that you can click to open the log file for completed tasks.

You can refresh the Monitoring workspace to view the latest tasks and task statuses. You can also apply filters on the task list based on conditions, such as resource type, job creation time, and job fail history.

Library Workspace

Use the Library workspace to browse, view, search, and apply filters on a collection of resources and schedules that you have the user privilege to access. Click Open in the header area to launch the Library workspace.

Specify the search criteria to find a specific resource or group of resources. You can sort the resources by resource name or resource type. You can also group the resources based on specific requirements.

You can open a resource or schedule from the assets list. When you click a resource, the resource opens in the Resource workspace. If you click a schedule, it opens in the Schedules workspace.

You can filter the resource list based on multiple conditions, such as Resource Name, Created by, and Resource Type. You can filter a schedule based on conditions, such as the schedule name and the time you created the schedule.

Chapter 4: Managing Resources

This chapter includes the following topics:

- Managing Resources Overview, 21
- Resources and Scanners, 22
- Resources and Schedules, 22
- Resources and Attributes, 22
- Resource Type, 22
- Creating a Resource, 59
- Editing a Resource, 60
- Running a Scan on a Resource, 60
- Viewing a Resource, 60

Managing Resources Overview

A resource is a repository object that represents an external data source or metadata repository from which scanners extract metadata. The basic metadata operations, such as extraction, storage, and management of metadata, are performed at the resource level. You can create, edit, and remove a resource.

A resource has a resource type that determines the type of data source from which scanners extract metadata. You can also choose the specific types of metadata that you want scanners to extract from the data sources. For example, you can choose to extract basic source metadata or both basic source metadata and profile metadata.

You can view classifications assigned to resources. The classifications help you to quickly search and filter specific resources in Enterprise Information Catalog.

A resource has the following characteristics:

- A resource has a resource type that identifies the type of source system.
- A resource has a global, unique identity.
- A resource can have a one-time schedule or recurring schedule. You can attach a schedule to one resource or attach a reusable schedule to multiple resources.

Resources and Scanners

Scanners attached to resources identify the resource type, including the name of the resource type, display name, description, and supported versions. You can configure multiple scanners for different types of metadata. You can configure the scanner properties for a resource when you create the resource.

Each scanner has a unique ID that maps to the resource type. A resource type can have multiple scanners. For example, one scanner might extract the metadata for an Oracle data source directly from the source. Another scanner might derive the metadata about the Oracle data source from other metadata repositories, such as a PowerCenter repository or Model repository.

Resources and Schedules

Resource schedules determine the frequency with which scanners extract metadata from the metadata sources. You can create schedules for specific resources or create reusable schedules that you want to assign to multiple resources. You can create schedules that have an end date or schedules that run indefinitely. In addition to a start date, you can also set up the start time based on when you want scanners to start extracting metadata.

Resources and Attributes

Resources can have attributes that represent certain properties of object types, such as Alias Column, Category, DataSet, and Table, based on the data types of the attributes. You can use the resource attributes in Enterprise Information Catalog to search and find relevant information among a large number of data assets and data relationships. A resource can also have associated custom attributes.

System attributes are predefined attributes that represent specific properties of object types, such as stored procedure, table, trigger, XML query, schema, and resource. For example, Comment is a system attribute assigned to a Hive table or Hive view. You can create custom attributes based on your business requirements that you want to assign to different resources.
You can choose to create custom attributes based on the scanner types that are in use.

Resource Type

A resource type represents the type of source system from which scanners extract metadata. A resource type identifies the required and optional metadata types that you need to configure for each resource. Examples of resource types are Business Objects, Oracle, PowerCenter, and Teradata.

Each resource type has different properties that you need to configure when you create a resource. The properties include the connection properties and additional properties, such as resource owner and resource classification.

You can create the following types of resources:

- Informatica Cloud

- Amazon Redshift
- Hadoop File System (HDFS)
- MicroStrategy
- Amazon S3
- Custom Lineage
- Business Glossary Classification
- Tableau Server
- IBM Cognos
- Cloudera Navigator
- Hive
- Informatica Platform
- PowerCenter
- IBM DB2
- IBM DB2 for z/OS
- IBM Netezza
- JDBC
- Microsoft SQL Server
- Oracle
- Sybase
- Teradata
- Salesforce
- SAP Business Objects

To run a scan job on resource types, such as Oracle, DB2, and SQL Server, use the native resource types available in Live Data Map. Informatica recommends that you do not configure the JDBC resource type to fetch data from the supported native metadata sources. However, you can use the JDBC resource type to fetch metadata from MySQL and i5/OS sources.

Informatica Cloud Service Resource Connection Properties

Informatica Cloud is an on-demand subscription service that provides access to applications, databases, platforms, and flat files hosted on premises or on a cloud. Informatica Cloud runs at a hosting facility. Before you add an Informatica Cloud Service resource, perform the steps listed in the Prerequisites section.

Prerequisites

1. Create an organization for your company on the Informatica Cloud website, define the organization hierarchy, and configure the organization properties. You must perform this step before you can use Informatica Cloud. Note: To create an organization, you must have a REST API license. If you do not have a REST API license, contact Informatica Global Customer Support.
2. Create a subscription account on Informatica Cloud.
3. Verify that the machine where you install the Informatica Cloud Secure Agent meets the minimum system requirements. The Informatica Cloud Secure Agent is a lightweight program that runs all tasks and enables secure communication across the firewall between your organization and Informatica Cloud.

4. Download, install, and register the Informatica Cloud Secure Agent using the Informatica Cloud user name and password.
5. Create the following tasks on Informatica Cloud:
   a. Mapping tasks. A mapping defines reusable data flow logic that you can use in Mapping Configuration tasks. Use a mapping to define data flow logic that is not available in Data Synchronization tasks, such as specific ordering of logic or joining sources from different systems. When you configure a mapping, you describe the flow of data from source to target. You can add transformations to transform data, such as an Expression transformation for row-level calculations or a Filter transformation to remove data from the data flow.
   b. PowerCenter tasks. The PowerCenter task allows you to import PowerCenter workflows into Informatica Cloud and run them as Informatica Cloud tasks.
   c. Data synchronization tasks. The Data Synchronization task allows you to synchronize data between a source and target.

Note: An Informatica Cloud Service resource imports all the tasks to Live Data Map the first time metadata is extracted from the resource. During the subsequent extract operations, the resource imports only the updated tasks to Live Data Map.

For more information about the prerequisites, see the Informatica Cloud User Guide and the Informatica Cloud Administrator Guide.

Connection Properties

The General tab includes the following properties:

- Cloud URL. The URL to access the Informatica Cloud Service.
- Username. The user name to connect to the Informatica Cloud Service.
- Password. The password associated with the user name.
- Auto Assign Connections. Select this option to specify that the connection must be assigned automatically.

The Metadata Load Settings tab includes the following properties:

- Memory. Specifies the memory required to run the scanner job.
Select one of the following values based on the data set size imported: Low, Medium, or High. See the Tuning Live Data Map Performance How-to-Library article for more information about memory values.

Custom Lineage Resource Properties

You can create a custom lineage resource to view the data lineage information for the assets in your organization. A custom lineage resource uses CSV files provided by you that include lineage data for your

enterprise. You can use this option if you do not have an ETL tool supported by Live Data Map.

The following tables list the properties that you must configure to add a custom lineage resource:

The General tab includes the following properties:

- File. The CSV file or the .zip file that includes the CSV files with the lineage data. Click Choose to select the required CSV file or .zip file that you want to upload. Ensure that the CSV files in the .zip file are not stored in a directory within the .zip file. If you want to select multiple CSV files, you must include the required CSV files in a .zip file and then select the .zip file for upload. Note: Make sure that the CSV file includes the following parameters in the header: From Connection, To Connection, From Object, To Object.

The Metadata Load Settings tab includes the following properties:

- Auto Assign Connections. Specifies to automatically assign the connection.
- Memory. Specifies the memory required to run the scanner job. Select one of the following values based on the data set size imported: Low, Medium, or High. See the Tuning Live Data Map Performance How-to-Library article for more information about memory values.

Amazon S3 Resource Properties

The following tables list the properties that you must configure to add an Amazon S3 resource:

The General tab includes the following properties:

- Amazon Web Services Bucket URL. Amazon Web Services URL to access a bucket.
- Amazon Web Services Access Key ID. Amazon Web Services access key ID to sign requests that you send to Amazon Web Services.
- Amazon Web Services Secret Access Key. Amazon Web Services secret access key to sign requests that you send to Amazon Web Services.
- Amazon Web Services Bucket Name. Amazon Web Services bucket name that Live Data Map needs to scan.
- Source Directory. The source directory from which metadata must be extracted.
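Returning briefly to the custom lineage resource described earlier: a minimal lineage CSV with the four required header columns can be produced as follows. This is an illustrative sketch only; the file name and the connection and object values are hypothetical examples, not values that Live Data Map requires.

```python
import csv

# The four header columns required by the custom lineage resource.
FIELDS = ["From Connection", "To Connection", "From Object", "To Object"]

# Hypothetical lineage rows: one Oracle table feeding one Hive table.
rows = [
    {"From Connection": "oracle_hr", "To Connection": "hive_dw",
     "From Object": "HR.EMPLOYEES", "To Object": "dw.employees"},
]

with open("custom_lineage.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=FIELDS)
    writer.writeheader()   # first line: From Connection,To Connection,From Object,To Object
    writer.writerows(rows)
```

To supply several such files at once, place them at the top level of a single .zip file, as noted in the File property above.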

The Metadata Load Settings tab includes the following properties:

- File Types. Select any or all of the following file types from which you want to extract metadata: CSV, XML, JSON.
- First Row as Column Header for CSV. Select this option to specify the first row as the column header for the CSV file.
- First Level Directory. Use this option to specify a directory or a list of directories under the source directory. If you leave this option blank, Live Data Map imports all the files from the specified source directory. To specify a directory or a list of directories, perform the following steps: 1. Click Select... The Select First Level Directory dialog box appears. 2. Select the required directories using one of the following options: Select from list, to select the required directories from a list of directories, or Select using regex, to provide an SQL regular expression to select schemas that match the expression. Note: If you are selecting multiple directories, you must separate the directories using a semicolon (;).
- Include Subdirectory. Select this option to import all the files in the subdirectories under the source directory.
- Memory. Specifies the memory required to run the scanner job. Select one of the following values based on the data set size imported: Low, Medium, or High. See the Tuning Live Data Map Performance How-to-Library article for more information about memory values.

HDFS Resource Connection Properties

The following tables list the properties that you must configure to add a Hadoop File System (HDFS) resource. Adding an HDFS resource allows you to import metadata from CSV, XML, and JSON files.

The General tab includes the following properties:

- Name Node URI 1. URI to the active HDFS NameNode. The active HDFS NameNode manages all the client operations in the cluster.
- Name Node URI 2. URI to the secondary HDFS NameNode.
The secondary HDFS NameNode stores modifications to HDFS as a log file appended to a native file system file.
- HDFS Service Name. The HDFS service name.
- User Name/User Principal. User name to connect to HDFS. Specify the Kerberos Principal if the cluster is enabled for Kerberos.

- Source Directory. The source location from which metadata must be extracted.
- Include Subdirectories. Specifies that metadata must be extracted from files in the subdirectories under the specified source directory.
- Kerberos Cluster. Select Yes if the cluster is enabled for Kerberos. If the cluster is enabled for Kerberos, provide the following details: HDFS Service Principal, the service principal name of the HDFS service, and Keytab File, the path to the Kerberos Principal keytab file. Make sure that the keytab file is present at the specified location on the Informatica domain host and cluster hosts of the Catalog Service.

The Metadata Load Settings tab includes the following properties:

- File Types. Select any or all of the following file types from which you want to extract metadata: CSV, XML, JSON.
- First Level Directory. Specifies that all the directories must be selected. If you want specific directories to be selected, use the Select Directory option. This option is disabled if you selected the Include Subdirectories option on the General tab.
- Include Subdirectory. Type the required directories in the text box or click Select... to choose the required directories. This option is disabled if you selected the Include Subdirectories option on the General tab or the Select all Directories option listed above.
- Memory. Specifies the memory required to run the scanner job. Select one of the following values based on the data set size imported: Low, Medium, or High. See the Tuning Live Data Map Performance How-to-Library article for more information about memory values.

The following table describes the Domain Connection and Profile Configuration settings on the Metadata Load Settings tab:

- Enable Data Profiling. Select this option to enable data profiling for the metadata extracted from the source.
- Configuration. Specify the configuration for the Data Integration Service: - Custom.
Use custom configuration when you want to configure the Data Integration Service options manually. - Global. Use global configuration when you want to use the existing Data Integration Service options created by the administrator. You can select from the existing Data Integration Service options listed in the drop-down list.
- Domain Name. The Informatica domain name.

- Data Integration Service. Name of the Data Integration Service.
- User Name. User name to log in to the Informatica domain.
- Password. Password to log in to the Informatica domain.
- Security Domain. Name of the security domain.
- Host. Host name for the Data Integration Service.
- Port. Port number for the Data Integration Service.
- Profiling Run Options. Choose whether Live Data Map performs column profiling, domain discovery, or both column profiling and domain discovery. If you select domain discovery or both column profiling and domain discovery, provide the following details:
  - Data Domains: click Select... to select the required data domains from the Select Data Domains dialog box. Live Data Map runs profiling on the selected data domains.
  - Data Domain Match Criteria: select one of the following options to specify the criteria for matching data domains during profiling. Percentage: specify the minimum conformance percentage of data to be eligible for a data domain match. Conformance percentage is the ratio of the number of matching rows divided by the total number of rows. Specify the value in the Minimum Conformance Percentage box. Rows: specify the minimum number of rows needed to be eligible for a data domain match. Specify the value in the Minimum Row Count for Data Domain Match box.
  - Exclude Null Values from Data Domain Discovery: select this option to specify that null values must be excluded when performing data domain discovery.
  - Column Name Match: select this option to specify that profiling must be run on column titles.
- Priority. Select High or Low to specify the priority for profiling. Live Data Map runs profiling on priority for resources where the profiling priority is set to High.
- Sampling Option. Sampling options determine the number of rows that Live Data Map chooses to run a profile.
You can configure sampling options when you define a profile or when you run a profile. Select First N Rows to specify that profiling must be run on the first N rows. Specify the number of rows in the Number of First N Sampling Rows box. Note: This option is valid for CSV files and is not valid for XML or JSON files. The rest of the sampling options listed are not valid for CSV, XML, or JSON files.
- Exclude Views. Select this option to specify that views must be excluded from profiling.
- Incremental Profiling. Specifies that profiling must be run only for the changes made to the source object. If you do not select this option, Live Data Map runs profiling for the whole source every time.
- Source Connection Name. Event Date Records (EDR) name for the source connection. Click Select... to choose the EDR name for the source connection from the Select Source Connection dialog box.
- Run On. Specify where the HDFS resource must run by selecting one of the following options. Hadoop: the HDFS resource runs on Hadoop using Informatica Blaze. If you select this option, you must specify the Hadoop Connection Name. Click Select... and select the Hadoop connection name from the Select Hadoop Connection Name dialog box. Native: the HDFS resource runs on the Hive engine.
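The data domain match criteria above can be sketched as a small calculation. Only the conformance formula (matching rows divided by total rows) comes from this guide; the function names, parameter names, and default thresholds below are illustrative assumptions.

```python
# Sketch of the Data Domain Match Criteria described above.
def conformance_percentage(matching_rows, total_rows):
    """Ratio of matching rows to total rows, expressed as a percentage."""
    if total_rows == 0:
        return 0.0
    return 100.0 * matching_rows / total_rows

def is_data_domain_match(matching_rows, total_rows, criteria="Percentage",
                         min_conformance_percentage=80.0,
                         min_row_count=100):
    """Apply either match criterion: Percentage or Rows (thresholds assumed)."""
    if criteria == "Percentage":
        return conformance_percentage(matching_rows, total_rows) >= min_conformance_percentage
    # "Rows" criterion: a minimum number of matching rows is sufficient.
    return matching_rows >= min_row_count

print(is_data_domain_match(850, 1000))                    # True  (85% >= 80%)
print(is_data_domain_match(120, 10000, criteria="Rows"))  # True  (120 >= 100)
```

The second call shows why the Rows criterion can be useful for very large columns: 120 matching rows is only 1.2% conformance, yet still counts as a match when an absolute row count is the chosen criterion.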

Amazon Redshift Resource Connection Properties

The following tables list the properties that you must configure to add an Amazon Redshift resource:

The General tab includes the following properties:

- User. The user name used to access the database.
- Password. The password associated with the user name.
- Host. Host name or IP address of the Amazon Redshift service.
- Port. Amazon Redshift server port number. Default is
- Database. The name of the database instance.

The Metadata Load Settings tab includes the following properties:

- Import System Objects. Select this option to specify that the system objects must be imported.
- Schema. Specify the particular database schema to be imported. You can use any of the following methods to specify the schema: leave the text box empty to import all tables without a schema; specify an SQL pattern to import tables with schema names that match the pattern; or specify a schema name to import tables associated with the schema. Alternatively, you can select a specific schema by clicking Select...
- Memory. Specifies the memory required to run the scanner job. Select one of the following values based on the data set size imported: Low, Medium, or High. See the Tuning Live Data Map Performance How-to-Library article for more information about memory values.

MicroStrategy Resource Connection Properties

The following tables list the properties that you must configure to add a MicroStrategy resource:

The General tab includes the following properties:

- Agent URL. URL to the MITI agent that runs on a Microsoft Windows Server.
- Version. Select the version of MicroStrategy from the drop-down list. You can select the Auto detect option if you want Live Data Map to automatically detect the version of the MicroStrategy resource.

- Project Source. Name of the MicroStrategy project source to which you want to connect.
- Login User. The user name used to connect to the project source.
- Password. The password associated with the user name.
- Default Language. Specify the language to be used while importing metadata from the resource.
- Import Schema Only. Select this option to import the project schema without the reports and documents.
- Data Model Tables Design Level. Select one of the following options to specify the design for the imported tables. Physical: the imported tables appear in the physical view of the model. Logical and Physical: the imported tables appear in the logical and physical views of the model.
- Incremental Import. Select this option to import only the changes from the source. Clear this option to import the complete source every time.
- Project(s). Select the names of the projects to which you want to connect from the project source.
- Auto Assign Connections. Specifies to automatically assign the connection.

The Metadata Load Settings tab includes the following properties:

- Memory. Specifies the memory required to run the scanner job. Select one of the following values based on the data set size imported: Low, Medium, or High. See the Tuning Live Data Map Performance How-to-Library article for more information about memory values.

Business Glossary Classification Resource Type Properties

Business Glossary contains online glossaries of business terms and policies that define important concepts within an organization. Configure a Business Glossary Classification resource type to extract metadata from Business Glossary.

The following table describes the connection properties for the Business Glossary Classification resource type:

- Username. Name of the user account that connects to the Analyst tool.
- Password. Password for the user account that connects to the Analyst tool.

- Host. Name of the Analyst tool business glossary from which you want to extract metadata. Each resource can extract metadata from one business glossary.
- Port. Port number on which the Analyst tool runs.
- Namespace. Name of the security domain to which the Analyst tool user belongs. If the domain uses LDAP authentication or Kerberos authentication, enter the security domain name. Otherwise, enter Native.
- Enable Secure Communication. Enable secure communication from the Analyst tool to the Analyst Service.

The following table describes the Additional and Advanced properties for source metadata settings on the Metadata Load Settings tab:

- Glossary. Name of the business glossary resource that you want to import.
- Memory. Specify the memory value required to run a scanner job. Specify one of the following memory values: Low, Medium, or High. Note: For details about the memory values, see the Tuning Live Data Map Performance How-To Library article.

IBM Cognos Connection Properties

The following tables describe the IBM Cognos connection properties:

The General tab includes the following properties:

- Agent URL. URL to the MITI agent that runs on a Microsoft Windows Server.
- Version. Indicates the Cognos server version.
- Dispatcher URL. URL used by the framework manager to send requests to Cognos.
- Namespace. Defines a collection of user accounts from an authentication provider.
- Username. User name used to connect to the Cognos server.
- Password. Password for the user account to connect to the Cognos server.

- Add Dependent Objects. Use to import dependent objects to the selection. Selecting this option requires a complete scan of report dependencies on the Cognos server. You can select any of the following options for this property. None: only imports the selected Cognos objects. Packages referenced by selected reports: imports the reports and associated source packages. All: imports the source packages when a report is selected and imports the dependent reports when a source package is selected.
- Incremental Import. You can specify one of the following values for this property. True: imports only the changes in the source. False: imports the complete source every time.
- Folder Representation. Specifies how the folders from Cognos framework manager must be represented. You can select from the following options. Ignore: ignores the folders. Flat: represents the folders as diagrams, but does not retain the hierarchy. Hierarchical: represents folders as diagrams and retains the hierarchy.
- Transformer Import Configuration. The XML file that describes mappings between Cognos Content Manager data sources and PowerPlay Transformer models.
- Worker Threads. Number of worker threads required to retrieve metadata asynchronously.
- Auto Assign Connections. Specifies to automatically assign the connection.

The Metadata Load Settings tab includes the following properties:

- Content Browsing Mode. Specifies the content to be retrieved while searching the Cognos repository. You can select any of the following options. Packages Only: retrieves the packages and folders and does not retrieve the reports. Connections Only: retrieves the list of connections. All: retrieves the packages, folders, queries, and reports. Content: allows you to reduce the scope of import to a smaller set of objects than the whole set of objects on the server.
- Content. Specifies the hierarchy for the content objects.
- Memory. Specifies the memory required to run the scanner job.
Select one of the following values based on the data set size imported: - Low - Medium - High See the Tuning Live Data Map Performance How-to-Library article for more information about memory values. 32 Chapter 4: Managing Resources

Tableau Server Properties

The following tables describe the Tableau Server properties. The General tab includes the following properties:

- Server. The host name or the IP address where the Tableau server runs.
- Site. Specify the site if the Tableau server has multiple sites installed. The value is case sensitive.
- Username. The user name to connect to the Tableau server.
- Password. The password associated with the user name.
- Incremental Import. You can specify one of the following values for this property:
  - True: Imports only the changes in the source.
  - False: Imports the complete source every time.
- Worker Threads. Number of worker threads required to retrieve metadata asynchronously.
- Cache. Path to the folder with the Tableau repository cache.
- Auto Assign Connections. Specifies to automatically assign the connection.

The Metadata Load Settings tab includes the following properties:

- Group By. Specify to group workbooks in the following categories: My Workbooks, All Workbooks.
- Repository Objects. Imports the repository objects such as workbooks and data sources. For any workbooks, the dependent data sources are also imported.
- Memory. Specifies the memory required to run the scanner job. Select one of the following values based on the data set size imported: Low, Medium, or High. See the Tuning Live Data Map Performance How-To Library article for more information about memory values.

Cloudera Navigator Connection Properties

Configure the connection properties when you create or edit a Cloudera Navigator resource. The following table describes the connection properties:

- Navigator URL. URL of the Cloudera Navigator Server.
- User. Name of the user account that connects to Cloudera Navigator.
- Password. Password for the user account that connects to Cloudera Navigator.

The following table describes the Additional and Advanced properties for source metadata settings on the Metadata Load Settings tab:

- Hive Database. Name of the Hive database or a schema from which you want to import a table.
- Memory. Specify the memory value required to run a scanner job. Specify one of the following memory values: Low, Medium, or High. Note: For details about the memory values, see the Tuning Live Data Map Performance How-To Library article.

Hive Connection Properties

Configure the connection properties when you create or edit a Hive resource. The following table describes the connection properties:

- Hadoop Distribution. Select one of the following Hadoop distribution types for the Hive resource: Cloudera, Hortonworks, or MapR.
- URL. JDBC connection URL used to access the Hive server.
- User. The Hive user name.
- Password. The password for the Hive user name.
- Keytab file. Path to the keytab file if Hive uses Kerberos for authentication.
- User proxy. The proxy user name to be used if Hive uses Kerberos for authentication.
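The JDBC connection URL for a Hive resource typically follows the standard HiveServer2 format. The sketch below assembles such a URL; the host, port, database, and Kerberos principal are placeholder values for illustration, not values from this guide.

```python
# Sketch: assembling a HiveServer2 JDBC connection URL.
# All concrete values below are placeholders; substitute your
# environment's own host, port, database, and principal.

def hive_jdbc_url(host, port=10000, database="default", principal=None):
    """Build a HiveServer2 JDBC URL, optionally with a Kerberos principal."""
    url = f"jdbc:hive2://{host}:{port}/{database}"
    if principal:
        # With Kerberos authentication, the Hive service principal is
        # appended as a semicolon-separated URL parameter.
        url += f";principal={principal}"
    return url

print(hive_jdbc_url("hive-host.example.com"))
print(hive_jdbc_url("hive-host.example.com",
                    principal="hive/_HOST@EXAMPLE.COM"))
```

The second call shows the form commonly used when Hive is secured with Kerberos, which pairs with the Keytab file and Kerberos Configuration File properties described in this section.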

- Kerberos Configuration File. Specify the path to the Kerberos configuration file if you use Kerberos-based authentication for Hive.
- Enable Debug for Kerberos. Select this option to enable debugging options for Kerberos-based authentication.

The following table describes the Additional and Advanced properties for source metadata settings on the Metadata Load Settings tab:

- Schema. Click Select... to specify the Hive schemas that you want to import. You can use one of the following options from the Select Schema dialog box to import the schemas:
  - Select from List: Use this option to select the required schemas from a list of available schemas.
  - Select using regexp: Provide an SQL regular expression to select schemas that match the expression.
- Table. Specify the name of the Hive table that you want to import. If you leave this property blank, Live Data Map imports all the Hive tables.
- SerDe jars list. Specify the path to the Serializer/DeSerializer (SerDe) jar file list. You can specify multiple jar files by separating the jar file paths using a semicolon (;).
- Memory. Specify the memory value required to run a scanner job. Specify one of the following memory values: Low, Medium, or High. Note: For details about the memory values, see the Tuning Live Data Map Performance How-To Library article.

The following table describes the Domain Connection and Profile Configuration settings on the Metadata Load Settings tab:

- Enable Data Profiling. Select this option to enable data profiling for the metadata extracted from the source.
- Configuration. Specify the configuration for the Data Integration Service:
  - Custom. Use custom configuration when you want to configure the Data Integration Service options manually.
  - Global. Use global configuration when you want to use the existing Data Integration Service options created by the administrator. You can select from the existing Data Integration Service options listed in the drop-down list.
- Domain Name. Informatica domain name.
- Data Integration Service. Name of the Data Integration Service.
- Username. Username to log in to the Informatica domain.
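The Schema and SerDe jars list properties above accept a regular expression and a semicolon-separated path list, respectively. The following sketch illustrates how such values can be interpreted; the helper functions and sample names are illustrative assumptions, not Live Data Map internals.

```python
import re

# Illustrative only: interpreting a semicolon-separated SerDe jar list
# and a schema-selection regular expression. The parsing shown here is
# an assumption for illustration, not Live Data Map's own logic.

def split_jar_list(serde_jars):
    """Split a semicolon-separated jar path list into individual paths."""
    return [path for path in serde_jars.split(";") if path]

def select_schemas(pattern, available):
    """Return the schemas whose full names match the regular expression."""
    return [s for s in available if re.fullmatch(pattern, s)]

jars = split_jar_list("/opt/serde/custom-serde.jar;/opt/serde/json-serde.jar")
schemas = select_schemas(r"sales_.*", ["sales_emea", "sales_apac", "finance"])
```

With these sample inputs, `jars` holds two paths and `schemas` contains only the two schemas whose names start with `sales_`.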

- Password. Password to log in to the Informatica domain.
- Security Domain. Name of the security domain.
- Host. Host name for the Data Integration Service.
- Port. Port number for the Data Integration Service.
- Profiling Run Options. Choose whether Live Data Map performs column profiling, domain discovery, or both column profiling and domain discovery. If you select domain discovery or both column profiling and domain discovery, provide the following details:
  - Data Domains: click Select... to select the required data domains from the Select Data Domains dialog box. Live Data Map runs profiling on the selected data domains.
  - Data Domain Match Criteria: select one of the following options to specify the criteria for matching data domains during profiling:
    - Percentage: select this option to specify the minimum conformance percentage of data to be eligible for a data domain match. The conformance percentage is the number of matching rows divided by the total number of rows. Specify the value in the Minimum Conformance Percentage box.
    - Rows: select this option to specify the minimum number of rows needed to be eligible for a data domain match. Specify the value in the Minimum Row Count for Data Domain Match box.
  - Exclude Null Values from Data Domain Discovery: select this option to specify that null values must be excluded when performing data domain discovery.
  - Column Name Match: select this option to specify that profiling must be run on column titles.
- Priority. Select High or Low to specify the priority for profiling. Live Data Map runs profiling on priority for resources where the profiling priority is set to High.
- Sampling Option. Sampling options determine the number of rows on which Live Data Map runs a profile. You can configure sampling options when you define a profile or when you run a profile. Select any of the following options:
  - First N Rows: specifies that profiling must be run on the first <N> number of rows. Specify the number of rows in the Number of First N Sampling Rows box.
  - All Rows: specifies that profiling must be run on all rows.
- Exclude Views. Select this option to specify that views must be excluded from profiling.
- Incremental Profiling. This property is not in use for a Hive resource. Live Data Map runs profiling for the whole source every time.
- Source Connection Name. Event Date Records (EDR) name for the source connection. Click Select... to choose the EDR name for the source connection from the Select Source Connection dialog box.
- Run On. Specify where the Hive resource must run by selecting one of the following options:
  - Hadoop: Select this option to specify that the Hive resource must be run on Hadoop using Informatica Blaze. If you select this option, you must specify the Hadoop Connection Name. Click Select... and select the Hadoop connection name from the Select Hadoop Connection Name dialog box.
  - Native: Select this option to specify that the Hive resource must be run on the Hive engine.
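The Percentage and Rows match criteria above can be sketched as a simple check. The function below is illustrative only; the names and signature are assumptions for illustration and are not part of the Live Data Map API.

```python
# Sketch of the data domain match criteria described above: a column
# matches a data domain either when its conformance percentage (matching
# rows divided by total rows) meets a minimum, or when the count of
# matching rows meets a minimum. Illustrative, not product code.

def domain_matches(matching_rows, total_rows,
                   min_conformance_pct=None, min_row_count=None):
    """Apply either the Percentage or the Rows match criterion."""
    if min_conformance_pct is not None:
        conformance = 100.0 * matching_rows / total_rows
        return conformance >= min_conformance_pct
    if min_row_count is not None:
        return matching_rows >= min_row_count
    return False

# 80 of 100 rows conform, so a 75% minimum conformance threshold is met.
print(domain_matches(80, 100, min_conformance_pct=75))
```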

Informatica Platform Resource Type Properties

Create a resource based on the Informatica Platform resource type to extract metadata from the Model repository. You need to specify the Data Integration Service connection details when you configure the resource. The following table describes the Data Integration Service connection properties:

- Target version. The version number of the Informatica platform. You can choose any of the available Informatica versions, including HotFix versions.
- Domain Name. Name of the Informatica domain.
- DIS Name. Name of the Data Integration Service.
- Username. Username for the Data Integration Service connection.
- Password. Password for the Data Integration Service connection.
- Security Domain. Name of the LDAP security domain if the Informatica domain contains an LDAP security domain.
- Host. Host name for the Informatica domain.
- Port. Port number of the Informatica domain.
- Application Name. Name of the Data Integration Service application. Click Select... to select the name of the application from the Select Application Name dialog box. Note: This property is applicable if you select Target version as 10.0 or later.
- Param Set for Mappings in Application. Parameter set for mappings configured for the Data Integration Service application. Click Select... to select the parameter set from the Select Param Sets for Mappings in Application dialog box. Note: This property is applicable if you select Target version as 10.0 or later.

The following table describes the Additional and Advanced properties for source metadata settings on the Metadata Load Settings tab:

- Auto assign Connections. Specifies whether the connection must be automatically assigned.
- Memory. Specify the memory value required to run a scanner job. Specify one of the following memory values: Low, Medium, or High. Note: For details about the memory values, see the Tuning Live Data Map Performance How-To Library article.

PowerCenter Resource Type Properties

You can configure a PowerCenter resource type to extract metadata from PowerCenter repository objects. Use PowerCenter to extract data from multiple sources, transform the data according to business logic you build in the client application, and load the transformed data into file and relational targets.

The following table describes the properties for the PowerCenter resource type:

- Gateway Host Name or Address. PowerCenter domain gateway host name or address.
- Gateway Port Number. PowerCenter domain gateway port number.
- Informatica Security Domain. LDAP security domain name if one exists. Otherwise, enter "Native."
- Repository Name. Name of the PowerCenter repository.
- Repository User Name. Username for the PowerCenter repository.
- Repository User Password. Password for the PowerCenter repository.
- PowerCenter Version. PowerCenter repository version.
- PowerCenter Code Page. Code page for the PowerCenter repository.

The following table describes the Additional and Advanced properties for source metadata settings on the Metadata Load Settings tab:

- Parameter File. Specify the parameter file that you want to attach from a local system.
- Auto assign Connections. Specifies whether Live Data Map assigns the connection automatically.
- Repository subset. Enter the file path list separated by semicolons for the Informatica PowerCenter Repository object.
- Memory. Specify the memory value required to run a scanner job. Specify one of the following memory values: Low, Medium, or High. Note: For details about the memory values, see the Tuning Live Data Map Performance How-To Library article.

IBM DB2 Resource Type Properties

You can configure an IBM DB2 resource type to extract metadata from IBM DB2 databases. The following table describes the connection properties for the IBM DB2 resource type:

- User. Name of the user account that connects to the IBM DB2 database.
- Password. Password for the user account that connects to the IBM DB2 database.
- Host. Fully qualified host name of the machine where the IBM DB2 database is hosted.
- Port. Port number for the IBM DB2 database.
- Database. The DB2 connection URL used to access metadata from the database.

The following table describes the Additional and Advanced properties for source metadata settings on the Metadata Load Settings tab:

- Import system objects. Specifies the system objects to import.
- Schema. Specifies a list of database schemas.
- Import stored procedures. Specifies the stored procedures to import.
- Memory. Specify the memory value required to run a scanner job. Specify one of the following memory values: Low, Medium, or High. Note: For details about the memory values, see the Tuning Live Data Map Performance How-To Library article.

The following table describes the Domain Connection and Profile Configuration settings on the Metadata Load Settings tab:

- Configuration. Specify the type of configuration for the Data Integration Service:
  - Custom. Use custom configuration when you want to configure the Data Integration Service options manually.
  - Global. Use global configuration when you want to use the existing Data Integration Service options created by the administrator.
- Domain Name. Name of the Data Integration Service domain.
- Data Integration Service. Name of the Data Integration Service.
- Username. Username to log in to the Data Integration Service.

- Password. Password to log in to the Data Integration Service.
- Security Domain. Name of the security domain.
- Host. Host name for the Data Integration Service.
- Port. Port number for the Data Integration Service.
- Profiling Run Options. Choose whether Live Data Map performs column profiling, data domain discovery, or both.
- Priority. Specify one of the following profile execution priority values: High or Low.
- Sampling Option. Sampling options determine the number of rows that Live Data Map chooses to run a profile on. You can configure sampling options when you define a profile or when you run a profile.
- Number of First N Sampling Rows. The number of rows that you want to run the profile against. Live Data Map chooses the rows from the first rows in the source.
- Random Sampling. Random sample size based on the number of rows in the data object. Random sampling forces Live Data Map to perform drilldown on staged data. Note that this option can impact the drill-down performance.
- Incremental Profiling. Specifies if the profiling must be run only for the changes made to the source object. If you do not select this option, profiling is run for the whole source every time. Make sure that you enable and update database statistics for DB2.
- Source Connection Name. Event Date Records name for the source connection.
- Run On. Specify where the IBM DB2 resource must run by selecting one of the following options:
  - Hadoop: Select this option to specify that the IBM DB2 resource must be run on Hadoop using Informatica Blaze. If you select this option, you must specify the Hadoop Connection Name. Click Select... and select the Hadoop connection name from the Select Hadoop Connection Name dialog box.
  - Native: Select this option to specify that the IBM DB2 resource must be run on the Hive engine.
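The First N Rows and Random Sampling options described above select which rows a profile reads. The sketch below contrasts the two strategies; it is an illustration of the sampling behavior, not Live Data Map code, and the function names are assumptions.

```python
import random

# Illustrative sketch of the two sampling options: profile the first N
# rows of the source, or draw a random sample whose size is based on the
# number of rows in the data object.

def first_n_rows(rows, n):
    """First N Rows: take rows from the start of the source."""
    return rows[:n]

def random_sample(rows, n, seed=None):
    """Random sampling: draw n rows at random from the data object."""
    rng = random.Random(seed)
    return rng.sample(rows, min(n, len(rows)))

rows = list(range(1000))
profile_input = first_n_rows(rows, 100)   # the first 100 rows
sampled_input = random_sample(rows, 100, seed=42)
```

As the guide notes, random sampling operates on staged data and can therefore affect drill-down performance, whereas first-N sampling simply stops reading after N rows.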

IBM DB2 for z/OS Resource Type Properties

Configure an IBM DB2 for z/OS resource type to extract metadata from IBM DB2 for z/OS databases. The following table describes the properties for the IBM DB2 for z/OS resource type:

- Location. Node name in the dbmover.cfg file on the machine where the Catalog Service runs that points to the PowerExchange Listener on the z/OS system. Note: Live Data Map uses PowerExchange for DB2 for z/OS to access metadata from z/OS subsystems.
- User. Name of the user account that connects to the IBM DB2 for z/OS database.
- Password. Password for the user account that connects to the IBM DB2 for z/OS database.
- Encoding. Code page for the IBM DB2 for z/OS subsystem.
- Sub System ID. Name of the DB2 subsystem.

The following table describes the Additional property for source metadata settings on the Metadata Load Settings tab:

- Schema. Specifies a list of database schemas.

The following table describes the Domain Connection and Profile Configuration settings on the Metadata Load Settings tab:

- Configuration. Specify the type of configuration for the Data Integration Service:
  - Custom. Use custom configuration when you want to configure the Data Integration Service options manually.
  - Global. Use global configuration when you want to use the existing Data Integration Service options created by the administrator.
- Domain Name. Name of the Data Integration Service domain.
- Data Integration Service. Name of the Data Integration Service.
- Username. Username to log in to the Data Integration Service.
- Password. Password to log in to the Data Integration Service.
- Security Domain. Name of the security domain.
- Host. Host name for the Data Integration Service.
- Port. Port number for the Data Integration Service.

- Profiling Run Options. Choose whether Live Data Map performs column profiling, data domain discovery, or both.
- Priority. Specify one of the following profile execution priority values: High or Low.
- Sampling Option. Sampling options determine the number of rows that Live Data Map chooses to run a profile on. You can configure sampling options when you define a profile or when you run a profile.
- Number of First N Sampling Rows. The number of rows that you want to run the profile against. Live Data Map chooses the rows from the first rows in the source.
- Random Sampling. Random sample size based on the number of rows in the data object. Random sampling forces Live Data Map to perform drilldown on staged data. Note that this option can impact the drill-down performance.
- Incremental Profiling. Specifies if the profiling must be run only for the changes made to the source object. If you do not select this option, profiling is run for the whole source every time. Make sure that you enable and update database statistics for DB2.
- Source Connection Name. Event Date Records name for the source connection.
- Run On. Specify where the IBM DB2 for z/OS resource must run by selecting one of the following options:
  - Hadoop: Select this option to specify that the IBM DB2 for z/OS resource must be run on Hadoop using Informatica Blaze. If you select this option, you must specify the Hadoop Connection Name. Click Select... and select the Hadoop connection name from the Select Hadoop Connection Name dialog box.
  - Native: Select this option to specify that the IBM DB2 for z/OS resource must be run on the Hive engine.

IBM Netezza Resource Type Properties

You need to set up multiple configuration properties when you create a resource to extract metadata from IBM Netezza databases. The following table describes the connection properties for the IBM Netezza resource type:

- Host. Host name or IP address of the machine where the database management server runs.
- Port. Port number for the Netezza database.
- User. Name of the user account used to connect to the Netezza database.

- Password. Password for the user account used to connect to the Netezza database.
- Database. ODBC data source connect string for a Netezza database. Enter the data source name of the Netezza DSN if you created one.

The following table describes the Additional and Advanced properties for source metadata settings on the Metadata Load Settings tab:

- Schema. Specifies a list of semicolon-separated database schemas.
- Memory. Specify the memory value required to run a scanner job. Specify one of the following memory values: Low, Medium, or High. Note: For details about the memory values, see the Tuning Live Data Map Performance How-To Library article.

The following table describes the Domain Connection and Profile Configuration settings on the Metadata Load Settings tab:

- Configuration. Specify the type of configuration for the Data Integration Service:
  - Custom. Use custom configuration when you want to configure the Data Integration Service options manually.
  - Global. Use global configuration when you want to use the existing Data Integration Service options created by the administrator.
- Domain Name. Name of the Data Integration Service domain.
- Data Integration Service. Name of the Data Integration Service.
- Username. Username to log in to the Data Integration Service.
- Password. Password to log in to the Data Integration Service.
- Security Domain. Name of the security domain.
- Host. Host name for the Data Integration Service.
- Port. Port number for the Data Integration Service.
- Profiling Run Options. Choose whether Live Data Map performs column profiling, data domain discovery, or both.

- Priority. Specify one of the following profile execution priority values: High or Low.
- Sampling Option. Sampling options determine the number of rows that Live Data Map chooses to run a profile on. You can configure sampling options when you define a profile or when you run a profile.
- Number of First N Sampling Rows. The number of rows that you want to run the profile against. Live Data Map chooses the rows from the first rows in the source.
- Random Sampling. Random sample size based on the number of rows in the data object. Random sampling forces Live Data Map to perform drilldown on staged data. Note that this option can impact the drill-down performance.
- Incremental Profiling. This property is not in use for an IBM Netezza resource. Live Data Map runs profiling for the whole source every time.
- Source Connection Name. Event Date Records name for the source connection.
- Run On. Specify where the IBM Netezza resource must run by selecting one of the following options:
  - Hadoop: Select this option to specify that the IBM Netezza resource must be run on Hadoop using Informatica Blaze. If you select this option, you must specify the Hadoop Connection Name. Click Select... and select the Hadoop connection name from the Select Hadoop Connection Name dialog box.
  - Native: Select this option to specify that the IBM Netezza resource must be run on the Hive engine.

JDBC Resource Type Properties

You can use a JDBC connection to access tables in a database. The following table describes the connection properties for the JDBC resource type:

- Driver class. Name of the JDBC driver class.
- URL. Connection string to connect to the database.
- User. Database username.
- Password. Password for the database username.

Note: To extract metadata from multiple schemas of a source using the JDBC resource type, you can specify semicolon-separated schema names when you create the resource. You can type in the multiple schema names in the Schema field on the Metadata Load Settings page.
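Typical values for the JDBC connection properties might look like the following. The driver class and URL shown are for a hypothetical PostgreSQL source and are purely illustrative; substitute the driver class and connection string for your own database.

```python
# Example values for a JDBC resource. Everything below is a placeholder
# assumption for illustration, not configuration from this guide.

jdbc_resource = {
    "Driver class": "org.postgresql.Driver",
    "URL": "jdbc:postgresql://db-host.example.com:5432/sales",
    "User": "catalog_user",
    # Semicolon-separated schema names, as entered in the Schema field
    # on the Metadata Load Settings page.
    "Schema": "public;reporting;staging",
}

# Splitting the Schema field yields the individual schemas to extract.
schemas = jdbc_resource["Schema"].split(";")
```

The semicolon-separated Schema value above corresponds to extracting metadata from the `public`, `reporting`, and `staging` schemas of one source.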

The following table describes the Additional and Advanced properties for source metadata settings on the Metadata Load Settings tab:

- Catalog. Catalog name. Note: You cannot use the Catalog option for JDBC or ODBC sources.
- Schema. Specifies a list of schemas to import.
- Case sensitivity. Specify if the database is set to a case-sensitivity mode when the property is set to Auto.
- View definition extracting SQL. Specifies the database-specific SQL query to retrieve the view definition text.
- Synonyms lineage SQL. Specifies the database-specific SQL query to retrieve the synonym lineage. The query returns the following two columns: Full Synonym Name and Full Table Name.
- Optional Scope. Specifies the database object types to import, such as Tables and Views, Indexes, and Procedures. Specify a list of optional database object types that you want to import. The list can have zero or more database object types, separated by semicolons. For example, Keys and Indexes, and Stored Procedures.
- Import stored procedures. Specifies the stored procedures to import. The default value is True or False, depending on the source.
- Memory. Specify the memory value required to run a scanner job. Specify one of the following memory values: Low, Medium, or High. Note: For details about the memory values, see the Tuning Live Data Map Performance How-To Library article.

The following table describes the Domain Connection and Profile Configuration settings on the Metadata Load Settings tab:

- Configuration. Specify the type of configuration for the Data Integration Service:
  - Custom. Use custom configuration when you want to configure the Data Integration Service options manually.
  - Global. Use global configuration when you want to use the existing Data Integration Service options created by the administrator.
- Domain Name. Name of the Data Integration Service domain.
- Data Integration Service. Name of the Data Integration Service.
- Username. Username to log in to the Data Integration Service.
- Password. Password to log in to the Data Integration Service.

- Security Domain. Name of the security domain.
- Host. Host name for the Data Integration Service.
- Port. Port number for the Data Integration Service.
- Profiling Run Options. Choose whether Live Data Map performs column profiling, data domain discovery, or both.
- Priority. Specify one of the following profile execution priority values: High or Low.
- Sampling Option. Sampling options determine the number of rows that Live Data Map chooses to run a profile on. You can configure sampling options when you define a profile or when you run a profile.
- Number of First N Sampling Rows. The number of rows that you want to run the profile against. Live Data Map chooses the rows from the first rows in the source.
- Random Sampling. Random sample size based on the number of rows in the data object. Random sampling forces Live Data Map to perform drilldown on staged data. Note that this option can impact the drill-down performance.
- Incremental Profiling. This property is not in use for a JDBC resource. Live Data Map runs profiling for the whole source every time.
- Source Connection Name. Event Date Records name for the source connection.
- Run On. Specify where the JDBC resource must run by selecting one of the following options:
  - Hadoop: Select this option to specify that the JDBC resource must be run on Hadoop using Informatica Blaze. If you select this option, you must specify the Hadoop Connection Name. Click Select... and select the Hadoop connection name from the Select Hadoop Connection Name dialog box.
  - Native: Select this option to specify that the JDBC resource must be run on the Hive engine.

SQL Server Resource Type Properties

You can configure an SQL Server resource type to extract metadata from Microsoft SQL Server databases. Make sure that you configure the VIEW DEFINITION permission for the SQL Server database and the SELECT permission for the sys.sql_expression_dependencies view of the database.

The following table describes the properties for the SQL Server resource type:

- User. Name of the SQL Server user account that connects to the Microsoft SQL Server database. The Catalog Service uses SQL Server authentication to connect to the Microsoft SQL Server database.
- Password. Password for the user account that connects to the Microsoft SQL Server database.
- Host. Host name of the machine where Microsoft SQL Server runs.
- Port. Port number for the SQL Server database engine service.
- Database. Name of the SQL Server database.
- Instance. SQL Server instance name.

The following table describes the Additional and Advanced properties for source metadata settings on the Metadata Load Settings tab:

- Import system objects. Specifies the system objects to import. The default value is True or False, depending on the source.
- Schema. Specifies a list of semicolon-separated database schemas.
- Import stored procedures. Specifies the stored procedures to be imported. The default value is True or False, depending on the source.
- Memory. Specify the memory value required to run a scanner job. Specify one of the following memory values: Low, Medium, or High. Note: For details about the memory values, see the Tuning Live Data Map Performance How-To Library article.

The following table describes the Domain Connection and Profile Configuration settings on the Metadata Load Settings tab:

- Configuration. Specify the type of configuration for the Data Integration Service:
  - Custom. Use custom configuration when you want to configure the Data Integration Service options manually.
  - Global. Use global configuration when you want to use the existing Data Integration Service options created by the administrator.
- Domain Name. Name of the Data Integration Service domain.
- Data Integration Service. Name of the Data Integration Service.
- Username. Username to log in to the Data Integration Service.
- Password. Password to log in to the Data Integration Service.
- Security Domain. Name of the security domain.
- Host. Host name for the Data Integration Service.
- Port. Port number for the Data Integration Service.
- Profiling Run Options. Choose whether Live Data Map performs column profiling, data domain discovery, or both.
- Priority. Specify one of the following profile execution priority values: High or Low.
- Sampling Option. Sampling options determine the number of rows that Live Data Map chooses to run a profile on. You can configure sampling options when you define a profile or when you run a profile.
- Number of First N Sampling Rows. The number of rows that you want to run the profile against. Live Data Map chooses the rows from the first rows in the source.
- Random Sampling. Random sample size based on the number of rows in the data object. Random sampling forces Live Data Map to perform drilldown on staged data. Note that this option can impact the drill-down performance.
- Incremental Profiling. Specifies if the profiling must be run only for the changes made to the source object. If you do not select this option, profiling is run for the whole source every time. Make sure that you enable and update database statistics for the SQL Server database.

- Source Connection Name. Event Date Records name for the source connection.
- Run On. Specify where the SQL Server resource must run by selecting one of the following options:
  - Hadoop: Select this option to specify that the SQL Server resource must be run on Hadoop using Informatica Blaze. If you select this option, you must specify the Hadoop Connection Name. Click Select... and select the Hadoop connection name from the Select Hadoop Connection Name dialog box.
  - Native: Select this option to specify that the SQL Server resource must be run on the Hive engine.

Oracle Resource Type Properties

Configure an Oracle resource type to extract metadata from Oracle databases. The following table describes the properties for the Oracle resource type:

- User. Name of the user account that connects to the Oracle database.
- Password. Password for the user account that connects to the Oracle database.
- Host. Fully qualified host name of the machine where the Oracle database is hosted.
- Port. Port number for the Oracle database engine service.
- Service. Unique identifier or system identifier for the Oracle database server.

The following table describes the Additional and Advanced properties for source metadata settings on the Metadata Load Settings tab:

- Import system objects. Specifies the system objects to import. The default value is True or False, depending on the source.
- Schema. Specifies a list of semicolon-separated database schemas.
- Import stored procedures. Specifies the stored procedures to import. The default value is True or False, depending on the source.
- Memory. Specify the memory value required to run a scanner job. Specify one of the following memory values: Low, Medium, or High. Note: For details about the memory values, see the Tuning Live Data Map Performance How-To Library article.

50 The following table describes the Domain Connection and Profile Configuration settings on the Metadata Load Settings tab:

- Configuration. Specify the type of configuration for the Data Integration Service:
  - Custom. Use custom configuration when you want to configure the Data Integration Service options manually.
  - Global. Use global configuration when you want to use the existing Data Integration Service options created by the administrator.
- Domain Name. Name of the Data Integration Service domain.
- Data Integration Service. Name of the Data Integration Service.
- Username. Username to log in to the Data Integration Service.
- Password. Password to log in to the Data Integration Service.
- Security Domain. Name of the security domain.
- Host. Host name for the Data Integration Service.
- Port. Port number for the Data Integration Service.
- Profiling Run Options. Choose whether Live Data Map performs column profiling, data domain discovery, or both.
- Priority. Specify the profile execution priority value: High or Low.
- Sampling Option. Sampling options determine the number of rows that Live Data Map chooses to run a profile on. You can configure sampling options when you define a profile or when you run a profile.
- Number of First N Sampling Rows. The number of rows that you want to run the profile against. Live Data Map chooses the rows starting from the first row in the source.
- Random Sampling. Random sample size based on the number of rows in the data object. Random sampling forces Live Data Map to perform drill-down on staged data. Note that this option can impact drill-down performance.
- Incremental Profiling. Specifies whether profiling runs only on the changes made to the source object. If you do not select this option, profiling runs on the whole source every time. Make sure that you enable and update database statistics for Oracle.

51 
- Source Connection Name. Records the name for the source connection.
- Run On. Specify where the Oracle resource must run by selecting one of the following options:
  - Hadoop: Select this option to specify that the Oracle resource must be run on Hadoop using Informatica Blaze. If you select this option, you must specify the Hadoop Connection Name. Click Select... and select the Hadoop connection name from the Select Hadoop Connection Name dialog box.
  - Native: Select this option to specify that the Oracle resource must be run on the Hive engine.
- Event Date.

Sybase Resource Type Properties

You can configure a Sybase resource type to extract metadata from Sybase databases.

The following table describes the properties for the Sybase resource type:

- Host. Host name of the machine where the Sybase database is hosted.
- Port. Port number for the Sybase database engine service.
- User. Database user name.
- Password. The password for the database user name.
- Database. Name of the database.

The following table describes the Additional and Advanced properties for source metadata settings on the Metadata Load Settings tab:

- Schema. Specifies a list of databases or schemas to import.
- Import stored procedures. Specifies whether to import stored procedures. The default value is True or False, as applicable.
- Memory. Specify the memory value required to run a scanner job: Low, Medium, or High. Note: For details about the memory values, see the Tuning Live Data Map Performance How-To Library article.

52 The following table describes the Domain Connection and Profile Configuration settings on the Metadata Load Settings tab:

- Configuration. Specify the type of configuration for the Data Integration Service:
  - Custom. Use custom configuration when you want to configure the Data Integration Service options manually.
  - Global. Use global configuration when you want to use the existing Data Integration Service options created by the administrator.
- Domain Name. Name of the Data Integration Service domain.
- Data Integration Service. Name of the Data Integration Service.
- Username. Username to log in to the Data Integration Service.
- Password. Password to log in to the Data Integration Service.
- Security Domain. Name of the security domain.
- Host. Host name for the Data Integration Service.
- Port. Port number for the Data Integration Service.
- Profiling Run Options. Choose whether Live Data Map performs column profiling, data domain discovery, or both.
- Priority. Specify the profile execution priority value: High or Low.
- Sampling Option. Sampling options determine the number of rows that Live Data Map chooses to run a profile on. You can configure sampling options when you define a profile or when you run a profile.
- Number of First N Sampling Rows. The number of rows that you want to run the profile against. Live Data Map chooses the rows starting from the first row in the source.
- Random Sampling. Random sample size based on the number of rows in the data object. Random sampling forces Live Data Map to perform drill-down on staged data. Note that this option can impact drill-down performance.
- Incremental Profiling. This property is not in use for a Sybase resource. Live Data Map runs profiling for the whole source every time.

53 
- Source Connection Name. Records the name for the source connection.
- Run On. Specify where the Sybase resource must run by selecting one of the following options:
  - Hadoop: Select this option to specify that the Sybase resource must be run on Hadoop using Informatica Blaze. If you select this option, you must specify the Hadoop Connection Name. Click Select... and select the Hadoop connection name from the Select Hadoop Connection Name dialog box.
  - Native: Select this option to specify that the Sybase resource must be run on the Hive engine.
- Event Date.

Teradata Resource Type Properties

Teradata is one of the resource types in Live Data Map. Configure a Teradata resource type to extract metadata from Teradata databases.

The following table describes the properties for the Teradata resource type:

- User. Name of the user account that connects to the Teradata database.
- Password. Password for the user account that connects to the Teradata database.
- Host. Fully qualified host name of the machine where the Teradata database is hosted.

The following table describes the Additional and Advanced properties for source metadata settings on the Metadata Load Settings tab:

- Import system objects. Specifies whether to import system objects. The default value is True or False, as applicable.
- Schema. Specifies a semicolon-separated list of database schemas.
- Import stored procedures. Specifies whether to import stored procedures. The default value is True or False, as applicable.
- Fetch Views Data Types. Specifies that the data types of views must be imported.
- Memory. Specify the memory value required to run a scanner job: Low, Medium, or High. Note: For details about the memory values, see the Tuning Live Data Map Performance How-To Library article.

54 The following table describes the Domain Connection and Profile Configuration settings on the Metadata Load Settings tab:

- Configuration. Specify the type of configuration for the Data Integration Service:
  - Custom. Use custom configuration when you want to configure the Data Integration Service options manually.
  - Global. Use global configuration when you want to use the existing Data Integration Service options created by the administrator.
- Domain Name. Name of the Data Integration Service domain.
- Data Integration Service. Name of the Data Integration Service.
- Username. Username to log in to the Data Integration Service.
- Password. Password to log in to the Data Integration Service.
- Security Domain. Name of the security domain.
- Host. Host name for the Data Integration Service.
- Port. Port number for the Data Integration Service.
- Profiling Run Options. Choose whether Live Data Map performs column profiling, data domain discovery, or both.
- Priority. Specify the profile execution priority value: High or Low.
- Sampling Option. Sampling options determine the number of rows that Live Data Map chooses to run a profile on. You can configure sampling options when you define a profile or when you run a profile.
- Number of First N Sampling Rows. The number of rows that you want to run the profile against. Live Data Map chooses the rows starting from the first row in the source.
- Random Sampling. Random sample size based on the number of rows in the data object. Random sampling forces Live Data Map to perform drill-down on staged data. Note that this option can impact drill-down performance.
- Incremental Profiling. This property is not in use for a Teradata resource. Live Data Map runs profiling for the whole source every time.

55 
- Source Connection Name. Records the name for the source connection.
- Run On. Specify where the Teradata resource must run by selecting one of the following options:
  - Hadoop: Select this option to specify that the Teradata resource must be run on Hadoop using Informatica Blaze. If you select this option, you must specify the Hadoop Connection Name. Click Select... and select the Hadoop connection name from the Select Hadoop Connection Name dialog box.
  - Native: Select this option to specify that the Teradata resource must be run on the Hive engine.
- Event Date.

SAP Business Objects Resource Type Properties

You can configure an SAP Business Objects resource type to extract metadata from SAP Business Objects. SAP Business Objects is a business intelligence tool that includes components for performance management, planning, reporting, analysis, and enterprise information management.

The following table describes the properties for the Business Objects resource type:

- Agent URL. Host name and port number of the MITI Agent.
- Version. Version of the SAP Business Objects repository.
- System. Name of the Business Objects repository. For Business Objects 11.x and 12.x, specify the name of the Business Objects Central Management Server. Specify the server name in the following format: <server name>:<port number>. If the Central Management Server is configured on a cluster, specify the cluster name in the following format: <host name>:<port>@<cluster name>. Default port is . Note: If the version of the Business Objects repository is , do not specify a port number in the repository name. If you specify the port number, Live Data Map cannot extract the Web Intelligence reports.
- Authentication mode. The authentication mode for the user account that logs in to the Business Objects repository. Specify one of the following values:
  - Enterprise. Log in using the Business Objects Enterprise authentication mode.
  - LDAP. Log in using LDAP authentication configured to Business Objects.
  Default is Enterprise.
- User Name. User name to log in to the Business Objects repository.
- Password. Password of the user account for the Business Objects repository.

56 
- Incremental import. Loads changes after the previous resource load or loads complete metadata. Specify one of the following values:
  - True. Loads only the recent changes.
  - False. Performs a complete load of the metadata.
- Add dependent objects. Choose the documents that depend on the universe you selected. Specify one of the following values:
  - True. Imports the documents that depend on the specified universe.
  - False. Ignores the documents that depend on the specified universe.
  Note: Dependency information is retrieved from the Business Objects repository metadata cache. If the Live Data Map load does not reflect modified or moved reports, refresh the cache by loading these reports and refreshing the queries.
- Add specific objects. Specifies additional objects to the universe. Specify one of the following values:
  - None. Ignores all objects.
  - Universe independent Documents. Imports documents that do not depend on any universe.
  Default is None.
- Crystal CORBA port. Specifies the client port number on which the Crystal SDK communicates with the Report Application Server (RAS). The RAS server uses the port to send metadata to the local client computer. If you do not specify a port, the server randomly selects a port for each execution.
- Class representation. Controls how the import of the tree structure of classes and subclasses occurs. The MITI Agent imports each class containing objects as a dimension or as a tree of packages. Specify one of the following values:
  - As a flat structure. Creates no packages.
  - As a simplified flat structure. Creates a package for each class with a subclass.
  - As a full tree structure. Creates a package for each class.
  Default is As a flat structure.
- Worker Threads. Number of worker threads that the MITI Agent uses to extract metadata asynchronously. Leave blank or enter a positive integer value. If left blank, the MITI Agent calculates the number of worker threads. The MITI Agent uses the JVM architecture and the number of available CPU cores on the MITI Agent machine to calculate the number of threads. If you specify a value that is not valid, the MITI Agent uses one worker thread. Reduce the number of worker threads if the MITI Agent generates out-of-memory errors during metadata extraction. Increase the number of worker threads if the MITI Agent machine has a large amount of available memory, for example, 10 GB or more. If you specify too many worker threads, performance can decrease. Default is blank.
- Auto Assign Connections. Choose to automatically assign the database schemas to the resource that you create for the SAP Business Objects source.
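The Worker Threads behavior described above (blank derives a default from the machine, invalid values fall back to one thread) can be sketched as follows. The exact heuristic the MITI Agent uses is not documented here, so this Python sketch only illustrates the stated fallback rules, using the CPU-core count as an assumed default:

```python
import os

def resolve_worker_threads(configured=None):
    # Blank: derive a default from the number of available CPU cores
    # (an assumption standing in for the MITI Agent's own calculation).
    if configured is None or str(configured).strip() == "":
        return max(1, os.cpu_count() or 1)
    # A value that is not a positive integer falls back to one thread.
    try:
        value = int(configured)
    except (TypeError, ValueError):
        return 1
    return value if value > 0 else 1

print(resolve_worker_threads("8"))   # 8
print(resolve_worker_threads("-2"))  # 1
print(resolve_worker_threads("abc")) # 1
```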

57 The following table describes the Additional and Advanced properties for source metadata settings on the Metadata Load Settings tab:

- Repository browsing mode. Specifies the available objects in the SAP Business Objects repository.
- Repository subset. Specifies the objects stored in a remote SAP Business Objects repository.
- Memory. Specify the memory value required to run a scanner job: Low, Medium, or High. Note: For details about the memory values, see the Tuning Live Data Map Performance How-To Library article.

Salesforce Resource Type Properties

Use a Salesforce connection to connect to a Salesforce object. The Salesforce connection is an application connection type.

The following table describes the connection properties for the Salesforce resource type:

- Username. Salesforce username.
- Password. Password for the Salesforce username.
- Service_URL. URL of the Salesforce service that you want to access.

The following table describes the Advanced property for source metadata settings on the Metadata Load Settings tab:

- Memory. Specify the memory value required to run a scanner job: Low, Medium, or High. Note: For details about the memory values, see the Tuning Live Data Map Performance How-To Library article.

58 The following table describes the Domain Connection and Profile Configuration settings on the Metadata Load Settings tab:

- Configuration. Specify the type of configuration for the Data Integration Service:
  - Custom. Use custom configuration when you want to configure the Data Integration Service options manually.
  - Global. Use global configuration when you want to use the existing Data Integration Service options created by the administrator.
- Domain Name. Name of the Data Integration Service domain.
- Data Integration Service. Name of the Data Integration Service.
- Username. Username to log in to the Data Integration Service.
- Password. Password to log in to the Data Integration Service.
- Security Domain. Name of the security domain.
- Host. Host name for the Data Integration Service.
- Port. Port number for the Data Integration Service.
- Profiling Run Options. Choose whether Live Data Map performs column profiling, data domain discovery, or both.
- Priority. Specify the profile execution priority value: High or Low.
- Sampling Option. Sampling options determine the number of rows that Live Data Map chooses to run a profile on. You can configure sampling options when you define a profile or when you run a profile.
- Number of First N Sampling Rows. The number of rows that you want to run the profile against. Live Data Map chooses the rows starting from the first row in the source.
- Random Sampling. Random sample size based on the number of rows in the data object. Random sampling forces Live Data Map to perform drill-down on staged data. Note that this option can impact drill-down performance.
- Incremental Profiling. This property is not in use for a Salesforce resource. Live Data Map runs profiling for the whole source every time.

59 
- Source Connection Name. Records the name for the source connection.
- Run On. Specify where the Salesforce resource must run by selecting one of the following options:
  - Hadoop: Select this option to specify that the Salesforce resource must be run on Hadoop using Informatica Blaze. If you select this option, you must specify the Hadoop Connection Name. Click Select... and select the Hadoop connection name from the Select Hadoop Connection Name dialog box.
  - Native: Select this option to specify that the Salesforce resource must be run on the Hive engine.
- Event Date.

Creating a Resource

When you create a resource, you can specify the resource type, the type of metadata that Live Data Map extracts, and an optional schedule for the resource. You can also assign custom attributes to the resource when you create it.

1. Click New > Resource. The New Resource wizard appears.
2. Enter a name and an optional description for the resource.
3. Click Choose to open the Select Resource Type dialog box.
4. Select a resource type, and click Select. More fields appear in the wizard based on the resource type you selected.
5. Based on the resource type, configure the connection properties.
6. Click Next to move to the Metadata Load Settings page. You can choose the type of metadata that you want Live Data Map to extract from the source systems.
7. Configure the required source metadata and profiling metadata parameters.
8. Click Next to move to the Custom Attributes page, and configure the attribute settings. You can select the custom attributes that you want to associate with the resource.
9. Click Next to go to the Schedule page.
10. Optionally, select schedules for source metadata load and profiling metadata. You can create a global schedule if required.
11. Click Save, or click Save and Run.

Creating a Resource 59

60 Editing a Resource

You can make changes to a resource after you create it. You can change the settings, such as the connection properties, custom attributes, source metadata settings, profile metadata settings, and the attached schedule. You cannot change the name of the resource and its resource type after you create it.

1. From the Live Data Map Administrator header, click Open. The Library workspace opens.
2. In the resource list, point to a resource, and select Edit from the control menu. The resource details appear on a new tab.
3. Make changes to the description, connection properties, custom attributes, source metadata settings, profile metadata settings, and schedule as required.
4. To save the changes without running the resource again, click Save.
5. To save the changes and run the scan, click Save and Run.

Running a Scan on a Resource

You can run a scan on a resource either as part of the resource schedule or manually as a one-time task based on your requirement.

1. From the Live Data Map Administrator header, click Open. The Library workspace opens.
2. In the resource list, point to a resource, and select Run from the control menu. The resource details appear on a new page with the Monitoring tab enabled.

Viewing a Resource

You can view the list of resources on the Library tab. You can launch a read-only view of a specific resource. You can also edit a resource from the resource list.

1. From the Live Data Map Administrator header, click Open. The Library workspace opens.
2. In the resource list, point to a resource, and select Open from the control menu. A read-only view of the resource details appears on the General tab. You can see multiple tabs for the resource details.
3. Click each tab to view more information about the resource.

61 C H A P T E R 5 Managing Schedules

This chapter includes the following topics:
- Managing Schedules Overview, 61
- Schedule Types, 61
- Creating a Schedule, 62
- Viewing the List of Schedules, 62

Managing Schedules Overview

Schedules determine when scanners extract metadata from sources. You can have recurring daily, weekly, and monthly schedules to extract metadata at regular intervals.

Create a reusable schedule if you want to assign multiple resources to the same schedule. If you choose to have a reusable schedule for metadata extraction, you can select from a list of existing schedules or create a different reusable schedule that meets your requirements. Create custom schedules that you can assign to specific resources. You can assign separate schedules to resources to extract source metadata and profiling metadata. When you create a schedule, you can choose to have a schedule without an end date or one that recurs until a specific date.

Schedule Types

You can create reusable or custom schedules that meet the frequency requirements for each resource to extract metadata. You can attach more than one resource to a reusable schedule. You can create a custom schedule if you need a separate schedule specific to a single resource.

Reusable Schedules

Source systems might have changes to the metadata at different times. The changes can include newer data assets being added to the source or updates to the existing data assets. You can set up a reusable schedule that you can assign to multiple resources so that you continue to extract these source changes at regular intervals.

62 Custom Schedules

Create a custom schedule if none of the existing reusable schedules match the metadata extraction schedule for the resource. You can create a custom schedule when you create a resource. Create a daily, weekly, or monthly custom schedule for a resource. You can create an indefinite custom schedule or a schedule that ends by a specific date.

Creating a Schedule

You can create a schedule when you configure a resource. You can create a reusable schedule using the New menu on the Live Data Map Administrator header.

1. Click New > Reusable Schedule. The New Reusable Schedule wizard appears on the Schedule workspace.
2. Enter a name and an optional description for the schedule.
3. Click the Starts on field to open a calendar, and choose a start date for the schedule.
4. Use the fields to the right of the Starts on field to set up the start time.
5. Choose whether you want to create a daily, weekly, or monthly schedule.
6. Configure the recurrence settings, such as every n days for a daily schedule or day of the week for a monthly schedule. You have different recurrence settings based on the schedule frequency.
7. Choose either an end date or set up the schedule without an end date.
8. Click Save.

Viewing the List of Schedules

You can view the list of schedules on the Library workspace.

1. From the Live Data Map Administrator header, click Open. The Library workspace opens.
2. On the left pane, click Schedule. The list of schedules appears on the right pane.
3. To view the schedule frequency, mouse over the icon at the beginning of the schedule name.
4. To view the complete information or edit the schedule, click the schedule name. The schedule opens in the Schedule workspace.
5. To make changes to the schedule, click Edit.

62 Chapter 5: Managing Schedules
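The recurrence settings described in this chapter (a start date and time, an every-n-days interval, and an optional end date) amount to a simple schedule computation. This is an illustrative Python sketch of computing the next run of a daily every-n-days schedule, not Live Data Map's scheduler code:

```python
from datetime import datetime, timedelta

def next_run(start, every_n_days, after, end=None):
    # Next occurrence of a daily every-n-days schedule strictly after
    # the moment `after`. Returns None if the schedule has ended.
    if after < start:
        candidate = start
    else:
        elapsed = (after - start).days
        periods = elapsed // every_n_days + 1
        candidate = start + timedelta(days=periods * every_n_days)
    if end is not None and candidate > end:
        return None   # schedule recurs only until its end date
    return candidate

start = datetime(2016, 6, 1, 2, 0)   # starts June 1 at 02:00
print(next_run(start, 3, datetime(2016, 6, 5)))  # 2016-06-07 02:00:00
```

A schedule without an end date simply passes `end=None`, so the computation never returns None.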

63 C H A P T E R 6 Managing Attributes

This chapter includes the following topics:
- Managing Attributes Overview, 63
- System Attributes, 63
- Custom Attributes, 64
- General Attribute Properties, 64
- Search Configuration Properties, 64
- Editing a System Attribute, 65
- Creating a Custom Attribute, 65

Managing Attributes Overview

Attributes are metadata properties that scanners extract from different source systems. System attributes are predefined properties that scanners use for default resource types. You can create custom attributes that you can configure and assign to specific resources.

Based on the business requirements, you can choose to assign custom attributes to resources. For example, you might want to assign a business glossary term or category titled City or Department to a resource. When you create a custom attribute, you can configure the basic attribute properties and search behavior. For example, you can select a specific data type, such as Date, Decimal, City, Department, or User. You can also make the attribute a search filter in Enterprise Information Catalog, where users search for the required enterprise metadata.

System Attributes

System attributes represent the different types of metadata that scanners extract from source systems. For example, Author is a system attribute of the String data type that you can assign to a resource. You can configure these predefined attributes in Live Data Map Administrator.

You can use system attributes to filter the search results when you use Enterprise Information Catalog to search the metadata. Use Live Data Map Administrator to configure the search ranking of a system attribute based on the requirement. You can also set up the system attribute so that Enterprise Information Catalog includes the attribute in search filters.

64 Custom Attributes

You can create custom attributes based on the search filters that you need to use in Enterprise Information Catalog, where you search for metadata. Custom attributes help you quickly find specific metadata. For example, you might want to create a custom attribute named Data Center Location in Live Data Map Administrator and assign it to some of the resources. You can then use the custom attribute Data Center Location in Enterprise Information Catalog to quickly filter resources associated with a specific location.

You need to specify a name and data type when you create a custom attribute. You can choose a core data type, such as Decimal, Integer, Date, and Boolean, or an extended data type, such as User.

General Attribute Properties

The general properties for both system attributes and custom attributes constitute the basic properties, such as name and description.

The following table describes the general properties for both system attributes and custom attributes:

- Name. Name of the system attribute or custom attribute.
- Description. Descriptive text about the attribute.
- Data Type. Basic or extended data type for the attribute. Examples are basic data types, such as String and Boolean, and extended data types, such as User and CSV.
- Allow Multiple Value Selection. Displays a multivalued list for the attribute when you use the attribute for metadata search in Live Data Map. You can simultaneously select multiple values from the list. Note: This property does not appear for or apply to the Boolean data type.

Search Configuration Properties

The search configuration properties define how Enterprise Information Catalog uses attributes in metadata search.

The following table describes the search configuration properties for both system attributes and custom attributes:

- Search Rank. Indicates the level of search ranking associated with the attribute. This setting determines the position of the attribute in the search query results of Enterprise Information Catalog.
- Allow filtering. Determines whether Enterprise Information Catalog can use the attribute as a search filter.
- Analyzer name. Name of the analyzer associated with string values. Note: Applies only to the String data type.

64 Chapter 6: Managing Attributes

65 Editing a System Attribute

You can make changes to the search configuration properties of system attributes. You cannot edit the remaining properties, such as Name and Data Type.

1. From the Live Data Map Administrator header, click Manage > Attributes. The Attributes workspace opens.
2. Select a system attribute in the left pane, and click Edit. The fields in the Search Configuration section appear in the edit mode.
3. Make the required changes to the Search Rank, Allow filtering, and Analyzer name properties.
4. Click Save.

Creating a Custom Attribute

Create custom attributes that you want Enterprise Information Catalog users to add to the search filters. Add search filters based on custom attributes in Enterprise Information Catalog to quickly categorize metadata search results.

1. Click Manage > Attributes. The Attributes workspace appears.
2. From the Actions menu, select New. The New Custom Attribute dialog box appears.
3. Enter the name and description for the custom attribute.
4. In the Data Type list, select a data type, such as Integer, String, Boolean, or Date. The data type determines the valid type of values for the custom attribute.
5. In the Search Rank field, choose the level of search ranking for the custom attribute.
6. Choose whether you need to display the custom attribute as a search filter in Enterprise Information Catalog.
7. Optionally, choose the analyzer name for a String data type. The analyzer name determines the analysis method that the Enterprise Information Catalog search engine uses when the search engine performs indexing of string values for the attribute. You can choose STRING, TEXT_GENERAL, or TEXT_TECHNICAL.
8. Next, select the object types that you want to assign to the custom attribute.
9. Click OK to save the changes.

Editing a System Attribute 65
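To make the interaction between the Allow filtering flag and Search Rank concrete, here is a hypothetical Python sketch of a catalog search. The asset names, attribute values, and rank weights are invented for illustration and do not reflect Enterprise Information Catalog internals:

```python
# Hypothetical weights mapping a search rank to a score (invented values).
RANK_WEIGHT = {"high": 3, "medium": 2, "low": 1}

assets = [
    {"name": "sales_db", "attrs": {"Data Center Location": "Austin"}, "rank": "high"},
    {"name": "hr_db",    "attrs": {"Data Center Location": "Boston"}, "rank": "low"},
    {"name": "ops_db",   "attrs": {"Data Center Location": "Austin"}, "rank": "medium"},
]

def search(assets, attr, value):
    # Filtering: keep only assets whose attribute matches the filter value.
    # Ranking: order the hits by the weight of their search rank, highest first.
    hits = [a for a in assets if a["attrs"].get(attr) == value]
    return sorted(hits, key=lambda a: RANK_WEIGHT[a["rank"]], reverse=True)

print([a["name"] for a in search(assets, "Data Center Location", "Austin")])
# ['sales_db', 'ops_db']
```

This mirrors the Data Center Location example from the Custom Attributes section: the filter narrows the result set, and the search rank decides the order of what remains.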

66 C H A P T E R 7 Assigning Connections This chapter includes the following topics: Assigning Connections Overview, 66 Auto-assigned Connections, 66 User-assigned Connections, 67 Managing Connections, 67 Assigning Connections Overview When you run a scan on resources for some of the resource types, you need to ensure that the source connection maps accurately to the schemas from the resource. Live Data Map can automatically detect how the database schemas are assigned to the resources after you run a scan on the resources. You can assign and unassign schemas from resource to connections based on your requirements. The connection management tasks that you perform in Live Data Map Administrator apply only to SAP Business Objects, Informatica Platform, and PowerCenter resource types. If the data asset lineage information does not look accurate in Enterprise Information Catalog, you can troubleshoot the assigned and unassigned connections and make the required corrections in Live Data Map Administrator. You can then verify that the lineage flow is accurate in Enterprise Information Catalog. When you create an Informatica Platform, SAP Business Objects, and PowerCenter resource in Live Data Map Administrator, you can choose to automatically assign database schemas to resources. You can also manually assign the schemas to specific connections. Auto-assigned Connections When you create a resource for an Informatica Platform, SAP Business Objects, and PowerCenter source, you can choose to automatically assign the database schemas to the resource. You can view the list of 66

67 automatically assigned schemas and their connections for each resource. You can assign or unassign schemas in the auto-assigned connections. User-assigned Connections After you create a resource for an Informatica Platform, SAP Business Objects, and PowerCenter source connection and run a scan on it, you can view the resource as a user-assigned connection in Live Data Map Administrator. You can manually assign or unassign schemas to resources based on your requirements. When you manually assign or unassign connections, the status of the connection changes to In Progress. You can refresh the Connection Assignment workspace to view the latest status. Managing Connections You can assign or unassign connections one at a time or select multiple connections to make the changes. 1. From the Live Data Map administrator header, click Manage > Connection Assignment. The Connection Assignment workspace opens. The User Assigned Connections tab is displayed. 2. Use the filters at the top of the page to view the required connections based on the resource type and assignment type. 3. To assign a schema to a resource, select the connection and click Assign on the control menu. The Assign Connection dialog box appears. 4. Select the schema that you want to assign, and click Select. The Assignment Type status changes to In Progress. 5. On the control menu at the top of the page, click Refresh to view the latest status under the Assignment Type column. You can view the latest user-assigned connections and auto-assigned connections when you click Refresh from the control menu. 6. To unassign a connection, select the connection, and click Unassign. The Unassign connection dialog box appears. 7. Click OK. 8. To reassign a connection with another schema, select the connection, and click Reassign. 9. To assign or unassign multiple connections, select the connections, and select Manage Multiple Connections from the control menu at the top of the page. 
   The Assigned Connections dialog box appears.
10. Make the required changes to the connections, and click OK.
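The In Progress status followed by a manual Refresh, as described in the steps above, amounts to polling until the assignment settles. The following sketch models that wait loop generically; `get_assignment_status` is a hypothetical callable standing in for however you read the Assignment Type column, since the guide documents only the UI, not an API:

```python
import time

def wait_for_assignment(get_assignment_status, interval=5.0, timeout=300.0):
    """Poll a status callable until the connection assignment leaves the
    'In Progress' state, mirroring the manual Refresh step in the procedure.
    `get_assignment_status` is a hypothetical stand-in; the product exposes
    this status only through the Connection Assignment workspace."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        status = get_assignment_status()
        if status != "In Progress":
            return status  # e.g. the final assignment state
        time.sleep(interval)
    raise TimeoutError("Connection assignment still in progress after timeout")
```

For example, a status source that reports "In Progress" twice and then "Assigned" causes the loop to return "Assigned" on the third poll.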

Chapter 8: Configuring Reusable Settings

This chapter includes the following topics:

- Reusable Configuration Overview
- General Configuration Properties
- Data Integration Service Connection Properties
- Setting Up a Reusable Data Integration Service Configuration

Reusable Configuration Overview

You need to configure the Data Integration Service settings for a resource to extract profile metadata from source systems. You can create a reusable configuration for scanners that extract profile metadata, and reuse the configuration for multiple resources. A reusable configuration helps you quickly configure multiple resources for the extraction of profile metadata. Specify settings such as the domain name, the Data Integration Service name, and user credentials.

General Configuration Properties

The general properties for a reusable configuration include the name, description, and profiling configuration type.

The following table describes the general properties for a reusable configuration:

Name: Name of the reusable configuration for profile metadata extraction.
Description: Descriptive text about the reusable configuration.
Profiling: Indicates the Data Integration Service configuration for profiling.

Data Integration Service Connection Properties

The Data Integration Service connection properties include the domain information, domain user information, Data Integration Service information, and Model repository information.

The following table describes the Data Integration Service properties for a reusable, global configuration:

Domain Name: Name of the domain. The name must not exceed 128 characters and must be 7-bit ASCII. It cannot contain a space or any of the following characters: ` % * + ; " ? , < > \ /
Data Integration Service: Name of the Data Integration Service associated with the Catalog Service.
User Name: User name to access the Model Repository Service.
Password: Password to access the Model Repository Service.
Security Domain: Name of the security domain to which the Informatica domain user belongs.
Host: Host name of the node running the Model Repository Service.
Port: Port number of the node running the Model Repository Service.

Setting Up a Reusable Data Integration Service Configuration

Use the Manage menu to create a reusable configuration to extract profile metadata from the source systems.

1. From the Live Data Map Administrator header, click Manage > Reusable Configuration.
   The Reusable Configuration workspace opens.
2. From the control menu, click New.
   The New Reusable Configuration dialog box appears.
3. Enter the general properties, such as the name and description.
   The DISOptions option is selected by default in the Profiling field. The Profiling field indicates the configuration type.
4. In the Domain Connection Settings section, configure the domain information, Data Integration Service information, and Model Repository Service information.
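The Domain Name rules above (at most 128 characters, 7-bit ASCII, no space, and none of the listed special characters) can be checked before you submit the configuration. This is a minimal validation sketch based only on the constraints stated in the table; the function name and approach are illustrative, not part of the product:

```python
# Characters the Domain Name property disallows, per the table above,
# plus the space character: ` % * + ; " ? , < > \ /
FORBIDDEN = set('`%*+;"?,<>\\/ ')

def is_valid_domain_name(name: str) -> bool:
    """Return True if `name` satisfies the documented Domain Name rules:
    non-empty, at most 128 characters, 7-bit ASCII, and free of the
    forbidden characters."""
    if not name or len(name) > 128:
        return False
    if any(ord(ch) > 127 for ch in name):  # must be 7-bit ASCII
        return False
    return not any(ch in FORBIDDEN for ch in name)
```

For example, `is_valid_domain_name("Domain_10")` passes, while a name containing a space, a slash, or a non-ASCII character fails.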

Chapter 9: Monitoring Live Data Map

This chapter includes the following topics:

- Monitoring Live Data Map Overview
- Task Status
- Task Distribution
- Monitoring by Resource
- Monitoring by Task
- Applying Filters to Monitor Tasks

Monitoring Live Data Map Overview

Monitoring Live Data Map includes tracking the status and schedule of tasks. You can monitor the duration of the tasks that are running. You can also monitor the resource distribution in terms of the number of resources for each resource type.

The Start workspace displays an overview of the monitoring statistics. You can view the number of resources for each resource type, the task status details, and the task schedule. To perform a detailed analysis of Live Data Map performance, you can open the Monitoring workspace.

The task status information that you can monitor includes the number of tasks and their statuses, such as Complete, Failed, and Running. You can also view the number of tasks for each phase of metadata extraction, such as metadata load, profile executor, and profile result fetcher. Open the log files to troubleshoot Live Data Map tasks and for further scrutiny. You can also filter and group jobs and tasks based on multiple factors.

The following image shows the Monitoring workspace:

Task Status

Tasks can have different statuses based on the stage of the metadata extraction process that the tasks are in. The task status pie chart in the Monitoring workspace represents the different task statuses and the number of tasks in each status. Each task status has a different color in the chart. Place the pointer on the different sections of the pie chart to view the number of tasks for each task status. Click any section of the pie chart to view more task details.

The task status chart displays the following task statuses:

- Canceled. Number of canceled tasks.
- Failed. Number of failed tasks.
- Queued. Number of tasks that are in the queue for a run.
- Running. Number of tasks that are running.
- Complete. Number of tasks that have completed successfully.

Task Distribution

The task distribution pie chart in the Monitoring workspace displays a summary of the task types and the number of tasks for each task type. The task types are Metadata Load, Profile Executor, and Profile Result Fetcher. Each task type has a different color in the chart. Place the pointer on the different sections of the pie chart to view the number of tasks for each task type. Click any section of the pie chart to view more task details. Use the filters to the right of the pie chart to filter specific task types.
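The two pie charts are summaries of the same underlying task list: one grouped by status, one grouped by task type. The following sketch reproduces those counts with hypothetical task records, since the guide does not document a programmatic interface to the monitoring data:

```python
from collections import Counter

# Hypothetical task records; in the product these come from the
# Monitoring workspace, not from an exposed data structure.
tasks = [
    {"type": "Metadata Load", "status": "Complete"},
    {"type": "Profile Executor", "status": "Running"},
    {"type": "Profile Executor", "status": "Failed"},
    {"type": "Profile Result Fetcher", "status": "Queued"},
    {"type": "Metadata Load", "status": "Complete"},
]

# Counts behind the task status pie chart (Canceled, Failed, Queued, Running, Complete).
status_counts = Counter(t["status"] for t in tasks)

# Counts behind the task distribution pie chart (one slice per task type).
type_counts = Counter(t["type"] for t in tasks)

print(status_counts["Complete"])          # 2
print(type_counts["Profile Executor"])    # 2
```

Each chart slice corresponds to one key in the respective counter, and the slice size corresponds to that key's count.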


More information

Informatica PowerExchange for Greenplum (Version 10.0) User Guide

Informatica PowerExchange for Greenplum (Version 10.0) User Guide Informatica PowerExchange for Greenplum (Version 10.0) User Guide Informatica PowerExchange for Greenplum User Guide Version 10.0 November 2015 Copyright (c) 1993-2015 Informatica LLC. All rights reserved.

More information

Informatica MDM Multidomain Edition (Version ) Data Steward Guide

Informatica MDM Multidomain Edition (Version ) Data Steward Guide Informatica MDM Multidomain Edition (Version 10.1.0) Data Steward Guide Informatica MDM Multidomain Edition Data Steward Guide Version 10.1.0 November 2015 Copyright (c) 1993-2015 Informatica LLC. All

More information

Informatica (Version HotFix 2) Upgrading from Version 9.1.0

Informatica (Version HotFix 2) Upgrading from Version 9.1.0 Informatica (Version 9.6.1 HotFix 2) Upgrading from Version 9.1.0 Informatica Upgrading from Version 9.1.0 Version 9.6.1 HotFix 2 January 2015 Copyright (c) 1993-2015 Informatica Corporation. All rights

More information

Informatica PowerExchange for Salesforce (Version 10.0) User Guide

Informatica PowerExchange for Salesforce (Version 10.0) User Guide Informatica PowerExchange for Salesforce (Version 10.0) User Guide Informatica PowerExchange for Salesforce User Guide Version 10.0 February 2016 Copyright (c) 1993-2016 Informatica LLC. All rights reserved.

More information

Informatica Enterprise Data Catalog Installation and Configuration Guide

Informatica Enterprise Data Catalog Installation and Configuration Guide Informatica 10.2.2 Enterprise Data Catalog Installation and Configuration Guide Informatica Enterprise Data Catalog Installation and Configuration Guide 10.2.2 February 2019 Copyright Informatica LLC 2015,

More information

Informatica (Version 10.1) Upgrading from Version 9.5.1

Informatica (Version 10.1) Upgrading from Version 9.5.1 Informatica (Version 10.1) Upgrading from Version 9.5.1 Informatica Upgrading from Version 9.5.1 Version 10.1 May 2016 Copyright Informatica LLC 1998, 2016 This software and documentation contain proprietary

More information

Informatica Cloud (Version Winter 2016) REST API Connector Guide

Informatica Cloud (Version Winter 2016) REST API Connector Guide Informatica Cloud (Version Winter 2016) REST API Connector Guide Informatica Cloud REST API Connector Guide Version Winter 2016 March 2016 Copyright (c) 1993-2016 Informatica LLC. All rights reserved.

More information

Informatica PowerExchange for Amazon S3 (Version HotFix 3) User Guide for PowerCenter

Informatica PowerExchange for Amazon S3 (Version HotFix 3) User Guide for PowerCenter Informatica PowerExchange for Amazon S3 (Version 9.6.1 HotFix 3) User Guide for PowerCenter Informatica PowerExchange for Amazon S3 User Guide for PowerCenter Version 9.6.1 HotFix 3 October 2015 Copyright

More information

Informatica Development Platform HotFix 1. Informatica Connector Toolkit Developer Guide

Informatica Development Platform HotFix 1. Informatica Connector Toolkit Developer Guide Informatica Development Platform 10.1.1 HotFix 1 Informatica Connector Toolkit Developer Guide Informatica Development Platform Informatica Connector Toolkit Developer Guide 10.1.1 HotFix 1 June 2017 Copyright

More information

Informatica Cloud Customer 360 (Version Spring 2017 Version 6.45) User Guide

Informatica Cloud Customer 360 (Version Spring 2017 Version 6.45) User Guide Informatica Cloud Customer 360 (Version Spring 2017 Version 6.45) User Guide Informatica Cloud Customer 360 User Guide Version Spring 2017 Version 6.45 May 2017 Copyright Informatica LLC 2015, 2017 This

More information

Informatica Enterprise Data Catalog Upgrading from Versions 10.1 and Later

Informatica Enterprise Data Catalog Upgrading from Versions 10.1 and Later Informatica Enterprise Data Catalog 10.2.2 Upgrading from Versions 10.1 and Later Informatica Enterprise Data Catalog Upgrading from Versions 10.1 and Later 10.2.2 February 2019 Copyright Informatica LLC

More information

Informatica Cloud Customer 360 (Version Winter 2017 Version 6.42) Setup Guide

Informatica Cloud Customer 360 (Version Winter 2017 Version 6.42) Setup Guide Informatica Cloud Customer 360 (Version Winter 2017 Version 6.42) Setup Guide Informatica Cloud Customer 360 Setup Guide Version Winter 2017 Version 6.42 January 2017 Copyright Informatica LLC 1993, 2017

More information

Informatica PowerExchange for Web Services (Version 9.6.1) User Guide for PowerCenter

Informatica PowerExchange for Web Services (Version 9.6.1) User Guide for PowerCenter Informatica PowerExchange for Web Services (Version 9.6.1) User Guide for PowerCenter Informatica PowerExchange for Web Services User Guide for PowerCenter Version 9.6.1 June 2014 Copyright (c) 2004-2014

More information

Informatica B2B Data Transformation (Version 10.0) Agent for WebSphere Message Broker User Guide

Informatica B2B Data Transformation (Version 10.0) Agent for WebSphere Message Broker User Guide Informatica B2B Data Transformation (Version 10.0) Agent for WebSphere Message Broker User Guide Informatica B2B Data Transformation Agent for WebSphere Message Broker User Guide Version 10.0 October 2015

More information

Informatica SQL Data Service Guide

Informatica SQL Data Service Guide Informatica 10.2 SQL Data Service Guide Informatica SQL Data Service Guide 10.2 September 2017 Copyright Informatica LLC 2009, 2018 This software and documentation are provided only under a separate license

More information

Informatica 4.1. Installation and Configuration Guide

Informatica 4.1. Installation and Configuration Guide Informatica Secure@Source 4.1 Installation and Configuration Guide Informatica Secure@Source Installation and Configuration Guide 4.1 December 2017 Copyright Informatica LLC 2015, 2018 This software and

More information

Informatica Managed File Transfer (Version 10.2) File Transfer Portal Guide

Informatica Managed File Transfer (Version 10.2) File Transfer Portal Guide Informatica Managed File Transfer (Version 10.2) File Transfer Portal Guide Informatica Managed File Transfer File Transfer Portal Guide Version 10.2 April 2017 Copyright Informatica LLC 2016, 2017 This

More information

Informatica Dynamic Data Masking (Version 9.6.2) Stored Procedure Accelerator Guide for Sybase

Informatica Dynamic Data Masking (Version 9.6.2) Stored Procedure Accelerator Guide for Sybase Informatica Dynamic Data Masking (Version 9.6.2) Stored Procedure Accelerator Guide for Sybase Informatica Dynamic Data Masking Stored Procedure Accelerator Guide for Sybase Version 9.6.2 March 2015 Copyright

More information

Informatica PowerExchange for SAS (Version 9.6.1) User Guide

Informatica PowerExchange for SAS (Version 9.6.1) User Guide Informatica PowerExchange for SAS (Version 9.6.1) User Guide Informatica PowerExchange for SAS User Guide Version 9.6.1 October 2014 Copyright (c) 2014 Informatica Corporation. All rights reserved. This

More information

Informatica Version HotFix 1. Business Glossary Guide

Informatica Version HotFix 1. Business Glossary Guide Informatica Version 10.1.1 HotFix 1 Business Glossary Guide Informatica Business Glossary Guide Version 10.1.1 HotFix 1 June 2017 Copyright Informatica LLC 2013, 2017 This software and documentation are

More information