How to Migrate RFC/BAPI Function Mappings to Use a BAPI/RFC Transformation


© 2013 Informatica Corporation. No part of this document may be reproduced or transmitted in any form, by any means (electronic, photocopying, recording or otherwise) without prior consent of Informatica Corporation. All other company and product names may be trade names or trademarks of their respective owners and/or copyrighted materials of such owners.

Abstract

You can replace the mapplet-based RFC/BAPI function mappings that you created in earlier versions with mappings that use a BAPI/RFC transformation. This article explains how to migrate existing mappings to use a BAPI/RFC transformation.

Supported Versions

PowerExchange for SAP NetWeaver for PowerCenter

Table of Contents

Overview
Configuration Differences in RFC/BAPI Function Mapping and BAPI/RFC Transformation
RFC/BAPI No Input Mappings Migration
    Modifying RFC/BAPI No Input Mappings
    Migrating a No Input Mapping and Session - An Example
RFC/BAPI Single Stream Mappings
    Modifying RFC/BAPI Single Stream Mappings
    Migrating a Single Stream Mapping and Session - An Example
RFC/BAPI Multiple Stream Mappings
    Modifying RFC/BAPI Multiple Stream Mapping
    Migrating a Multiple Stream Mapping and Session - An Example
BAPI/RFC Sessions
Configuring Sessions After Migration

Overview

RFC/BAPI function mappings contain mapplets to extract data from SAP NetWeaver. Migrate your mappings to use the BAPI/RFC transformation instead of the function mappings.
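Each wizard procedure in this article offers two ways to reach SAP: host name parameters that you enter directly, or a connection string that resolves against an entry in the saprfc.ini file. As a point of reference only, a minimal saprfc.ini entry for a direct application-server connection might look like the following sketch; the DEST name, host, and system number are hypothetical placeholders, not values from this article:

```ini
DEST=SAPDEV
TYPE=A
ASHOST=sapdev01.example.com
SYSNR=00
RFC_TRACE=0
```

When you choose the connection string option in the wizard, the string you enter corresponds to the DEST name of an entry such as this one.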

Configuration Differences in RFC/BAPI Function Mapping and BAPI/RFC Transformation

Some properties that you configure in an RFC/BAPI function mapping must be configured at the session level for a mapping with a BAPI/RFC transformation. The following table describes the configuration levels for the properties in an RFC/BAPI function mapping and a mapping with a BAPI/RFC transformation:

Property             RFC/BAPI Function Mapping    Mapping with BAPI/RFC Transformation
-------------------  ---------------------------  ------------------------------------
Connection           FunctionCall mapplet         Session
Continue on error    Mapping parameters           Session
Perform Commit       Mapping parameters           Session

RFC/BAPI No Input Mappings Migration

An RFC/BAPI No Input function mapping extracts data from SAP. The mapping contains the following components:

Source definition
A source definition for each RFC/BAPI signature you use in the mapping.

Source Qualifier
An Application Multi-Group Source Qualifier transformation for each Virtual plug-in source definition in the mapping.

FunctionCall mapplet
A FunctionCall_No_Input mapplet for each RFC/BAPI function signature you use in the mapping. In the No Input RFC/BAPI mapping, the FunctionCall_No_Input mapplet makes the RFC/BAPI function call on SAP and interprets the data that the function call returns.

Target definition for function output
A target definition for each scalar or table output parameter you use in the RFC/BAPI function.

To migrate an RFC/BAPI No Input mapping, replace the mapplet with a BAPI/RFC transformation and reconfigure the mapping. You must also edit the session to include an RFC/BAPI connection and update the session properties.

Modifying RFC/BAPI No Input Mappings

Ensure that you have the details of the BAPI used in the FunctionCall mapplet.

1. Open the mapping in the Designer.
2. Note the values for the commit call, interval, return type, and connection properties in the AEP transformation of the mapping.
3. Delete the FunctionCall mapplet.
4. Click Transformation > Create, and select SAP BAPI/RFC transformation.
5. Enter a transformation name, and click Create.
   Step 1 of the wizard appears.

6. Enter connection string parameters to connect to SAP with saprfc.ini. Or, click Host Name and enter host name parameters to connect to SAP.
7. Click Next. Optionally, enter a filter condition to filter BAPIs by name or description.
8. Click Get Objects.
   The wizard displays the BAPIs that you can import.
9. Expand the list of objects and select the same BAPI that the FunctionCall mapplet uses.
   Note: You do not need to use TransactionID values to define the commit points for a logical unit of work (LUW) in a BAPI/RFC transformation.
10. Click Finish.
11. Connect the source qualifier port to the IntegrationID input port in the BAPI/RFC transformation.
12. Connect the required output ports in the BAPI/RFC transformation to the target definition.
13. Save the mapping.

Migrating a No Input Mapping and Session - An Example

m_bapi_noinput is a no input mapping that contains mapplets, and s_bapi_noinput is the corresponding session. You must edit the mapping and session to use a BAPI/RFC transformation.

1. Open the m_bapi_noinput mapping in the Designer.
   The mapping contains the following components:
   - Flat file source
   - Source Qualifier
   - FunctionCall_SS mapplet
   - Oracle target
2. Note the values for the commit call, interval, return type, and connection properties in the AEP transformation of the mapplet.
3. Delete the mapplet and the source definition.
4. Import a flat file source definition that contains all the Integration IDs.
5. Click Transformation > Create, select SAP BAPI/RFC transformation, enter the name of the transformation as BAPI_ACTIVITYTYPE_GETLIST_RFCSSCALLFUNCTION, and click Create.
   Step 1 of the wizard appears.
6. Enter connection string parameters to connect to SAP with saprfc.ini. Or, click Host Name and enter host name parameters to connect to SAP. Click Next.
7. Optionally, enter a filter condition to filter the BAPIs by name or description, and click Get Objects.
   The wizard displays the BAPIs that you can import.
8. Expand the list of objects and select the BAPI.
9. Connect the Integration ID from the source qualifier to the Integration ID in the Scalar Input and the Table Input of the transformation.
10. Open the Target Designer.
11. Drag the transformation into the Target Designer.
    The Designer creates the target definitions for scalar output and table output.
12. Connect the output groups from the transformation to the target definitions.

13. Save the mapping.
    The following figure shows the updated mapping with the BAPI/RFC transformation.
14. Click Connections > Application in the Workflow Manager.
    The Application Connections Browser dialog box appears.
15. Select SAP RFC/BAPI Interface and click New.
    The Connection Object Definition dialog box appears.
16. Enter the connection details and click OK.
17. Open the session, s_bapi_noinput.
    The Edit Tasks dialog box appears.
18. Select the transformation in the Mapping tab.
19. In the Connections pane, select Application as the connection type and select the SAP RFC/BAPI connection that you created.
20. Click Show Session Level Properties in the Properties pane and select Perform Commit.
21. Save the session.

RFC/BAPI Single Stream Mappings

An RFC/BAPI single stream mapping can extract data from SAP, change data in SAP, and write data into an SAP system. Unlike a multiple stream mapping, however, a single stream mapping does not contain RFCPrepare mapplets to prepare source data for the function call. The mapping contains the following components:

Source definition
A source definition and a source qualifier for function input data.

Sorter transformation
The mapping includes a Sorter transformation to sort source data by TransactionID and IntegrationID. It passes the sorted data to the FunctionCall_SS mapplet.

FunctionCall_SS mapplet
The mapping includes a FunctionCall_SS mapplet for each RFC/BAPI function signature you use in the mapping. The mapplet makes the RFC/BAPI function call on SAP. The mapplet also makes commit calls on SAP for transaction control. If the RFC/BAPI function call returns output, the mapplet interprets the data that the function call returns and passes it to the targets.

Target definition for function output
The mapping includes a Virtual plug-in target definition for each scalar or table output parameter you use for the RFC/BAPI functions. The wizard includes the Virtual plug-in target definitions in the mapping for mapping validation. Replace the Virtual plug-in target definitions with the appropriate definitions for the target warehouse.

Modifying RFC/BAPI Single Stream Mappings

Ensure that you have the details of the BAPI used in the FunctionCall_SS mapplet.

1. Open the mapping in the Designer.
2. Note the values for the commit call, interval, return type, and connection properties in the AEP transformation of the mapping.
3. Delete the mapplet and the Sorter transformation.
4. Click Transformation > Create, and select SAP BAPI/RFC transformation.
5. Enter a transformation name, and click Create.
   Step 1 of the wizard appears.
6. Enter connection string parameters to connect to SAP with saprfc.ini. You can also click Host Name and enter host name parameters to connect to SAP.
7. Click Next. Optionally, enter a filter condition to filter BAPIs by name or description.
8. Click Get Objects.
   The wizard displays the BAPIs that you can import.
9. Expand the list of objects and select the BAPIs that you want.
   Note: You do not need to use TransactionID values to define the commit points for a logical unit of work (LUW) in a BAPI/RFC transformation.
10. Click Finish.
    The BAPI/RFC transformation appears in the canvas.
11. Connect the source qualifier output ports to the BAPI/RFC transformation input ports.
    The scalar and table input to the SAP system must be available in the source definition.
12. Connect the required output ports in the BAPI/RFC transformation to the target definition.
13. Save the mapping.

Migrating a Single Stream Mapping and Session - An Example

m_bapi_activitytype is a single stream mapping that contains mapplets, and s_bapi_activitytype is the corresponding session. You must edit the mapping and session to use a BAPI/RFC transformation.

1. Open the m_bapi_activitytype mapping in the Designer.

   The following figure shows the single stream mapping.
   The mapping contains the following components:
   - Flat file source
   - Source Qualifier
   - FunctionCall_SS mapplet
   - Flat file target
2. Note the values for the commit call, interval, return type, and connection properties in the AEP transformation of the mapplet.
3. Delete the mapplet and the target definition.
4. Click Transformation > Create, select SAP BAPI/RFC transformation, enter the name of the transformation as BAPI_ACTIVITYTYPE_GETLIST_RFCSSCALLFUNCTION, and click Create.
   Step 1 of the wizard appears.
5. Enter connection string parameters to connect to SAP with saprfc.ini. Or, click Host Name and enter host name parameters to connect to SAP. Click Next.
6. Optionally, enter a filter condition to filter the BAPIs by name or description, and click Get Objects.
   The wizard displays the BAPIs that you can import.
7. Expand the list of objects and select the BAPI.
8. Click the Return Structure tab to configure the ReturnParameter.

   The following figure shows the return structure configuration.
   Optionally, you can configure the return structure after import with the Customize BAPI wizard.
   Note: In a single stream mapping, the return structure was configured in the FunctionCall mapplet.
9. Connect the Integration ID from the source qualifier to the Integration ID in the Scalar Input and the Table Input of the transformation.
10. Open the Target Designer.
11. Drag the transformation into the Target Designer.
    The Designer creates the target definitions for scalar output, table output, and error output.
12. Connect the output groups from the transformation to the target definitions.
13. Save the mapping.

    The following figure shows the updated mapping with the BAPI/RFC transformation.
14. Click Connections > Application in the Workflow Manager.
    The Application Connections Browser dialog box appears.
15. Select SAP RFC/BAPI Interface and click New.
    The Connection Object Definition dialog box appears.
16. Enter the connection details and click OK.
17. Open the session, s_bapi_activitytype.
    The Edit Tasks dialog box appears.
18. Select the transformation in the Mapping tab.
19. In the Connections pane, select Application as the connection type and select the SAP RFC/BAPI connection that you created.
20. Click Show Session Level Properties in the Properties pane and select Perform Commit.
21. Save the session.

RFC/BAPI Multiple Stream Mappings

An RFC/BAPI multiple stream mapping takes input from user-defined sources and prepares the source data in the format that an RFC/BAPI function can accept. The mapping then uses the prepared data as function input to make RFC/BAPI function calls on SAP. An RFC/BAPI multiple stream mapping can extract, load, and update data in SAP. The mapping contains the following components:

Source definition
A source definition and a source qualifier for function input data.

Prepare mapplet
The mapping includes a Prepare mapplet for each RFC/BAPI function signature you use in the mapping.

RFCPreparedData target definition
The mapping contains a set of flat file target definitions that receive input from the Prepare mapplets in the mapping. Each Prepare mapplet in the mapping passes data to a corresponding RFCPreparedData (flat file) target definition. The RFCPreparedData target definitions represent staging files where the Integration Service writes the prepared data from the Prepare mapplets.

RFCPreparedData source definition
The mapping contains an RFCPreparedData (flat file) source definition. It represents the RFCPreparedData indirect file, which the Integration Service uses to read data from the RFCPreparedData staging files.

Source Qualifier for the RFCPreparedData source
The mapping includes a Source Qualifier transformation for the RFCPreparedData source definition in the mapping.

Transaction_Support mapplet
In a multiple stream RFC/BAPI mapping, the mapping includes a Transaction_Support mapplet to ensure transaction control.

FunctionCall_MS mapplet
The mapping includes a FunctionCall_MS mapplet for each RFC/BAPI function signature you use in the mapping. The multiple stream FunctionCall_MS mapplet makes the RFC/BAPI function call to extract data from SAP. It also interprets the data that the function call returns.

Target definition for function output
The mapping includes a Virtual plug-in target definition for each scalar or table output parameter you use for the RFC/BAPI functions. The wizard includes the Virtual plug-in target definitions in the mapping for mapping validation. Replace the Virtual plug-in target definitions with the appropriate definitions for the target warehouse.

Modifying RFC/BAPI Multiple Stream Mapping

Ensure that you have the details of the BAPIs used in the Transaction_Support mapplet and the FunctionCall_MS mapplets.

1. Open the mapping in the Designer.
2. Note the values for the commit call, interval, return type, and connection properties in the AEP transformation of the mapping.
3. Delete all the mapplets.
   Ensure that you have the details of the different BAPIs used in the mapplets.
4. Click Transformation > Create, and select SAP BAPI/RFC transformation.
5. Enter a transformation name, and click Create.
   Step 1 of the wizard appears.
6. Enter connection string parameters to connect to SAP with saprfc.ini. You can also click Host Name and enter host name parameters to connect to SAP.
7. Click Next. Optionally, enter a filter condition to filter BAPIs by name or description.
8. Click Get Objects.
   The wizard displays the BAPIs that you can import.
9. Expand the list of objects and select the BAPIs that you want.
   Note: You do not need to use TransactionID values to define the commit points for a logical unit of work (LUW) in a BAPI/RFC transformation.

10. Click Finish.
    The BAPI/RFC transformation appears in the canvas.
11. Connect the source qualifier output ports to the BAPI/RFC transformation input ports.
    The scalar and table input to the SAP system must be available in the source definition.
12. Connect the required output ports in the BAPI/RFC transformation to the target definition.
13. Save the mapping.

Migrating a Multiple Stream Mapping and Session - An Example

m_bapi_activitytype is a multiple stream mapping that contains mapplets, and s_bapi_activitytype is the corresponding session. You must edit the mapping and session to use a BAPI/RFC transformation.

1. Open the m_bapi_activitytype mapping in the Designer.
   The following figure shows the multiple stream mapping.
   The mapping contains the following components:
   - Flat file source
   - Source Qualifier
   - Prepare mapplet
   - PreparedData target
   - PreparedData source
   - Transaction_Support mapplet
   - FunctionCall mapplet
   - Flat file target
2. Note the values for the commit call, interval, return type, and connection properties in the AEP transformation of the FunctionCall mapplet.
3. Delete all the components except the flat file source and the source qualifier.
4. Click Transformation > Create, select SAP BAPI/RFC transformation, enter the name of the transformation as BAPI_EMPATTABS_GETDETAILS, and click Create.

   Step 1 of the wizard appears.
5. Enter connection string parameters to connect to SAP with saprfc.ini. Or, click Host Name and enter host name parameters to connect to SAP. Click Next.
6. Optionally, enter a filter condition to filter the BAPIs by name or description, and click Get Objects.
   The wizard displays the BAPIs that you can import.
7. Expand the list of objects and select the BAPI.
8. Click the Return Structure tab to configure the ReturnParameter.
   The following figure shows the return structure configuration.
   Optionally, you can configure the return structure after import with the Customize BAPI wizard.
   Note: In the RFC/BAPI mapping, the return structure was configured in the FunctionCall mapplet.
9. Connect the Integration ID from the source qualifier to the Integration ID in the Scalar Input and the Table Input of the transformation.
10. Open the Target Designer.
11. Drag the transformation into the Target Designer.
    The Designer creates the target definitions for scalar output, table output, and error output.
12. Connect the output groups from the transformation to the target definitions.
13. Save the mapping.

    The following figure shows the updated mapping with the BAPI/RFC transformation.
14. Click Connections > Application in the Workflow Manager.
    The Application Connections Browser dialog box appears.
15. Select SAP RFC/BAPI Interface and click New.
    The Connection Object Definition dialog box appears.
16. Enter the connection details and click OK.
17. Open the session, s_bapi_activitytype.
    The Edit Tasks dialog box appears.
18. Select the transformation in the Mapping tab.
19. In the Connections pane, select Application as the connection type and select the SAP RFC/BAPI connection that you created.
20. Click Show Session Level Properties in the Properties pane and select Perform Commit.
21. Save the session.

BAPI/RFC Sessions

You must modify the session after you migrate the mapping to use a BAPI/RFC transformation. Configure the following session properties when you configure a BAPI/RFC session:

- Commit behavior
- Caching
- Error handling
- Partitioning. Each partition makes separate BAPI/RFC calls to SAP.
- Verbose logging. The session log contains information about the BAPI/RFC call return code, status, and detailed messages.

You must create SAP RFC BAPI Interface application connections for the BAPIs that you want to connect to. When you configure a session, select an SAP RFC BAPI Interface application connection for the BAPI/RFC transformation.
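The error-handling property above interacts with the return code of each BAPI/RFC call: depending on whether the session continues on error, a failed call either logs a message and moves on or stops the run. As a rough illustration of that behavior only, the following Python sketch applies the standard BAPIRET2 convention (TYPE 'E' or 'A' means failure) to a list of per-call results; the row layout and helper names here are hypothetical and not part of the product.

```python
# Hypothetical sketch of "Continue on error"-style handling of BAPI
# return rows. Only the TYPE/MESSAGE fields mirror the BAPIRET2
# convention; the rest is illustrative.

def is_error(return_row):
    # BAPIRET2 convention: TYPE 'E' (error) or 'A' (abort) is a failure.
    return return_row.get("TYPE") in ("E", "A")

def process_calls(call_results, continue_on_error):
    # Collect per-call status; stop at the first error unless
    # continue_on_error is set, as the session property would allow.
    processed, errors = [], []
    for integration_id, return_row in call_results:
        if is_error(return_row):
            errors.append((integration_id, return_row["MESSAGE"]))
            if not continue_on_error:
                break
        else:
            processed.append(integration_id)
    return processed, errors

results = [
    (1, {"TYPE": "S", "MESSAGE": "Activity type read"}),
    (2, {"TYPE": "E", "MESSAGE": "Activity type does not exist"}),
    (3, {"TYPE": "S", "MESSAGE": "Activity type read"}),
]

# With the property off, the second call stops the run.
print(process_calls(results, continue_on_error=False))
# With it on, the error is logged and the third call still runs.
print(process_calls(results, continue_on_error=True))
```

In a real session the Integration Service makes this decision per BAPI/RFC call and writes the return code, status, and messages to the session log when verbose logging is enabled.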

Configuring Sessions After Migration

You must edit the session properties and update the application connection after you modify the BAPI/RFC mapping.

1. Click Tools > Task Developer in the Workflow Manager.
2. Open the session that you want to modify.
3. Click Task > Edit.
   The Edit Tasks dialog box appears.
4. Click Mapping.
5. Select the BAPI/RFC transformation.
6. Choose an SAP RFC BAPI Interface application connection.
7. Click OK.
8. Click Repository > Save.

Author

Narayan Sivaramakrishnan
Senior Technical Writer

Acknowledgements

Thanks to Poornima Srikantesh, Lead Engineer QA, for her help in completing this article.


More information

Using MDM Big Data Relationship Management to Perform the Match Process for MDM Multidomain Edition

Using MDM Big Data Relationship Management to Perform the Match Process for MDM Multidomain Edition Using MDM Big Data Relationship Management to Perform the Match Process for MDM Multidomain Edition Copyright Informatica LLC 1993, 2017. Informatica LLC. No part of this document may be reproduced or

More information

Importing Metadata From an XML Source in Test Data Management

Importing Metadata From an XML Source in Test Data Management Importing Metadata From an XML Source in Test Data Management Copyright Informatica LLC 2017. Informatica, the Informatica logo, and PowerCenter are trademarks or registered trademarks of Informatica LLC

More information

Jyotheswar Kuricheti

Jyotheswar Kuricheti Jyotheswar Kuricheti 1 Agenda: 1. Performance Tuning Overview 2. Identify Bottlenecks 3. Optimizing at different levels : Target Source Mapping Session System 2 3 Performance Tuning Overview: 4 What is

More information

Changing the Password of the Proactive Monitoring Database User

Changing the Password of the Proactive Monitoring Database User Changing the Password of the Proactive Monitoring Database User 2014 Informatica Corporation. No part of this document may be reproduced or transmitted in any form, by any means (electronic, photocopying,

More information

Security Enhancements in Informatica 9.6.x

Security Enhancements in Informatica 9.6.x Security Enhancements in Informatica 9.6.x 1993-2016 Informatica Corporation. No part of this document may be reproduced or transmitted in any form, by any means (electronic, photocopying, recording or

More information

Dynamic Data Masking: Capturing the SET QUOTED_IDENTIFER Value in a Microsoft SQL Server or Sybase Database

Dynamic Data Masking: Capturing the SET QUOTED_IDENTIFER Value in a Microsoft SQL Server or Sybase Database Dynamic Data Masking: Capturing the SET QUOTED_IDENTIFER Value in a Microsoft SQL Server or Sybase Database 1993, 2016 Informatica LLC. No part of this document may be reproduced or transmitted in any

More information

How to Convert an SQL Query to a Mapping

How to Convert an SQL Query to a Mapping How to Convert an SQL Query to a Mapping 2014 Informatica Corporation. No part of this document may be reproduced or transmitted in any form, by any means (electronic, photocopying, recording or otherwise)

More information

Improving PowerCenter Performance with IBM DB2 Range Partitioned Tables

Improving PowerCenter Performance with IBM DB2 Range Partitioned Tables Improving PowerCenter Performance with IBM DB2 Range Partitioned Tables 2011 Informatica Corporation. No part of this document may be reproduced or transmitted in any form, by any means (electronic, photocopying,

More information

Report Studio: Using Java Script to Select and Submit Values to a SAP Prompt.

Report Studio: Using Java Script to Select and Submit Values to a SAP Prompt. Tip or Technique Report Studio: Using Java Script to Select and Submit Values to a SAP Prompt. Product(s): IBM Cognos 8 Area of Interest: Reporting Prompt. 2 Copyright Copyright 2008 Cognos ULC (formerly

More information

Visual Composer for NetWeaver CE: Getting Started with a Typical Workflow

Visual Composer for NetWeaver CE: Getting Started with a Typical Workflow Visual Composer for NetWeaver CE: Getting Started with a Typical Workflow Applies to: Visual Composer for SAP NetWeaver Composition Environment 7.1 Summary This article aims to help you get started modeling

More information

Toolkit Activity Installation and Registration

Toolkit Activity Installation and Registration Toolkit Activity Installation and Registration Installing the Toolkit activity on the Workflow Server Install the Qfiche Toolkit workflow activity by running the appropriate SETUP.EXE and stepping through

More information

Configuring a Web Services Transformation in Informatica Cloud to Read Data from SAP BW BEx Query

Configuring a Web Services Transformation in Informatica Cloud to Read Data from SAP BW BEx Query Configuring a Web Services Transformation in Informatica Cloud to Read Data from SAP BW BEx Query Copyright Informatica LLC 2017. Informatica, the Informatica logo, and Informatica Cloud are trademarks

More information

Configuring a JDBC Resource for Sybase IQ in Metadata Manager

Configuring a JDBC Resource for Sybase IQ in Metadata Manager Configuring a JDBC Resource for Sybase IQ in Metadata Manager 2012 Informatica Corporation. No part of this document may be reproduced or transmitted in any form, by any means (electronic, photocopying,

More information

Importing Metadata From a Netezza Connection in Test Data Management

Importing Metadata From a Netezza Connection in Test Data Management Importing Metadata From a Netezza Connection in Test Data Management Copyright Informatica LLC 2003, 2017. Informatica, the Informatica logo, and PowerCenter are trademarks or registered trademarks of

More information

Implementing Data Masking and Data Subset with Sequential or VSAM Sources

Implementing Data Masking and Data Subset with Sequential or VSAM Sources Implementing Data Masking and Data Subset with Sequential or VSAM Sources 2013 Informatica Corporation. No part of this document may be reproduced or transmitted in any form, by any means (electronic,

More information

Moving DB2 for z/os Bulk Data with Nonrelational Source Definitions

Moving DB2 for z/os Bulk Data with Nonrelational Source Definitions Moving DB2 for z/os Bulk Data with Nonrelational Source Definitions 2011 Informatica Corporation. No part of this document may be reproduced or transmitted in any form, by any means (electronic, photocopying,

More information

PowerExchange IMS Data Map Creation

PowerExchange IMS Data Map Creation PowerExchange IMS Data Map Creation 2014 Informatica Corporation. No part of this document may be reproduced or transmitted in any form, by any means (electronic, photocopying, recording or otherwise)

More information

Real-time Session Performance

Real-time Session Performance Real-time Session Performance 2008 Informatica Corporation Overview This article provides information about real-time session performance and throughput. It also provides recommendations on how you can

More information

Using Two-Factor Authentication to Connect to a Kerberos-enabled Informatica Domain

Using Two-Factor Authentication to Connect to a Kerberos-enabled Informatica Domain Using Two-Factor Authentication to Connect to a Kerberos-enabled Informatica Domain Copyright Informatica LLC 2016, 2018. Informatica LLC. No part of this document may be reproduced or transmitted in any

More information

Container-based Authentication for MDM- ActiveVOS in WebSphere

Container-based Authentication for MDM- ActiveVOS in WebSphere Container-based Authentication for MDM- ActiveVOS in WebSphere 2014 Informatica Corporation. No part of this document may be reproduced or transmitted in any form, by any means (electronic, photocopying,

More information

Data Validation Option Best Practices

Data Validation Option Best Practices Data Validation Option Best Practices 1993-2016 Informatica LLC. No part of this document may be reproduced or transmitted in any form, by any means (electronic, photocopying, recording or otherwise) without

More information

Hyperion Data Integration Management Adapter for Performance Scorecard. Readme. Release

Hyperion Data Integration Management Adapter for Performance Scorecard. Readme. Release Hyperion Data Integration Management Adapter for Performance Scorecard Release 11.1.1.1 Readme [Skip Navigation Links] Purpose... 3 About Data Integration Management Release 11.1.1.1... 3 Data Integration

More information

Configuring a JDBC Resource for MySQL in Metadata Manager

Configuring a JDBC Resource for MySQL in Metadata Manager Configuring a JDBC Resource for MySQL in Metadata Manager 2011 Informatica Corporation. No part of this document may be reproduced or transmitted in any form, by any means (electronic, photocopying, recording

More information

Informatica Cloud Spring Data Integration Hub Connector Guide

Informatica Cloud Spring Data Integration Hub Connector Guide Informatica Cloud Spring 2017 Data Integration Hub Connector Guide Informatica Cloud Data Integration Hub Connector Guide Spring 2017 December 2017 Copyright Informatica LLC 1993, 2017 This software and

More information

Website: Contact: / Classroom Corporate Online Informatica Syllabus

Website:  Contact: / Classroom Corporate Online Informatica Syllabus Designer Guide: Using the Designer o Configuring Designer Options o Using Toolbars o Navigating the Workspace o Designer Tasks o Viewing Mapplet and Mapplet Reports Working with Sources o Working with

More information

Using Data Replication with Merge Apply and Audit Apply in a Single Configuration

Using Data Replication with Merge Apply and Audit Apply in a Single Configuration Using Data Replication with Merge Apply and Audit Apply in a Single Configuration 2016 Informatica LLC. No part of this document may be reproduced or transmitted in any form, by any means (electronic,

More information

Using a Web Services Transformation to Get Employee Details from Workday

Using a Web Services Transformation to Get Employee Details from Workday Using a Web Services Transformation to Get Employee Details from Workday Copyright Informatica LLC 2016, 2017. Informatica, the Informatica logo, and Informatica Cloud are trademarks or registered trademarks

More information

This document contains information on fixed and known limitations for Test Data Management.

This document contains information on fixed and known limitations for Test Data Management. Informatica LLC Test Data Management Version 10.1.0 Release Notes December 2016 Copyright Informatica LLC 2003, 2016 Contents Installation and Upgrade... 1 Emergency Bug Fixes in 10.1.0... 1 10.1.0 Fixed

More information

Configuring a Hadoop Environment for Test Data Management

Configuring a Hadoop Environment for Test Data Management Configuring a Hadoop Environment for Test Data Management Copyright Informatica LLC 2016, 2017. Informatica LLC. No part of this document may be reproduced or transmitted in any form, by any means (electronic,

More information

What's New In Informatica Data Quality 9.0.1

What's New In Informatica Data Quality 9.0.1 What's New In Informatica Data Quality 9.0.1 2010 Abstract When you upgrade Informatica Data Quality to version 9.0.1, you will find multiple new features and enhancements. The new features include a new

More information

Informatica Cloud Data Integration Winter 2017 December. What's New

Informatica Cloud Data Integration Winter 2017 December. What's New Informatica Cloud Data Integration Winter 2017 December What's New Informatica Cloud Data Integration What's New Winter 2017 December January 2018 Copyright Informatica LLC 2016, 2018 This software and

More information

Informatica PowerExchange for Microsoft Azure Blob Storage 10.2 HotFix 1. User Guide

Informatica PowerExchange for Microsoft Azure Blob Storage 10.2 HotFix 1. User Guide Informatica PowerExchange for Microsoft Azure Blob Storage 10.2 HotFix 1 User Guide Informatica PowerExchange for Microsoft Azure Blob Storage User Guide 10.2 HotFix 1 July 2018 Copyright Informatica LLC

More information

Hyperion Data Integration Management Adapter for Essbase. Sample Readme. Release

Hyperion Data Integration Management Adapter for Essbase. Sample Readme. Release Hyperion Data Integration Management Adapter for Essbase Release 11.1.1.1 Sample Readme [Skip Navigation Links] Purpose... 2 About Data Integration Management Release 11.1.1.1... 2 Data Integration Management

More information

Running PowerCenter Advanced Edition in Split Domain Mode

Running PowerCenter Advanced Edition in Split Domain Mode Running PowerCenter Advanced Edition in Split Domain Mode 1993-2016 Informatica Corporation. No part of this document may be reproduced or transmitted in any form, by any means (electronic, photocopying,

More information

PowerCenter 7 Architecture and Performance Tuning

PowerCenter 7 Architecture and Performance Tuning PowerCenter 7 Architecture and Performance Tuning Erwin Dral Sales Consultant 1 Agenda PowerCenter Architecture Performance tuning step-by-step Eliminating Common bottlenecks 2 PowerCenter Architecture:

More information

User Guide. Informatica PowerCenter Connect for MSMQ. (Version 8.1.1)

User Guide. Informatica PowerCenter Connect for MSMQ. (Version 8.1.1) User Guide Informatica PowerCenter Connect for MSMQ (Version 8.1.1) Informatica PowerCenter Connect for MSMQ User Guide Version 8.1.1 September 2006 Copyright (c) 2004-2006 Informatica Corporation. All

More information

How Do I Inspect Error Logs in Warehouse Builder?

How Do I Inspect Error Logs in Warehouse Builder? 10 How Do I Inspect Error Logs in Warehouse Builder? Scenario While working with Warehouse Builder, the designers need to access log files and check on different types of errors. This case study outlines

More information

Optimizing Testing Performance With Data Validation Option

Optimizing Testing Performance With Data Validation Option Optimizing Testing Performance With Data Validation Option 1993-2016 Informatica LLC. No part of this document may be reproduced or transmitted in any form, by any means (electronic, photocopying, recording

More information

Informatica Power Center 10.1 Developer Training

Informatica Power Center 10.1 Developer Training Informatica Power Center 10.1 Developer Training Course Overview An introduction to Informatica Power Center 10.x which is comprised of a server and client workbench tools that Developers use to create,

More information

Using the Normalizer Transformation to Parse Records

Using the Normalizer Transformation to Parse Records Using the Normalizer Transformation to Parse Records 2014 Informatica Corporation. No part of this document may be reproduced or transmitted in any form, by any means (electronic, photocopying, recording

More information

PowerCenter Repository Maintenance

PowerCenter Repository Maintenance PowerCenter Repository Maintenance 2012 Informatica Corporation. No part of this document may be reproduced or transmitted in any form, by any means (electronic, photocopying, recording or otherwise) without

More information

Informatica Power Center 9.0.1

Informatica Power Center 9.0.1 Informatica Power Center 9.0.1 Informatica Audit Tables Description: BISP is committed to provide BEST learning material to the beginners and advance learners. In the same series, we have prepared a complete

More information

How to Optimize Jobs on the Data Integration Service for Performance and Stability

How to Optimize Jobs on the Data Integration Service for Performance and Stability How to Optimize Jobs on the Data Integration Service for Performance and Stability 1993-2016 Informatica LLC. No part of this document may be reproduced or transmitted in any form, by any means (electronic,

More information

Using SAP NetWeaver Business Intelligence in the universe design tool SAP BusinessObjects Business Intelligence platform 4.1

Using SAP NetWeaver Business Intelligence in the universe design tool SAP BusinessObjects Business Intelligence platform 4.1 Using SAP NetWeaver Business Intelligence in the universe design tool SAP BusinessObjects Business Intelligence platform 4.1 Copyright 2013 SAP AG or an SAP affiliate company. All rights reserved. No part

More information

Data Quality : Profile Analysis On Join Condition

Data Quality : Profile Analysis On Join Condition Name of Solution: Data Quality : Profile Analysis On Join Condition Business Requirement: The purpose of this solution is to explain what is Join profile analysis and how it can be used. Solution URL:

More information

Performing a Post-Upgrade Data Validation Check

Performing a Post-Upgrade Data Validation Check Performing a Post-Upgrade Data Validation Check 2013 Informatica Corporation. No part of this document may be reproduced or transmitted in any form, by any means (electronic, photocopying, recording or

More information

Dealing with Event Viewer

Dealing with Event Viewer Dealing with Event Viewer Event Viewer is a troubleshooting tool in Microsoft Windows 2000.This how-to article will describe how to use Event Viewer. Event Viewer displays detailed information about system

More information

Create Rank Transformation in Informatica with example

Create Rank Transformation in Informatica with example Create Rank Transformation in Informatica with example Rank Transformation in Informatica. Creating Rank Transformation in Inforamtica. Creating target definition using Target designer. Creating a Mapping

More information

Table of Contents. Eccella 1

Table of Contents. Eccella 1 ECCELLA 22-Apr-14 Table of Contents Introduction... 2 About the tool... 2 Features... 2 Scope... 3 Components... 4 Input... 4 Outputs... 5 Points to Note... 5 Operation... 6 Installation... 6 Update Licensing

More information

Performance Optimization for Informatica Data Services ( Hotfix 3)

Performance Optimization for Informatica Data Services ( Hotfix 3) Performance Optimization for Informatica Data Services (9.5.0-9.6.1 Hotfix 3) 1993-2015 Informatica Corporation. No part of this document may be reproduced or transmitted in any form, by any means (electronic,

More information

Informatica Cloud Spring Microsoft Azure Blob Storage V2 Connector Guide

Informatica Cloud Spring Microsoft Azure Blob Storage V2 Connector Guide Informatica Cloud Spring 2017 Microsoft Azure Blob Storage V2 Connector Guide Informatica Cloud Microsoft Azure Blob Storage V2 Connector Guide Spring 2017 October 2017 Copyright Informatica LLC 2017 This

More information

SelfTestEngine.PR000041_70questions

SelfTestEngine.PR000041_70questions SelfTestEngine.PR000041_70questions Number: PR000041 Passing Score: 800 Time Limit: 120 min File Version: 20.02 http://www.gratisexam.com/ This is the best VCE I ever made. Try guys and if any suggestion

More information

Data Integration Service Optimization and Stability

Data Integration Service Optimization and Stability Data Integration Service Optimization and Stability 2013 Informatica Corporation. No part of this document may be reproduced or transmitted in any form, by any means (electronic, photocopying, recording

More information

SILWOOD TECHNOLOGY LTD. Safyr Metadata Discovery Software. Safyr Getting Started Guide

SILWOOD TECHNOLOGY LTD. Safyr Metadata Discovery Software. Safyr Getting Started Guide SILWOOD TECHNOLOGY LTD Safyr Metadata Discovery Software Safyr Getting Started Guide S I L W O O D T E C H N O L O G Y L I M I T E D Safyr Getting Started Guide Safyr 7.1 This product is subject to the

More information

CA ERwin Data Modeler

CA ERwin Data Modeler CA ERwin Data Modeler Implementation Guide Service Pack 9.5.2 This Documentation, which includes embedded help systems and electronically distributed materials, (hereinafter referred to only and is subject

More information

Performing Lineage Analysis on Custom Metadata in Metadata Manager 8.5

Performing Lineage Analysis on Custom Metadata in Metadata Manager 8.5 Performing Lineage Analysis on Custom Metadata in Metadata Manager 8.5 2008 Informatica Corporation Overview In Metadata Manager 8.5, you can create object-level relationships between custom resource objects

More information

Oracle s Hyperion Data Integration Management Adapter for Financial Management Release Readme

Oracle s Hyperion Data Integration Management Adapter for Financial Management Release Readme Oracle s Hyperion Data Integration Management Adapter for Financial Management Release 9.3.1.1 File This file contains the following sections: Purpose... 1 System Requirements... 2 Hardware... 2 Software...

More information

Informatica BCI Extractor Solution

Informatica BCI Extractor Solution Informatica BCI Extractor Solution Objective: The current BCI implementation delivered by Informatica uses a LMAPI SDK plugin to serially execute idoc requests to SAP and then execute a process mapping

More information

StreamServe Persuasion SP5 StreamServe Connect for SAP - Business Processes

StreamServe Persuasion SP5 StreamServe Connect for SAP - Business Processes StreamServe Persuasion SP5 StreamServe Connect for SAP - Business Processes User Guide Rev A StreamServe Persuasion SP5StreamServe Connect for SAP - Business Processes User Guide Rev A SAP, mysap.com,

More information