INFORMATICA POWERCENTER BASICS KNOWLEDGE

Emil LUNGU 1, Gabriel PREDUŞCĂ 2
1 Valahia University of Targoviste, Faculty of Sciences and Arts, 18-24 Unirii Blvd., 130082 Targoviste, Romania
2 Valahia University of Targoviste, Electrical Engineering Faculty, 18-24 Unirii Blvd., 130082 Targoviste, Romania
e-mail: emil.lungu@yahoo.com

Abstract. This article describes the data modeling techniques used in constructing a multipurpose, stable, and sustainable data warehouse that supports business intelligence (BI). It introduces the data warehouse by describing the objectives of BI and of the data warehouse, and by explaining how these fit into the overall Corporate Information Factory (CIF) architecture. The purpose of this document is to provide an overview of the architecture of Informatica, its features, how it works, and the advantages Informatica offers compared with other data integration tools.

Keywords: data warehouse, Informatica, PowerCenter.

1. INTRODUCTION

Organizations run a number of ERP, CRM, SCM and Web application implementations and are therefore burdened with the maintenance of these heterogeneous environments. To address existing and evolving integration requirements, organizations need a reliable and scalable data integration architecture so that individual projects can build value on one another. Informatica provides a complete range of tools and data services needed to address the most complex data integration projects.

2. DATA WAREHOUSING SCENARIO

A Data Warehouse is a subject-oriented, integrated, non-volatile and time-variant repository of data that is generally used for querying and analyzing past trends to support management decisions for the future. A Data Warehouse can be a relational database, multidimensional database, flat file, hierarchical database, object database, etc.

Stages in a typical Data Warehousing project:

a. Requirement Gathering. The project team gathers the end users' reporting requirements; the remainder of the project is dedicated to satisfying these requirements.

b. Identify the Business Areas. Identify the data that the business will require.

c. Data Modeling. The foundation of the data warehousing system is the data model. The first step in this stage is to build the logical data model based on the user requirements; the next step is to translate the logical data model into a physical data model.

d. ETL Process. ETL is the data warehouse acquisition process of Extracting, Transforming and Loading data from source systems into the data warehouse. This requires an understanding of the business rules and of the logical and physical data models, and involves getting the data from the source and populating it into the target [1].

ETL (Extract, Transform, Load) is a process in data warehousing that involves extracting data from outside sources, transforming it to fit business needs (which can include enforcing quality levels), and ultimately loading it into the end target, i.e. the data warehouse. ELT stands for Extract, Load and Transform and is a technique for moving and transforming data from one location and format to another instance and format; in this style of data integration, the target DBMS becomes the transformation engine. This is in contrast to traditional ETL (Extract, Transform and Load), the traditional technique for moving and transforming data, in which an ETL engine that is separate from both the source and the target DBMS performs the data transformations.
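To make the distinction concrete, the sketch below expresses an ELT-style transformation as SQL executed by the target DBMS. The stg_orders and dw_orders tables, their columns and the currency conversion are hypothetical examples introduced for illustration; they are not taken from this article.

```sql
-- ELT: the data is first loaded untransformed into a staging table inside the
-- target database, and the transformation is then expressed as SQL that the
-- target DBMS itself executes (hypothetical tables stg_orders / dw_orders).
INSERT INTO dw_orders (order_id, amount_usd)
SELECT order_id,
       amount * exchange_rate      -- the transformation runs inside the target DBMS
FROM   stg_orders
WHERE  status = 'COMPLETE';

-- Traditional ETL would instead read the rows out of the source, apply the same
-- amount * exchange_rate logic inside a separate ETL engine, and write the
-- already-transformed rows into dw_orders.
```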
ELT provides optimal efficiency and reduced cost in certain integration scenarios, regardless of the data volumes involved. There are not many dedicated, pure ELT tools available on the market. Transformation features were instead built into the databases themselves, coupling the transformation capability with the SQL engine and leading to a class of databases that offer ELT. Sunopsis was the only purely ELT tool not offered by an RDBMS vendor before it was acquired by Oracle in 2006. ETL tools also come with the option of performing ELT, by fitting the ELT logic into the transformation logic developed with the ETL tool. ETL tools can push part or all of the processing into the database engine where that part of the processing is most optimized: you code in one environment and send part of the code to the database where there is a performance gain. This can be called ETLT. The flexibility to push data transformation processing to the most appropriate processing resource, whether within a source or target database or through the ETL server, falls under the category of ETLT.

Some ETL tools also let the developer choose the most optimized place of execution for parts or the whole of the transformation, allowing some of it to be performed at the source end and some or all of it at the target end. When the source and target reside in the same database, the whole ETL process can be collapsed into a SQL operation pushed to the database engine itself. A few other categories of ETL tools can perform transformations at both the source end and the target end by employing specific integration components that handle heterogeneous source/target environments; these are termed T-ETL tools.

ETL tools are based on a proprietary engine that runs all the transformation processes, and the language of the ETL tool itself is used to code the transformation logic. However, the proprietary engine performing all the transformations becomes a bottleneck in the transformation process: all data coming from the various sources must pass through an engine that processes the transformations row by row, which is a very slow approach when dealing with significant volumes of data. When performance becomes the dominant factor in the ETL process, ETL engines have to start using the potential of the database engine to which the data are, in most cases, targeted. The ELT approach leverages the performance of the target database engine at its best. The in-database transformation that is the essence of the ELT approach can use information already in the data stores to make decisions, and the full scalability and parallelism of the underlying database engine can be utilized [2].

Using an RDBMS to execute data transformations allows bulk processing of the data. Bulk processing is up to 1,000 times faster than row-by-row processing, and the larger the volume of data to manage, the more important bulk processing becomes. In the ELT architecture, all database engines can potentially participate in a transformation, so each part of the process runs where it is most optimized. Any RDBMS can be the engine, and it may make sense to distribute the SQL code among sources and target to achieve the best performance; for example, a join between two large tables may be done on the source. ELT tools can take advantage of the power of today's RDBMS engines to perform data integration tasks, leveraging and orchestrating the work of these systems and processing all the data transformations in bulk, using the set-processing features of the relational model. In essence, the ELT approach can: leverage the potential of database engines; enable bulk processing of transformations; optimize processing to achieve the best performance.

When thinking about data integration techniques, it is important to understand what you are optimizing for. Depending on the objectives, one could optimize for timeliness (average time of information availability), cost of integration, availability of information (uptime of the integration chain), data cleanliness and integrity, process auditability, or other factors. Understanding the primary optimization objective is the first step in determining an overall integration approach. For a new data warehouse, if a decision has to be made on the integration approach, the following factors have to be considered: data volume metrics, licensing costs, cost of operation, implementation cost, maintenance cost, transformation complexity, and compatibility with the range of source systems [3].
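The difference between row-by-row and bulk processing can be sketched as follows. The Oracle-style PL/SQL loop and the hypothetical stg_orders / dw_orders tables are used purely for illustration; they do not appear in the article.

```sql
-- Row-by-row processing, the way an engine-centred ETL flow (or a cursor loop)
-- handles data: one INSERT per source row. Oracle-style PL/SQL, hypothetical tables.
BEGIN
  FOR r IN (SELECT order_id, amount, exchange_rate FROM stg_orders) LOOP
    INSERT INTO dw_orders (order_id, amount_usd)
    VALUES (r.order_id, r.amount * r.exchange_rate);
  END LOOP;
END;

-- Set-based bulk processing, the ELT style: the same transformation expressed
-- as one statement that the RDBMS can optimize and parallelize.
INSERT INTO dw_orders (order_id, amount_usd)
SELECT order_id, amount * exchange_rate
FROM   stg_orders;
```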
In the case of an existing data warehouse, consider the current workload on the source and target DBMS platforms, as well as the relative efficiency of performing a particular operation in the source system, the target system or the integration system. If a database already present in the data warehouse provides the ability to perform transformations as a plug-in, using that plug-in may be the better option. Given the higher cost of databases, if an ETL tool is already in use and it provides a plug-in that pushes the transformations down to the database engines, acquiring that add-on option of the ETL tool would be the better choice.

3. POWERCENTER

Informatica PowerCenter provides an add-on option that emphasizes the ETLT approach to data integration. ETLT promises a mixed-mode approach, mixing pushdown capabilities into the overall ETL process.

3.1. PowerCenter Pushdown Optimization

This option is available from PowerCenter 8.x onwards. It aims at providing flexibility in choosing to execute certain transformations where they are most optimized: the processing can run at the source database, at the target database, or in the ETL engine itself. All database engines can potentially participate in the transformation process, thus leveraging the capabilities of the available database engines. PowerCenter supports pushdown on the following databases: Oracle 9.x and above, IBM DB2, Teradata, Microsoft SQL Server, Sybase ASE, and databases that use ODBC drivers. PowerCenter provides three types of pushdown options to choose from:

Partial to Source: one or more transformations are performed at the source; their processing is pushed to the source database engine. For example, a Source -> Filter -> Aggregator -> Target mapping sequence could become a SQL statement of the form SELECT ... FROM <Source> WHERE <Filter> GROUP BY <Aggregator>.

Partial to Target: one or more transformations are performed at the target; their processing is pushed to the target database engine. For example, a Source -> Expression -> Lookup -> Target mapping sequence could become a SQL statement of the form INSERT INTO <Target> VALUES (? + 10, INSTR(.., ..), ...).

Full: when the source and target data are co-resident in the same relational database, processing can be pushed down into that database and no data is extracted outside it. For example, a Source -> Expression -> Filter -> Aggregator -> Target mapping sequence could become a SQL statement of the form INSERT INTO <Target> SELECT ? + 10, ... FROM <Source> WHERE <Filter> GROUP BY <Aggregator>.

Pushdown optimization is configured in the session properties, where one of these options is chosen; it is available under the Performance Optimization setting on the Properties tab. The PowerCenter Integration Service analyzes the mapping and the session to determine which transformation logic should be processed where. It generates one or more SQL statements translating the transformation logic that can be pushed to the database; these statements are executed by the database engine, either at the source or at the target, depending on the pushdown optimization option chosen. Transformation logic that cannot be pushed down to the database engine is processed by the Integration Service itself [4].

The expressions in the transformations are converted to expressions compatible with the database engine that will execute the transformation logic: the Integration Service translates operators, functions and variables to their database-side equivalents. When no equivalent operator, function or variable exists on the database side, the Integration Service executes that transformation logic itself. For example, the Integration Service translates the aggregate function STDDEV() to STDDEV_SAMP() on Teradata and to STDEV() on Microsoft SQL Server; however, no database supports the aggregate function FIRST(), so the Integration Service processes any transformation that uses FIRST(). The transformations selected by the Integration Service for pushdown can be previewed. The generated SQL statements can also be previewed, but cannot be modified, and they are not stored in the repository.

Pushdown to the source can be applied to the following transformations: Filter, Aggregator, Expression, Joiner, Lookup, Sorter, Union. Pushdown to the target can be applied to the following transformations: Expression, Lookup, Target Definition.

Benefits:
- The need to resort to database-specific programming to exploit the database processing power is eliminated.
- A single design environment is used even though processing is spread between the data integration engine and the database engines.
- By simply selecting pushdown optimization in the PowerCenter GUI, database-specific transformation language is dynamically created and executed as appropriate.
- The metadata-driven architecture supplies impact analysis and data lineage, ensuring comprehensive visibility into the potential effects of changes.
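As an illustration of the three modes described above, the following hedged sketch shows what generated statements of that shape might look like for a hypothetical orders source and order_summary target. The table and column names are assumptions; the actual SQL is produced by the Integration Service and depends on the mapping and the database.

```sql
-- Partial pushdown to source: the Filter and Aggregator run in the source DBMS.
SELECT customer_id,
       SUM(amount) AS total_amount
FROM   orders
WHERE  status = 'SHIPPED'        -- Filter transformation
GROUP BY customer_id;            -- Aggregator transformation

-- Full pushdown: source and target reside in the same database, so the whole
-- mapping collapses into one INSERT ... SELECT and no data leaves the DBMS.
INSERT INTO order_summary (customer_id, total_amount)
SELECT customer_id, SUM(amount)
FROM   orders
WHERE  status = 'SHIPPED'
GROUP BY customer_id;
```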
Reporting: design and develop reports and enable the end users to visualize them, thereby bringing value to the Data Warehouse. The selection of an ETL tool depends on various factors such as the complexity of the data transformations, the data cleansing needs and the volume of data involved. Commonly used ETL tools are Informatica, Ab Initio, Ascential DataStage, Data Junction and Reveleus.

Informatica provides an environment that can extract data from multiple sources, transform the data according to the business logic built in the Informatica Client application, and load the transformed data into files or relational targets. Informatica comes in different packages: the PowerCenter license has all options, including distributed metadata (data about data), while PowerMart is a limited license without distributed metadata. Other products provided by Informatica are Power Analyzer, a web-based tool for data analysis, and Superglue, which provides a graphical representation of data quality and flow, and flexible analysis and reporting of overall data volumes, loading performance, etc.

4. ARCHITECTURE

The diagram below (Figure 1) provides an overview of the various components of Informatica and the connectivity between them.

Figure 1. PowerCenter and SOA Product Architecture

a) Informatica Repository: the Informatica Repository is a database with a set of metadata tables that is accessed by the Informatica Client and Server to save and retrieve metadata. The repository stores the data needed for data extraction, transformation, loading and management.

b) Informatica Client: the Informatica Client is used to manage users, define sources and targets, build mappings and mapplets with the transformation logic, and create sessions to run the mapping logic. The Informatica Client has three main applications:

1. Repository Manager: used to create and administer the metadata repository. Repository users and groups are created through the Repository Manager; assigning privileges and permissions, managing folders in the repository and managing locks on the mappings are also done through the Repository Manager.

2. Designer: the Designer has five tools that are used to analyze sources, design target schemas and build the source-to-target mappings. These are:
- Source Analyzer: used to import or create the source definitions.
- Warehouse Designer: used to import or create target definitions.
- Mapping Designer: used to create mappings that will be run by the Informatica Server to extract, transform and load data.
- Transformation Developer: used to develop reusable transformations that can be used in mappings.
- Mapplet Designer: used to create sets of transformations, referred to as mapplets, which can be reused across mappings.

3. Server Manager: used to create, schedule, execute and monitor sessions.

c) Informatica Server: the Informatica Server reads the mapping and session information from the repository. It extracts data from the mapping sources, stores it in memory, applies the transformation rules and loads the transformed data into the mapping targets.

Connectivity: Informatica uses network protocols, native drivers or ODBC for the connectivity between its various components. The connectivity details are as shown in the diagram above [5].

5. SETTING UP INFORMATICA

1. Install and configure the Server components.
2. Install the Client applications.
3. Configure ODBC.
4. Register the Informatica Server in the Server Manager.
5. Create a repository, create users and groups, and edit user profiles.
6. Add source and target definitions, set up the mappings between the sources and targets, create a session for each mapping and run the sessions.

5.1. Case Study

A Transformation is a repository object that generates, modifies, or passes data. The various transformations provided by the Designer in Informatica are explained below with the aid of a mapping, Map_CD_Country_code. The mapping is present in the cifsit9i repository of the SIT machine under the folder Ecif_Dev_map.

Objective: the mapping Map_CD_Country_code has been developed to extract data from the STG_COUNTRY table and move it into the ECIF_COUNTRY and TRF_COUNTRY target tables.

a) Source Definition:
1. The Source Definition contains a detailed definition of the source.
2. The source can be a relational table, a fixed-width or delimited flat file that does not contain binary data, a COBOL file, etc.
3. A relational source definition is imported from database tables by connecting to the source database from the client machine.
- The source in Map_CD_Country_code is Shortcut_To_STG_COUNTRY, a Source Definition shortcut.
- Right-click on the source and select Edit.
- In the Edit Transformations window, the Transformation tab shows the following information (Figure 2):

Figure 2. Edit transformation window

The circled area in Figure 2 provides the location of the object that the shortcut references. In this example, the object referenced by the shortcut is present in the cifsit9i repository under the Ecif_dev_def folder, and the object name is STG_COUNTRY. All fields from the source are moved into the Source Qualifier.
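For orientation, the following is a hedged sketch of what the STG_COUNTRY source table could look like. The column names are the ones used in this mapping; the data types and lengths are assumptions, since the article does not give the table definition.

```sql
-- Hypothetical definition of the STG_COUNTRY staging table used by
-- Map_CD_Country_code; column names follow the mapping, data types are assumed.
CREATE TABLE STG_COUNTRY (
    ISO_CTRY_COD  CHAR(2),       -- ISO country code
    CTRY_NAM      VARCHAR(100),  -- country name
    EMU_IND       CHAR(1),       -- EMU indicator
    PROC_FLG      CHAR(1)        -- processing flag; the mapping sets this to 'Y'
);
```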

b) Source Qualifier (SQ_Shortcut_To_STG_COUNTRY):
1. The Source Qualifier is an Active transformation.
2. The difference between an Active and a Passive transformation is that an Active transformation can change the number of rows that pass through it, whereas a Passive transformation does not change the number of rows that pass through it. Active transformations include Advanced External Procedure, Aggregator, ERP Source Qualifier, Filter, Router and Update Strategy; Passive transformations include Expression, External Procedure, Input, Lookup, Sequence Generator and XML Source Qualifier.

In SQ_Shortcut_To_STG_COUNTRY, click on the Properties tab, SQL Query. The SQL Query is the query generated by Informatica: a SELECT statement over the source columns used in the mapping. The Informatica Server reads only the columns of the Source Qualifier that are connected to another transformation. In SQ_Shortcut_To_STG_COUNTRY, all four fields (ISO_CTRY_COD, CTRY_NAM, EMU_IND, PROC_FLG) are connected to the EXP_COUNTRY transformation, so the default SQL Query generated by Informatica contains all four columns. If one of the fields had not been mapped to any other transformation, that field would not have appeared in the default SQL Query.

c) Lookup Transformation (LKP_CTRY_COD):
1. The Lookup transformation is a Passive transformation.
2. A Lookup transformation is used in an Informatica mapping to look up data in a relational table, view, or synonym.
3. The Informatica Server queries the lookup table based on the lookup ports in the transformation. It compares the Lookup transformation port values to the lookup table column values based on the lookup condition, and the result of the lookup is then passed on to other transformations and targets.

In the Lookup transformation LKP_CTRY_COD, the input field SRC_COUNTRY_CODE is looked up against the COUNTRY_CODE field of the lookup table; if the lookup is successful, the corresponding COUNTRY_CODE is returned as the output. The Lookup transformation can be configured to handle multiple matches in the following ways:
- Return the first matching value, or return the last matching value. The first and last values are the first and last values found in the lookup cache that match the lookup condition.
- Return an error: the Informatica Server returns the default value for the output ports.

d) Expression Transformation (EXP_COUNTRY):
1. The Expression transformation is a Passive transformation.
2. All fields from the Source Qualifier are moved into the Expression transformation. The COUNTRY_CODE that is the output of the Lookup transformation is also moved into the Expression transformation.
3. O_PROC_FLAG is set to 'Y' in the Expression transformation.
4. All fields from the Expression transformation except the PROC_FLG field are moved into the Filter transformations FIL_NOTNULL_CTRY_COD and FIL_NULL_CTRY_COD.

e) Filter Transformation (FIL_NOTNULL_CTRY_COD):
1. The Filter transformation is an Active transformation.
2. The COUNTRY_CODE field is checked for NOT NULL; if the condition is true, the records are passed on to the Update Strategy UPD_COUNTRY_CODE, the Lookup transformation LKPTRANS and the Update Strategy UPD_UPD_STG_COUNTRY.

f) Update Strategy Transformation (UPD_COUNTRY_CODE):
1. The Update Strategy transformation is an Active transformation.
2. The ISO_CTRY_COD, CTRY_NAM and EMU_IND fields are moved to the Update Strategy transformation from the FIL_NOTNULL_CTRY_COD transformation.
3. Click on the Properties tab.
4. The Update Strategy Expression is DD_UPDATE.
5. The Forward Rejected Rows option is selected.
6. The Update Strategy Expression is used to flag individual records for insert, delete, update or reject.
7. Table 1 lists the constant for each database operation and its numeric value.

Table 1. Constants for each database operation

Operation   Constant    Numeric value
Insert      DD_INSERT   0
Update      DD_UPDATE   1
Delete      DD_DELETE   2
Reject      DD_REJECT   3

8. A session can also be configured to handle specific database operations. This is done by setting the Treat rows as field in the Session Wizard dialog box that appears during session configuration.
9. The Treat rows as option determines the treatment of all rows in the session. The options provided are insert, delete, update and data-driven.
10. If the mapping for the session contains an Update Strategy transformation, this field is set to Data Driven by default. If any other option is selected, the Informatica Server ignores all Update Strategy transformations in the mapping.

The Update Strategy UPD_COUNTRY_CODE updates the target table Shortcut_to_ECIF_COUNTRY, which is a shortcut to the ECIF_COUNTRY table (Figure 3).

Figure 3. Update strategy UPD_COUNTRY_CODE

g) Update Strategy Transformation (UPD_UPD_STG_COUNTRY):
1. This transformation receives the ISO_CTRY_COD and PROC_FLG fields from the filter transformation FIL_NOTNULL_CTRY_COD when COUNTRY_CODE is NOT NULL.
2. It updates the target table Shortcut_To_STG_COUNTRY, which is a shortcut to the STG_COUNTRY table.

The Sequence Generator transformation is an object in Informatica that outputs a unique sequential number to each dataflow it is attached to. The starting value and the increment are set in the Sequence Generator transformation, and NEXTVAL is connected to the dataflow. A Sequence Generator is normally placed after a filter (generally a filter that checks the primary key value of the target for NULL, which indicates that the record is new) and before an Update Strategy that is set to DD_INSERT. If multiple Informatica mappings write to the same target table, the Sequence Generator should be used as a reusable object or a shortcut. If non-Informatica routines write to the same target table, using a trigger or a database method is recommended instead.
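Purely as an illustration, the dataflow described in this case study can be summarized by the SQL sketch below. The lookup table name LKP_COUNTRY and the assumption that ISO_CTRY_COD feeds the lookup input SRC_COUNTRY_CODE are hypothetical; the article does not show the SQL actually generated or executed.

```sql
-- Source Qualifier default query: only the four connected columns are read.
SELECT s.ISO_CTRY_COD, s.CTRY_NAM, s.EMU_IND, s.PROC_FLG
FROM   STG_COUNTRY s;

-- Lookup LKP_CTRY_COD expressed as an outer join: rows where l.COUNTRY_CODE is
-- not NULL follow the FIL_NOTNULL_CTRY_COD branch and are flagged DD_UPDATE by
-- UPD_COUNTRY_CODE; rows where it is NULL follow the FIL_NULL_CTRY_COD branch.
SELECT s.ISO_CTRY_COD,
       s.CTRY_NAM,
       s.EMU_IND,
       l.COUNTRY_CODE
FROM   STG_COUNTRY s
LEFT JOIN LKP_COUNTRY l
       ON l.COUNTRY_CODE = s.ISO_CTRY_COD;
```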

6. CONCLUSIONS

The contribution that ETL tools have made to the data warehousing development process is significant. Considering the emerging needs of data integration and data migration, the features provided by an ETL tool must address these changing requirements. The key is to understand your current and future ETL processing needs and then identify the product features and functions that support those needs. With this knowledge, you can identify one or more products that can adequately meet your ETL and BI needs.

REFERENCES

[1] Imhoff C. et al., Mastering Data Warehouse Design, Indianapolis: Wiley Publishing, Inc., 2003.
[2] Adelman S., Impossible Data Warehouse Situations, Boston, MA: Addison-Wesley Professional, 2002.
[3] Adelman S., Moss L., Data Warehouse Project Management, Boston, MA: Addison-Wesley, 2000.
[4] Maluf D. A. et al., "Lean middleware", SIGMOD 2005, pp. 788-791, 2005.
[5] Halevy A. Y. et al., "Enterprise information integration: successes, challenges and controversies", SIGMOD 2005, pp. 778-787, 2005.