
Informatica LLC
Big Data Edition Version 9.6.1 HotFix 3 Update 3 Release Notes
January 2016
Copyright (c) 1993-2016 Informatica LLC. All rights reserved.
BD-RLN-961HF3-0UD3

Contents

Pre-Installation Tasks
   Prepare the Data Integration Service
   Prepare the Analyst Service
Installation
   Apply the Update to the Hadoop Cluster
   Apply the Update to the Informatica Domain
   Apply the Update to the Informatica Clients
Post-Installation Tasks
   Configure Big Data Edition
   Create and Configure the Analyst Service
New Features and Enhancements
Changes
Fixed Limitations
Known Limitations
Third-Party Limitations
Informatica Global Customer Support

This document contains important information about installation and limitations for Big Data Edition.

Pre-Installation Tasks

Before you apply the update, shut down the Informatica domain and perform the pre-installation tasks.

Prepare the Data Integration Service

Before you apply the update, prepare the Data Integration Service. Delete the following directory on the machine where the Data Integration Service runs:

<Informatica installation directory>/services/DataIntegrationService/DataIntegrationService
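As a sketch, the cleanup above can be scripted. The INFA_HOME value and the mkdir line below only simulate an installation tree for illustration; point INFA_HOME at your real installation directory in practice.

```shell
# Simulate an Informatica installation tree in a scratch location
# (illustration only; use your real installation directory in practice).
INFA_HOME="$(mktemp -d)/Informatica/9.6.1HF3"
mkdir -p "$INFA_HOME/services/DataIntegrationService/DataIntegrationService"

# Pre-installation task: delete the Data Integration Service directory
# so that the update can recreate it.
rm -rf "$INFA_HOME/services/DataIntegrationService/DataIntegrationService"
```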

Prepare the Analyst Service

Before you apply the update, prepare the Analyst Service if the Hadoop cluster uses Kerberos authentication.

Note: Skip this task if the Informatica domain does not have an Analyst Service.

To use the Analyst Service with a Hadoop cluster that uses Kerberos authentication, perform the following steps before you apply the update:

1. Shut down the Analyst Service.
2. Delete the following directories on the machine where the Data Integration Service runs:
   <Informatica installation directory>\tomcat\temp\<AnalystServiceName>
   <Informatica installation directory>\services\AnalystService\analysttool

Installation

Big Data Edition 9.6.1 HotFix 3 Update 3 is an update to version 9.6.1 HotFix 3, including PowerCenter. Use EBF16415 to apply Update 3.

Before you install the update, you must have Big Data Edition 9.6.1 HotFix 3 installed. For more information about how to install and configure Big Data Edition, see the Informatica Big Data Edition Installation and Configuration Guide.

After Big Data Edition 9.6.1 HotFix 3 is installed, download EBF16415 and apply the update to the Hadoop cluster, the Informatica domain, and the Developer tool and PowerCenter clients. The EBF is available from https://tsftp.informatica.com in the following location: /updates/Informatica9/9.6.1 HotFix3/EBF16415.

The EBF contains the following files:

EBF16415.html
   Contains the steps to apply Update 3 to the Informatica domain and clients. You must also apply the update to the Hadoop cluster.

EBF16415.Linux64-X86.tar.gz
   Contains the files required to apply Update 3 to the Hadoop cluster and the Informatica domain. Extract the following files: EBF16415_HadoopRPM_EBFInstaller.tar.Z and EBF16415_Server_installer_linux_em64t.tar. Use the EBF16415_HadoopRPM_EBFInstaller.tar.Z file to apply Update 3 to the Hadoop cluster. Use the EBF16415_Server_installer_linux_em64t.tar file to apply Update 3 to the Informatica domain.
EBF16415.Win32Client.zip
   Contains the files required to apply Update 3 to the Informatica clients.

INFORMATICA-9.6.1-3.informatica9.6.1.3.p3.2323.parcel.tar
   Contains the parcels required to apply Update 3 to the Hadoop cluster. Download this file if you want to use Cloudera Manager to apply Update 3.
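The procedures that follow drive the installers through plain text files. As a sketch, the inputs can be prepared like this; the node names and the /opt/informatica path are examples, and the installer invocations are commented out because they require the actual EBF files.

```shell
# Stand-in for the temporary directory where the installer archives are
# extracted (illustration only).
EBF_DIR="$(mktemp -d)"

# Cluster-side inputs: one Hadoop node per line in the HadoopDataNodes file,
# and the target directory in input.properties.
printf '%s\n' node01.example.com node02.example.com > "$EBF_DIR/HadoopDataNodes"
echo 'destdir="/opt/informatica"' > "$EBF_DIR/input.properties"
# ./InformaticaHadoopEBFInstall.sh    # run from this directory on the cluster

# Domain-side inputs: installation directory and rollback flag in
# Input.properties.
printf 'DEST_DIR=/opt/informatica\nROLLBACK=0\n' > "$EBF_DIR/Input.properties"
# ./installebf.sh                     # run on every node used for Hive pushdown
```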

Apply the Update to the Hadoop Cluster

You must apply the update to every node in the Hadoop cluster.

If the cluster runs Cloudera CDH, you can distribute the Big Data Edition update as parcels across the Hadoop cluster nodes with Cloudera Manager. To create parcel repositories, upload the Informatica parcels and manifest.json to a web server hosted on your local network. For more information about how to use Cloudera Manager to distribute Big Data Edition, see Informatica Knowledge Base article 303347.

To apply the update to the Hadoop cluster without Cloudera Manager, perform the following steps:

1. Copy EBF16415_HadoopRPM_EBFInstaller.tar.Z to a temporary location on the cluster machine.
2. Extract the installer file. Run the following command:
   tar -xvf EBF16415_HadoopRPM_EBFInstaller.tar.Z
3. Provide the node list in the HadoopDataNodes file.
4. Configure the destdir parameter in the input.properties file:
   destdir=<Informatica home directory>
   For example, set the destdir parameter to the following value:
   destdir="/opt/informatica"
5. Run InformaticaHadoopEBFInstall.sh.

Apply the Update to the Informatica Domain

Apply the update to every node in the domain that is used to connect to HDFS or the HiveServer.

To apply the update to a node in the domain, perform the following steps:

1. Copy EBF16415_Server_Installer_linux_em64t.tar to a temporary location on the node.
2. Extract the installer file. Run the following command:
   tar -xvf EBF16415_Server_Installer_linux_em64t.tar
3. Configure the following properties in the Input.properties file:
   DEST_DIR=<Informatica installation directory>
   ROLLBACK=0
4. Run installebf.sh.
5. Repeat steps 1 through 4 for every node in the domain that is used for Hive pushdown.

Apply the Update to the Informatica Clients

To apply the update to the Informatica clients, perform the following steps:

1. Copy EBF16415_Client_Installer_win32_x86.zip to the Windows client machine.
2. Extract the installer.

3. Configure the following properties in the Input.properties file:
   DEST_DIR=<Informatica installation directory>
   ROLLBACK=0
   Use two backslashes when you set the DEST_DIR property. For example, include the following lines in the Input.properties file:
   DEST_DIR=C:\\Informatica\\9.6.1HF3
   ROLLBACK=0
4. Run installebf.bat.

Post-Installation Tasks

Complete the post-installation tasks after you apply the update.

Configure Big Data Edition

If you need to configure Big Data Edition for a new Hadoop distribution after you apply the update, complete this post-installation task.

Use the Big Data Edition Configuration Utility to automate part of the Big Data Edition configuration process. After you run the utility, complete the configuration process for Big Data Edition. For more information, see the Informatica Big Data Edition Installation and Configuration Guide. Alternatively, you can manually configure Big Data Edition. For more information about the required manual configuration steps, see the Informatica Big Data Edition Installation and Configuration Guide.

To automate part of the configuration process for the Hadoop cluster properties on the machine where the Data Integration Service runs, perform the following steps:

1. On the machine where the Data Integration Service runs, open the command line.
2. Go to the following directory: <Informatica installation directory>/tools/bdeutil.
3. Run BDEConfig.sh.
4. Press Enter.
5. Choose the Hadoop distribution version that you want to use to configure Big Data Edition:
   Option 1: Cloudera CDH
   Option 2: Hortonworks HDP
   Option 3: MapR
   Option 4: Pivotal HD
   Option 5: IBM BigInsights
6. Choose how to access files on the Hadoop cluster:

If you choose Cloudera CDH, the following options appear:

   Option 1: Cloudera Manager. Enter this option to use the Cloudera Manager API to access files on the Hadoop cluster.
   Option 2: Secure Shell (SSH). Enter this option to use SSH to access files on the Hadoop cluster. This option requires SSH connections to the machines that host the NameNode, JobTracker, and Hive client. If you select this option, Informatica recommends that you use an SSH connection without a password or have sshpass or Expect installed.
   Option 3: Shared directory. Enter this option to use a shared directory to access files on the Hadoop cluster. You must have read permission for the shared directory.

   Note: Informatica recommends the Cloudera Manager or SSH option.

If you choose a distribution other than Cloudera CDH, the following options appear:

   Option 1: Secure Shell (SSH). Enter this option to use SSH to access files on the Hadoop cluster. This option requires SSH connections to the machines that host the NameNode, JobTracker, and Hive client. If you select this option, Informatica recommends that you use an SSH connection without a password or have sshpass or Expect installed.
   Option 2: Shared directory. Enter this option to use a shared directory to access files on the Hadoop cluster. You must have read permission for the shared directory.

   Note: Informatica recommends the SSH option.

7. Choose how you want Big Data Edition to run mappings:
   Option 1: No. Big Data Edition uses the default Hive Command Line Interface to run mappings.
   Option 2: Yes. Big Data Edition uses HiveServer2 to run mappings.
8. If you select "Yes" in step 7, specify the Informatica installation directory on the Hadoop cluster. By default, Informatica is installed in the following directory: /opt/informatica.
9. Based on the option that you selected in step 6, see the corresponding topic to continue with the configuration process:
   Use Cloudera Manager
   Use SSH
   Use a Shared Directory

Use Cloudera Manager

If you choose Cloudera Manager, perform the following steps to configure Big Data Edition:

1. Enter the Cloudera Manager host.

2. Enter the Cloudera user ID.
3. Enter the password for the user ID.
4. Enter the port for Cloudera Manager.
   The Big Data Edition Configuration Utility retrieves the required information from the Hadoop cluster.
5. Edit the yarn-site.xml file on the machine that runs the Data Integration Service. You can find the file in the following directory: <Informatica installation directory>/services/shared/hadoop/cloudera_cdh<version>/conf
6. Add the following paths to the yarn.application.classpath property:
   $HADOOP_CONF_DIR
   $HADOOP_MAPRED_HOME/*
   $HADOOP_MAPRED_HOME/lib/*
7. Complete the manual configuration steps. For more information about the manual configuration steps for Cloudera CDH, see the Informatica Big Data Edition Installation and Configuration Guide.

The utility creates the following files in the <Informatica installation directory>/tools/bdeutil directory:

ClusterConfig.properties
   Contains details about the properties fetched from the Hadoop cluster and connection creation commands, including HiveServer2 connections if HiveServer2 is selected.

HiveServer2_EnvInfa.txt
   Contains the list of environment variables and values that need to be copied to the HiveServer2 environment on the Hadoop cluster. This file is created only if you choose HiveServer2.

Use SSH

If you choose SSH, you must provide host names and Hadoop configuration file locations.

Note: Informatica recommends that you use an SSH connection without a password or have sshpass or Expect installed. If you do not use one of these methods, you must enter the password each time the utility downloads a file from the Hadoop cluster.

Verify the following host names: NameNode, JobTracker, and Hive client. Additionally, verify the locations for the following files on the Hadoop cluster:
   hdfs-site.xml
   core-site.xml
   mapred-site.xml
   yarn-site.xml
   hive-site.xml

Perform the following steps to configure Big Data Edition:

1. Enter the NameNode host name.

2. Enter the SSH user ID.
3. Enter the password for the SSH user ID. If you use an SSH connection without a password, leave this field blank and press Enter.
4. Enter the location for the hdfs-site.xml file on the Hadoop cluster.
5. Enter the location for the core-site.xml file on the Hadoop cluster.
   The Big Data Edition Configuration Utility connects to the NameNode and downloads the following files: hdfs-site.xml and core-site.xml.
6. Enter the JobTracker host name.
7. Enter the SSH user ID.
8. Enter the password for the SSH user ID. If you use an SSH connection without a password, leave this field blank and press Enter.
9. Enter the directory for the mapred-site.xml file on the Hadoop cluster.
10. Enter the directory for the yarn-site.xml file on the Hadoop cluster.
   The utility connects to the JobTracker and downloads the following files: mapred-site.xml and yarn-site.xml.
11. Enter the Hive client host name.
12. Enter the SSH user ID.
13. Enter the password for the SSH user ID. If you use an SSH connection without a password, leave this field blank and press Enter.
14. Enter the directory for the hive-site.xml file on the Hadoop cluster.
   The utility connects to the Hive client and downloads the following file: hive-site.xml.
15. Complete the manual configuration steps. For more information about the manual configuration steps required for the Hadoop distribution, see the Informatica Big Data Edition Installation and Configuration Guide.

The utility creates the following files in the <Informatica installation directory>/tools/bdeutil directory:

ClusterConfig.properties
   Contains details about the properties fetched from the Hadoop cluster and connection creation commands, including HiveServer2 connections if HiveServer2 is selected.

HiveServer2_EnvInfa.txt
   Contains the list of environment variables and values that need to be copied to the HiveServer2 environment on the Hadoop cluster. This file is created only if you choose HiveServer2.
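Done by hand, the download step the utility performs over SSH would resemble the following. The host name, user, and file locations are examples, and the scp lines are commented out because they need a live cluster.

```shell
# Local directory to receive the cluster configuration files.
CONF_DIR="$(mktemp -d)"

# With passwordless SSH configured (as recommended above), each file the
# utility asks about could be fetched like this:
# scp hadoop@namenode01.example.com:/etc/hadoop/conf/hdfs-site.xml "$CONF_DIR/"
# scp hadoop@namenode01.example.com:/etc/hadoop/conf/core-site.xml "$CONF_DIR/"

# When a password is unavoidable, sshpass supplies it non-interactively:
# sshpass -p "$SSH_PASSWORD" scp hadoop@hive01.example.com:/etc/hive/conf/hive-site.xml "$CONF_DIR/"
```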
Use a Shared Directory

If you choose shared directory, perform the following steps to configure Big Data Edition:

1. Enter the location of the shared directory.
   Note: You must have read permission for the directory, and the directory should contain the following files:
   core-site.xml
   hdfs-site.xml
   hive-site.xml
   mapred-site.xml
   yarn-site.xml
2. Complete the manual configuration steps. For more information about the manual configuration steps required for the Hadoop distribution, see the Informatica Big Data Edition Installation and Configuration Guide.

The utility creates the following files in the <Informatica installation directory>/tools/bdeutil directory:

ClusterConfig.properties
   Contains details about the properties fetched from the Hadoop cluster and connection creation commands, including HiveServer2 connections if HiveServer2 is selected.

HiveServer2_EnvInfa.txt
   Contains the list of environment variables and values that need to be copied to the HiveServer2 environment on the Hadoop cluster. This file is created only if you choose HiveServer2.

Troubleshooting the Configuration Utility

Consider the following troubleshooting tips when you perform the post-installation tasks:

In the ClusterConfig.properties file, the host name is incorrect in the command templates if I use the shared directory option for the Big Data Edition Configuration Utility.
   If the utility cannot determine the host name for the connection based on the files in the shared directory, the utility uses "localhost". Manually replace "localhost" with the host name for the connection.

In the ClusterConfig.properties file, which user do I provide for the UserName parameter in the Hive remote connection command template?
   Provide the user name of the user that the Data Integration Service impersonates to run mappings on a Hadoop cluster.

In the ClusterConfig.properties file, which user do I provide for the USERNAME parameter in the HDFS connection command template?
   Provide the user name that is used to access HDFS.
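The first troubleshooting tip can be scripted. In this sketch, the property line is a made-up stand-in for the utility's output format, and node01.example.com is a placeholder host:

```shell
# Stand-in ClusterConfig.properties containing the "localhost" placeholder
# that the utility writes when it cannot determine the host name.
CONF="$(mktemp)"
echo 'hiveConnectionHost=localhost' > "$CONF"   # hypothetical property name

# Replace the placeholder with the real connection host.
sed 's/localhost/node01.example.com/g' "$CONF" > "$CONF.tmp" && mv "$CONF.tmp" "$CONF"
```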

Create and Configure the Analyst Service

To use the Analyst Service with a Hadoop cluster that uses Kerberos authentication, create the Analyst Service and configure it to use the Kerberos ticket for the Data Integration Service.

Perform the following steps:

1. Verify that the Data Integration Service is configured for Kerberos. For more information, see the Informatica Big Data Edition User Guide.
2. Create an Analyst Service. For more information about how to create the Analyst Service, see the Informatica Application Services Guide.
3. Log in to the Administrator tool.
4. In the Domain Navigator, select the Analyst Service.
5. On the Processes tab, edit the Advanced Properties.
6. Add the following value to the JVM Command Line Options field:
   -DINFA_HADOOP_DIST_DIR=<Informatica installation directory>/services/shared/hadoop/<hadoop_distribution>

New Features and Enhancements

This section describes new features and enhancements in Big Data Edition 9.6.1 HotFix 3 Update 3.

Hadoop Ecosystem

Effective in version 9.6.1 HotFix 3 Update 3, Big Data Edition added support for Hadoop clusters that run the following Hadoop distributions:
   Cloudera CDH 5.5
   IBM BigInsights 4.1

For more information, see the Informatica 9.6.1 HotFix 3 Update 3 Installation and Configuration Guide.

HiveServer2

Effective in version 9.6.1 HotFix 3 Update 3, Big Data Edition supports HiveServer2 for running mappings in the Hadoop environment. Select HiveServer2 when you configure Big Data Edition with the Big Data Edition Configuration Utility, or edit the hadoopEnv.properties file to use HiveServer2.

You can use HiveServer2 with the following distributions:
   Cloudera CDH 5.4
   Hortonworks HDP 2.3 and 2.2
   MapR (YARN) 4.0.2

For more information, see the Informatica 9.6.1 HotFix 3 Update 3 Installation and Configuration Guide.

Changes

This section describes changes in Big Data Edition 9.6.1 HotFix 3 Update 3.

Hadoop Distributions

Effective in version 9.6.1 HotFix 3 Update 3, Big Data Edition dropped support for Hadoop clusters that run the following Hadoop distributions:
   Cloudera CDH 5.2
   IBM BigInsights 3.0

Fixed Limitations

The following limitations are fixed in this release, listed by CR number:

CR 444925: Profiling on a Hive table fails if the Hive table contains a large amount of data.
CR 445395: A PowerCenter session hangs if an HDFS source on a MapR cluster uses N-way partitioning.
CR 444417/432299: Cloudera Navigator resource loads might not complete when the Hadoop cluster contains more than 2 million entities.
CR 444264: The Data Integration Service shuts down unexpectedly with the following error message when you run mappings in the Hive environment: Too many open files
CR 444123: A mapping fails to run in the Hive environment if an Aggregator expression uses the date column.
CR 439715: A mapping fails to run in the Hive environment when a large number of mappings run concurrently. The following message appears: org.apache.hadoop.hive.ql.exec.DDLTask. AlreadyExistsException(message:Table w423814594_xxxxxxxxxx_m_xxxx already exists)
CR 439167: The DLM mastership operation timeout period is not configurable.
CR 439125: A mapping that contains a Teradata target with TDCH fails when you use the infacmd ms RunMapping command. The following error message appears: [ICMD_10033] Command [RunMapping] failed with error [[LDTM_1055] The Integration Service failed to generate a Hive workflow for mapping [Mapping_TPT_1].]
CR 436598: Data preview on a Data Processor transformation fails if the Hadoop cluster uses Kerberos authentication.

CR 433719: A mapping fails to run in the Hive environment when the following conditions are true:
   - The mapping contains a Union transformation.
   - The user designated in the Hive connection is not the Data Integration Service user.
CR 428094: A mapping fails to run in the Hive environment on a highly available Hortonworks HDP cluster.
CR 421608: Rolling back an EBF takes a long time.
CR 418059/436913: The UUID_UNPARSE(UUID4()) function returns a null value.
CR 417272: A mapping fails to run in the Hive environment when the following conditions are true:
   - The Hadoop cluster runs Hortonworks HDP 2.2.
   - The mapping has a flat file target.
   - The user designated in the Hive connection is not the Data Integration Service user.

Known Limitations

The following limitations are known in this release, listed by CR number:

CR 446666: A mapping fails to run in the native environment if you select both native and Hive for the validation environment. Workaround: Select only native as the validation environment to run a mapping in the native environment.
CR 446321: A mapping fails to run in the Hive environment with HiveServer2 if the Cloudera CDH cluster uses Kerberos authentication. Workaround: Specify a user name in the Hive connection.
CR 446216: A mapping fails to run in the Hive environment on a Pivotal HD cluster.
CR 445244: After you register the PowerCenter repository plug-in for IBM BigInsights 4.1, the Hadoop distribution for the HDFS connection object is blank. Workaround: Manually select IBM BigInsights 4.1.
CR 445167: Validation of an application fails if it contains a mapping with a warning.
CR 444858: A mapping fails to run in the Hive environment if it contains an expression that contains only literals.
CR 444759: HBase mappings that use protobuf format in the Hive environment fail with the following error: TM_6085 A fatal error occurred at transformation [HBase_ProtoBuf_Write_Integer], and the session is terminating.
CR 444468: A mapping fails to run in the Hive environment with HiveServer2 if one or more of the following values for the Hive connection are missing:
   - Default FS URI
   - Database name
   - Hive warehouse directory

CR 444007: Job logs are not created in the following directory when the Hadoop cluster uses HiveServer2: tomcat/bin/distemp.
CR 441115: The getQueryLog() command does not work for HiveServer2 if the Hadoop cluster uses a Hive version earlier than 0.14. You can ignore the warning message if the Hive version is earlier than 0.14.

Third-Party Limitations

The following third-party limitation applies to this release:

CR 428194: The Update Strategy transformation fails to insert data into a bucketed target table. This is a third-party limitation for Hive versions earlier than 1.3. For more information, see the following Hive issue: https://issues.apache.org/jira/browse/HIVE-10151

Informatica Global Customer Support

You can contact a Customer Support Center by telephone or through Online Support. Online Support requires a user name and password. You can request a user name and password at http://mysupport.informatica.com.

The telephone numbers for Informatica Global Customer Support are available from the Informatica web site at http://www.informatica.com/us/services-and-training/support-services/global-support-centers/.