How to Configure Big Data Management 10.1 for MapR 5.1 Security Features
2014, 2016 Informatica LLC. No part of this document may be reproduced or transmitted in any form, by any means (electronic, photocopying, recording or otherwise) without prior consent of Informatica Corporation. All other company and product names may be trade names or trademarks of their respective owners and/or copyrighted materials of such owners.
Abstract

Informatica Big Data Management 10.1 adds support for MapR 5.1 Kerberos and MapR Ticket authentication. This document describes how to use these MapR security features with Big Data Management and MapR 5.1.

Supported Versions

- Informatica Big Data Management 10.1

Table of Contents

- Overview
- Install the RPM Archive and the EBF on MapR Cluster Nodes
  - Installing on an Existing Cluster
- Configure the Informatica Domain to Communicate with a Kerberos-Enabled MapR 5.1 Cluster
  - Generate a MapR Ticket
  - Enable the TLS Protocol and Kerberos
  - Copy the Truststore File to the Data Integration Service Machine
  - Add Additional Properties for the Native Runtime Engine
  - Add Additional Properties for the Blaze Runtime Engine
  - Configure Hive Connection Properties
  - Edit hive-site.xml to Enable a Mapping to Run with the Hive Run-Time Engine
- Configure the Informatica Domain to Communicate with a Cluster that Uses MapR Ticket Authentication
- Configure Hive and HDFS Metadata Fetch for MapR Ticket or Kerberos
- Running Mappings Using the Teradata Connector for Hadoop on a Hive or Blaze Engine

Overview

The Informatica Big Data Management EBF adds support for MapR 5.1 security features. You can implement Big Data Management in a MapR environment that uses Kerberos or MapR Ticket for authentication.

Before you follow the steps in this article, perform the following preliminary steps:
1. Install and configure an Informatica domain.
2. Configure Big Data Management to run mappings in a MapR 5.1 cluster. For more information, see the Big Data Management 10.1 Installation and Configuration Guide.

After you complete the preliminary steps, perform the following steps to enable Big Data Management to use the security features of MapR 5.1:
1. Download and install the EBF.
2. Select one of the following authentication methods:
   - Configure the Informatica domain to communicate with the Kerberos-enabled MapR 5.1 cluster.
   - Configure the Informatica domain to communicate with a cluster that uses MapR Ticket authentication.
3. Optionally, configure Hive and HDFS metadata fetch for MapR Ticket.
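Step 1 above comes down to editing the installer's input.properties file and running installebf.sh on each node, as the next section describes. The following is a minimal sketch of preparing the input file; the directory /opt/informatica/bdm is a placeholder, so substitute your own DEST_DIR value.

```shell
# Sketch: prepare the EBF installer input on a cluster node.
# The DEST_DIR value is a placeholder -- use your own destination directory.
DEST_DIR=/opt/informatica/bdm
printf 'DEST_DIR=%s\n' "$DEST_DIR" > input.properties

# Then run the installer on each node, accept the license terms, and
# select option 1 for a local cluster node install:
# sh installebf.sh
```

The same input.properties edit is repeated later for the server and client updates, with DEST_DIR pointing at the Data Integration Service machine and the client installation directory, respectively.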
Install the RPM Archive and the EBF on MapR Cluster Nodes

Install the RPM archive and the EBF on each node of the cluster.

Installing on an Existing Cluster

If you are installing the EBF on an existing cluster, download and uncompress the EBF archive file, and then perform the following steps:
1. Install the RPM package.
2. Install server and Big Data Management Utility updates.
3. Update clients.
4. Copy the .jar files that you need to run mappings on the MapR cluster.

Install the RPM Package

Install the RPM package on each cluster node to implement support for MapR 5.1.
1. Edit the input.properties file with the following information:
   DEST_DIR - the destination directory for Informatica Big Data Management.
   Note: Install the binaries on each node of the cluster.
2. Type installebf.sh to run the installer.
3. Accept the license terms.
4. Select 1 to install the package on a local cluster node.

Install Server Updates

Install server updates on the machine where the Data Integration Service runs.
1. Edit the input.properties file with the following information:
   DEST_DIR - the destination directory on the machine where the Data Integration Service runs.
2. Type installebf.sh to run the installer.
The installer updates the servers and the Big Data Management Configuration Utility.

Update Clients

Update clients on the Informatica host machine.
1. Edit the input.properties file with the following information:
   DEST_DIR - the destination directory on the machine where the clients are installed.
2. Type installebf.sh to run the installer.
The installer updates the Developer tool.

Configure the Informatica Domain to Communicate with a Kerberos-Enabled MapR 5.1 Cluster

To use Kerberos authentication, perform the following tasks:
1. Generate a MapR ticket.
2. Enable the TLS protocol and Kerberos.
3. Copy the truststore file to the Data Integration Service machine.
4. Add additional properties for the run-time engine that you want to use.
5. Configure Hive connection properties.
6. Edit hive-site.xml to add properties that enable mappings to run with the Hive run-time engine.

Generate a MapR Ticket

To enable mappings to run on a Kerberos-enabled MapR cluster, generate a MapR ticket for the Data Integration Service user.
1. Run the kinit utility on the CLDB node of the cluster to create a Kerberos ticket for the Data Integration Service user. For information about how to generate MapR tickets, refer to the MapR documentation.
2. Run the maprlogin utility. Type: maprlogin kerberos
   The utility generates a MapR ticket in the /tmp directory using the following naming convention: maprticket_<userid>, where <userid> corresponds to the Data Integration Service user.
3. Copy the ticket file from the cluster node to the /tmp directory on the machine that runs the Data Integration Service.

Enable the TLS Protocol and Kerberos

Enable the TLS security protocol in the Administrator tool.
1. In the Administrator tool, browse to the Data Integration Service Process Properties tab.
2. In the Advanced Properties area, add the following line to the JVM Command Line Options:
   -Dhadoop.login=<MAPR_ECOSYSTEM_LOGIN_OPTS> -Dhttps.protocols=TLSv1.2
   where <MAPR_ECOSYSTEM_LOGIN_OPTS> is the value of the MAPR_ECOSYSTEM_LOGIN_OPTS property in the file /opt/mapr/conf/env.sh.
3. Restart the Data Integration Service for the change to take effect.

Copy the Truststore File to the Data Integration Service Machine

1. Copy the truststore file from the following location on the cluster: /opt/mapr/conf/ssl_truststore
2. Paste the truststore file to the following location on the Data Integration Service host: <Informatica installation directory>/services/shared/hadoop/mapr_5.1.0/conf/ssl_truststore

Add Additional Properties for the Native Runtime Engine

To use the native Data Integration Service to run mappings, define custom properties in the Administrator tool.
1. In the Administrator tool, browse to the Data Integration Service Process tab.
2. In the Custom Properties area, define the following properties and values:

   Property                              Value
   ExecutionContextOptions.JVMOption2    -Dhadoop.login=<MAPR_ECOSYSTEM_LOGIN_OPTS> -Dhttps.protocols=TLSv1.2
   ExecutionContextOptions.JVMOption7    -Dhttps.protocols=TLSv1.2

   where <MAPR_ECOSYSTEM_LOGIN_OPTS> is the value of the MAPR_ECOSYSTEM_LOGIN_OPTS property in the file /opt/mapr/conf/env.sh.

Add Additional Properties for the Blaze Runtime Engine

To use the Blaze engine to run mappings, define custom properties in the Administrator tool.
1. In the Administrator tool, browse to the Data Integration Service Process tab.
2. In the environment variables area, define the Kerberos authentication protocol:

   Property     Value
   JAVA_OPTS    -Dhadoop.login=<MAPR_ECOSYSTEM_LOGIN_OPTS> -Dhttps.protocols=TLSv1.2

   where <MAPR_ECOSYSTEM_LOGIN_OPTS> is the value of the MAPR_ECOSYSTEM_LOGIN_OPTS property in the file /opt/mapr/conf/env.sh.

Configure Hive Connection Properties

To use Hive to run mappings, configure the Data Access Connection String with the service principal name.
1. In the Administrator tool, browse to the Connections tab and then to the HiveServer2 Connection Properties area.
2. Configure the following connection properties:

   Property                         Value
   Metadata Connection String       jdbc:hive2://<domain_host>:<port_number>/default;principal=<service_principal_name>
   Data Access Connection String    jdbc:hive2://<domain_host>:<port_number>/default;principal=<service_principal_name>

   Note: You can retrieve the service principal name from the MapR Control System browser.
3. In the environment variables area, configure the following property to define the Kerberos authentication protocol:

   Property     Value
   JAVA_OPTS    -Dhadoop.login=<MAPR_ECOSYSTEM_LOGIN_OPTS> -Dhttps.protocols=TLSv1.2

   where <MAPR_ECOSYSTEM_LOGIN_OPTS> is the value of the MAPR_ECOSYSTEM_LOGIN_OPTS property in the file /opt/mapr/conf/env.sh.

Edit hive-site.xml to Enable a Mapping to Run with the Hive Run-Time Engine

To run mappings with the Hive run-time engine, open the file <Informatica installation directory>/services/shared/hadoop/mapr_<version>/conf/hive-site.xml for editing and make the following changes:
1. Locate the hive.server2.authentication property and change the value as follows:

   <property>
     <name>hive.server2.authentication</name>
     <value>kerberos</value>
   </property>

2. Add the following property:

   <property>
     <name>hive.metastore.sasl.enabled</name>
     <value>true</value>
   </property>

3. Open the file /opt/mapr/conf/hive-site.xml on the cluster and copy the following properties:
   hive-metastore.principal
   hive-metastore.keytab
   Paste both properties into <Informatica installation directory>/services/shared/hadoop/mapr_<version>/conf/hive-site.xml.

Configure the Informatica Domain to Communicate with a Cluster that Uses MapR Ticket Authentication

You can use the following run-time engines to run mappings on MapR clusters that use the MapR Ticket method of authentication:
- Native (Data Integration Service)
- Blaze
- Hive

To use the MapR Ticket method of authentication, perform the following tasks:
1. Retrieve the value of the hive.server2.authentication property from the cluster.
2. Open the following file for editing: <Informatica Installation Directory>/services/shared/hadoop/mapr_5.1.0/conf/hive-site.xml
3. Change the value of the hive.server2.authentication property from NONE to the hive.server2.authentication property value that you obtained from the cluster.
4. Add the following property to the hive-site.xml file:

   <property>
     <name>hive.metastore.sasl.enabled</name>
     <value>true</value>
   </property>

   Then save and close the hive-site.xml file.
5. Generate a MapR ticket file for the Data Integration Service user on the cluster. To understand how to generate MapR tickets, refer to the MapR documentation.
6. Copy the MapR ticket file to the /tmp directory on the machine that runs the Data Integration Service.

Configure Hive and HDFS Metadata Fetch for MapR Ticket or Kerberos

Note: Before performing this task, complete the steps in the topic "Configure the Developer Tool" in the Informatica 10.1 Big Data Management Installation and Configuration Guide.

To configure users for MapR Ticket or Kerberos-enabled MapR clusters, create Linux accounts and configure user permissions.
1. Create a Linux user on the node where the HiveServer2 service runs. Use the same user name as the Windows user account that runs the Developer tool client. This user is referred to as the client user.
2. If the cluster is Kerberos-enabled, you can perform the following steps to generate a MapR ticket on the Windows machine. Alternatively, follow steps 3 and 4.
   a. Install the MapR client on the Windows machine.
   b. Generate a Kerberos ticket on the Windows machine.
   c. Use maprlogin to generate a MapR ticket in the %TEMP% directory.
   Then skip to step 5.
3. On the same node, log in as the client user and generate a MapR ticket. Refer to the MapR documentation for more information.
   If the cluster is not Kerberos-enabled, follow these steps:
   a. Type the following command: maprlogin password
   b. When prompted, provide the password for the client user.
   If the cluster is Kerberos-enabled, follow these steps:
   a. Generate a Kerberos ticket using kinit.
   b. Type the following command to generate a MapR ticket: maprlogin kerberos
   The cluster generates a MapR ticket associated with the client user. By default, tickets on Linux systems are generated in the /tmp directory and have a name like maprticket_<username>.
4. Copy the MapR ticket file and paste it to the %TEMP% directory on the Windows machine.
5. Rename the file as follows: maprticket_<username>, where <username> is the user name of the client user.
6. In the MapR Control System browser, get the value of the hive.server2.authentication property.
7. Open the file <Informatica_client_installation>\clients\DeveloperClient\hadoop\mapr_<version_number>\conf\hive-site.xml for editing.
8. Change the value of the hive.server2.authentication property from NONE to the value that you obtained in step 6.
   Note: If Kerberos is enabled on the cluster, comment out the hive.server2.authentication property in hive-site.xml.
9. Add the following lines to the hive-site.xml file:

   <property>
     <name>hive.metastore.sasl.enabled</name>
     <value>true</value>
   </property>

10. Save and close the hive-site.xml file.

To test the Hive connection or perform a metadata fetch task, use the following format for the connection string if the cluster is Kerberos-enabled:

   jdbc:hive2://<hostname>:10000/default;principal=<spn>

For example:

   jdbc:hive2://myserver2:10000/default;principal=mapr/myserver2@clustername

If custom authentication is enabled, specify the user name and password on the Database Connection tab of the Hive connection.

Running Mappings Using the Teradata Connector for Hadoop on a Hive or Blaze Engine

When you use the Teradata Connector for Hadoop to run Teradata mappings on a Hive or Blaze engine, the Data Integration Service user must have the MapR ticket available on all nodes of the Hadoop cluster.

Author: Big Data Management Team
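The Kerberos connection-string format above can be composed mechanically from the HiveServer2 host, port, and service principal name. A small sketch follows; the host, port, and principal values are illustrative placeholders taken from the example, so substitute the values from your own cluster.

```shell
# Build a HiveServer2 JDBC URL for a Kerberos-enabled cluster from its parts.
# Host, port, and service principal below are illustrative placeholders.
HS2_HOST=myserver2
HS2_PORT=10000
SPN=mapr/myserver2@clustername

URL="jdbc:hive2://${HS2_HOST}:${HS2_PORT}/default;principal=${SPN}"
echo "$URL"
# -> jdbc:hive2://myserver2:10000/default;principal=mapr/myserver2@clustername
```

The resulting string can be pasted into both the Metadata Connection String and Data Access Connection String properties of the Hive connection, or used to test connectivity from a JDBC client.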
More informationImport Data Connection from an SAP Universe
Import Data Connection from an SAP Universe SAP Analytics Cloud allows you to connect to SAP Universe and import your data. NOTE: It is recommended that the SAP Cloud Platform Cloud Connector (SAP CP CC)
More informationHow to connect to Cloudera Hadoop Data Sources
How to connect to Cloudera Hadoop Data Sources InfoCaptor works with both ODBC and JDBC protocol. Depending on the availability of suitable drivers for the appropriate platform you can leverage either
More informationTuning Intelligent Data Lake Performance
Tuning Intelligent Data Lake 10.1.1 Performance Copyright Informatica LLC 2017. Informatica, the Informatica logo, Intelligent Data Lake, Big Data Mangement, and Live Data Map are trademarks or registered
More informationTalend Open Studio for Big Data. Release Notes 5.4.1
Talend Open Studio for Big Data Release Notes 5.4.1 Talend Open Studio for Big Data Publication date December 12, 2013 Copyleft This documentation is provided under the terms of the Creative Commons Public
More informationSAS 9.4 Hadoop Configuration Guide for Base SAS and SAS/ACCESS, Fourth Edition
SAS 9.4 Hadoop Configuration Guide for Base SAS and SAS/ACCESS, Fourth Edition SAS Documentation August 31, 2017 The correct bibliographic citation for this manual is as follows: SAS Institute Inc. 2016.
More informationAccessVia Publishing Platform
AccessVia Publishing Platform Installation and Setup Guide Publishing Platform Manager Version: 8.6.x Written by: Product Documentation, R&D Date: February 2014 2014 Perceptive Software. All rights reserved
More informationHortonworks Hive ODBC Driver
Hortonworks Hive ODBC Driver User Guide Revised: August 17, 2018 2012-2018 Hortonworks Inc. All Rights Reserved. Parts of this Program and Documentation include proprietary software and content that is
More informationPrint and Copy Vending
Print and Copy Vending Administrative Guide Print and Copy Vending is an application of Enhanced Locked Print. Read this manual carefully before you use this product and keep it handy for future reference.
More informationTIBCO Spotfire Connectors Release Notes
TIBCO Spotfire Connectors Release Notes Software Release 7.6 May 2016 Two-Second Advantage 2 Important Information SOME TIBCO SOFTWARE EMBEDS OR BUNDLES OTHER TIBCO SOFTWARE. USE OF SUCH EMBEDDED OR BUNDLED
More informationInformatica Cloud Spring Microsoft Azure Blob Storage V2 Connector Guide
Informatica Cloud Spring 2017 Microsoft Azure Blob Storage V2 Connector Guide Informatica Cloud Microsoft Azure Blob Storage V2 Connector Guide Spring 2017 October 2017 Copyright Informatica LLC 2017 This
More informationTIBCO Spotfire Connecting to a Kerberized Data Source
TIBCO Spotfire Connecting to a Kerberized Data Source Introduction Use Cases for Kerberized Data Sources in TIBCO Spotfire Connecting to a Kerberized Data Source from a TIBCO Spotfire Client Connecting
More informationWF-distiller Installation Guide
WF-distiller Installation Guide Version 4.0 SP2 September 2016 prepared by WF-distiller Engineering 2016 Lexmark. All rights reserved. Lexmark is a trademark of Lexmark International Inc., registered in
More informationInformatica Version Release Notes December Contents
Informatica Version 10.1.1 Release Notes December 2016 Copyright Informatica LLC 1998, 2017 Contents Installation and Upgrade... 2 Support Changes.... 2 Migrating to a Different Database.... 5 Upgrading
More informationInformatica Axon Data Governance 5.2. Release Guide
Informatica Axon Data Governance 5.2 Release Guide Informatica Axon Data Governance Release Guide 5.2 March 2018 Copyright Informatica LLC 2015, 2018 This software and documentation are provided only under
More informationdocs.hortonworks.com
docs.hortonworks.com Hortonworks Data Platform : Security Administration Tools Guide Copyright 2012-2014 Hortonworks, Inc. Some rights reserved. The Hortonworks Data Platform, powered by Apache Hadoop,
More informationSyncsort Incorporated, 2016
Syncsort Incorporated, 2016 All rights reserved. This document contains proprietary and confidential material, and is only for use by licensees of DMExpress. This publication may not be reproduced in whole
More information<Partner Name> <Partner Product> RSA Ready Implementation Guide for. MapR Converged Data Platform 3.1
RSA Ready Implementation Guide for MapR Jeffrey Carlson, RSA Partner Engineering Last Modified: 02/25/2016 Solution Summary RSA Analytics Warehouse provides the capacity
More informationMULTI FACTOR AUTHENTICATION USING THE NETOP PORTAL. 31 January 2017
MULTI FACTOR AUTHENTICATION USING THE NETOP PORTAL 31 January 2017 Contents 1 Introduction... 2 1.1 Prerequisite for configuring the multi-factor authentication:... 2 1.1.1 On the Guest side... 2 1.1.2
More informationInstallation 1. DLM Installation. Date of Publish:
1 DLM Installation Date of Publish: 2018-05-18 http://docs.hortonworks.com Contents Installation overview...3 Setting Up the Local Repository for Your DLM Installation... 3 Set up a local repository for
More informationConfiguring an IMAP4 or POP3 Journal Account for Microsoft Exchange Server 2003
Configuring an IMAP4 or POP3 Journal Account for Microsoft Exchange Server 2003 This article refers to Microsoft Exchange Server 2003. As of April 8, 2014, Microsoft no longer issues security updates for
More informationSAS Viya 3.2 and SAS/ACCESS : Hadoop Configuration Guide
SAS Viya 3.2 and SAS/ACCESS : Hadoop Configuration Guide SAS Documentation July 6, 2017 The correct bibliographic citation for this manual is as follows: SAS Institute Inc. 2017. SAS Viya 3.2 and SAS/ACCESS
More information9.4 Hadoop Configuration Guide for Base SAS. and SAS/ACCESS
SAS 9.4 Hadoop Configuration Guide for Base SAS and SAS/ACCESS Second Edition SAS Documentation The correct bibliographic citation for this manual is as follows: SAS Institute Inc. 2015. SAS 9.4 Hadoop
More informationConfiguring Hadoop Security with Cloudera Manager
Configuring Hadoop Security with Cloudera Manager Important Notice (c) 2010-2015 Cloudera, Inc. All rights reserved. Cloudera, the Cloudera logo, Cloudera Impala, and any other product or service names
More informationInformatica Cloud Spring Data Integration Hub Connector Guide
Informatica Cloud Spring 2017 Data Integration Hub Connector Guide Informatica Cloud Data Integration Hub Connector Guide Spring 2017 December 2017 Copyright Informatica LLC 1993, 2017 This software and
More informationTuning the Hive Engine for Big Data Management
Tuning the Hive Engine for Big Data Management Copyright Informatica LLC 2017. Informatica, the Informatica logo, Big Data Management, PowerCenter, and PowerExchange are trademarks or registered trademarks
More informationStarWind Native SAN Configuring HA File Server for SMB NAS
Hardware-less VM Storage StarWind Native SAN Configuring HA File Server for SMB NAS DATE: FEBRUARY 2012 TECHNICAL PAPER Trademarks StarWind, StarWind Software and the StarWind and the StarWind Software
More informationSAS Data Explorer 2.1: User s Guide
SAS Data Explorer 2.1: User s Guide Working with SAS Data Explorer Understanding SAS Data Explorer SAS Data Explorer and the Choose Data Window SAS Data Explorer enables you to copy data to memory on SAS
More information