Logi DataHub Comprehensive User Guide


Version 2.2, February 2017

Table of Contents

Introduction
  About DataHub
  Using DataHub
  Data Sources
  System Requirements
Installing Logi DataHub
  Installation Scenarios
  Preparing to Install
  Installation Process
  Login to Complete the Installation
  Uninstalling DataHub
Getting Started with DataHub
  DataHub Interface Tour
  Managing Your Profile
Working with Data
  Admins: Managing Users
Connecting to Databases
  Create a New Data Source
  Managing Data Sources
Connecting to Applications
  Create a New Data Source
  Application Source Details
    Amazon Dynamo DB
    Eloqua
    Facebook
    Financial Accounts (OFX)
    FreshBooks
    Google Analytics
    Google Spreadsheets
    HubSpot
    Marketo
    Microsoft Dynamics CRM Online
    Microsoft Dynamics GP
    NetSuite
    OData
    QuickBooks on Premise
    QuickBooks Online
    QuickBooks POS
    Salesforce
    SAP Netweaver
    Twitter
  Managing Data Sources
Using Excel and CSV Data
  Preparing the Data
  Uploading Data Files
Creating Data Relationships
  Supported Relationship Types
  Creating Relationships
Creating Dataviews
  About Dataviews
  Creating a Dataview
  Dataview Loading
  Data Enrichment
  Managing Dataviews
Blending Multi-Source Data
  About Data Blending
  Data Blending Example
Filtering Data on Load
  About Filters and Data Caching
  Create a Filter Expression
  Filter Syntax
  Usage Notes
Apply Data Enrichment
  About Data Enrichment
  Set Column Properties
  Create a Calculated Column
  Create a Multi-Part Column
  Create a Conversion Column
Data Repository Management
  Managing Usage
  Deleting Data Objects
  Changing the Repository Password
  Backing Up the Repository
  Scheduling a Backup
  Restoring a Repository Backup
Scheduling Data Refreshes
  Immediate Refresh
  Schedule a Refresh
Using DataHub with Logi Info
  Connecting to Dataviews
  Dataviews in the Metadata Builder Wizard
Advanced Repository Configurations
  Allocating Memory
  Configuring Look-up Columns
Contact Us

Introduction

DataHub is a web-based application that retrieves and caches data destined for analysis, using a high-performance, self-tuning data repository. It off-loads data from transactional systems, which are not optimized for analysis, and provides access to data sources not generally available in Logi Info.

About DataHub

Data lives everywhere... in the cloud, in corporate databases, and on local computers. However, it's often not ready for analysis; data from transactional systems is a good example. Many solutions serve enterprise-grade requirements but result in long lead times and require skilled data integrators. Logi DataHub is a data retrieval and preparation tool for use with analytic applications that solves many of these problems. It connects to multiple data sources, retrieves and caches data for improved performance, and prepares data with smart profiling, joins, and data enrichment, so you can deliver efficient reporting and analysis without impacting your transactional systems and application data sources. DataHub can access data sources not supported within Logi Info, and can access data located in multiple locations. It can blend data from the cloud, databases, applications, and files, and supports large datasets (250M+ rows).

Large dataset performance is accelerated with a self-tuning, easy-to-maintain columnar data store. DataHub allows you to create and manage "dataviews", which are made available to Logi applications, including our self-service analytics offerings.

Using DataHub

Here's a quick overview of the typical steps, left-to-right, involved in using Logi DataHub: DataHub's easy-to-use interface and intuitive operations do not require a DBA or data scientist in order to create meaningful results. Once created, dataviews are available to a wider audience of Logi application users, who benefit immediately without having to know the data in depth.

Data Sources

DataHub can connect to and retrieve data from a variety of data sources, some of which are illustrated below: Data can be blended together to produce unique relationships and enriched through Smart Profiling to make working with it easier. Data retrieval can be scheduled, allowing you to refresh cached data as necessary, without impacting data sources during peak usage periods.

System Requirements

DataHub is a web-based application that includes a columnar data repository. While DataHub can be installed on a desktop or laptop machine, we recommend that you install it on a fixed server with appropriate resources. The general requirements are:

Server Memory: 16GB RAM minimum, for testing and evaluation; 32GB RAM minimum, for production environments (more recommended).

Server Storage: 4GB hard drive space (minimum) for software installation, plus 1GB (minimum) for data; additional storage for data as necessary; a secondary drive is recommended for data loads.

Supported Operating Systems: Logi DataHub is only available in a 64-bit version, for: Windows Server 2012, Windows Server 2012 R2; Windows Server 2008 R2; Windows 10; Windows 8.1, Windows 8; Windows 7. Not supported: Windows 10 Anniversary Update version 1607 (OS Build ), Windows Server 2016.

Supported System Software: Logi DataHub works with IIS 7+. If IIS is not already installed, or if you prefer, the DataHub installer can also install IIS Express. The DataHub installer will also install .NET Framework 4.5 and configure required sharing permissions.

Data Sources: The following data sources are supported: Microsoft SQL Server, MySQL, ODBC database providers, OLEDB database providers, Oracle, PervasiveDB, PostgreSQL, Vertica, Microsoft Excel file, CSV data file, Amazon Dynamo DB*, Eloqua, Facebook, Financial Accounts (OFX)*, FreshBooks*, Google Analytics, Google Spreadsheets*, HubSpot, Marketo, Microsoft Dynamics CRM Online, Microsoft Dynamics CRM On Premise*, Microsoft Dynamics GP*, NetSuite*, OData*, QuickBooks Online, QuickBooks On Premise*, QuickBooks POS*, Salesforce, SAP Netweaver*, Twitter.

* These applications use a generic ADO.NET connector; please contact Logi Support if you need configuration assistance.

Supported Browsers: Logi DataHub is a web-based application that takes advantage of the latest HTML5 technologies. It is currently compatible with all of the following web browsers: Google Chrome v26+, Firefox v20+, Internet Explorer v10+, Safari v6+.

Installing Logi DataHub

This chapter guides you through the Logi DataHub installation process, for both testing and production environments.

Installation Scenarios

Logi DataHub allows for several installation scenarios: Standalone - Installs a single-user, standalone instance of DataHub. In addition to DataHub, the installer will also install an IIS Express service to deliver the web content. Workgroup - Installs an instance intended to be accessed by multiple users. The installer will configure a web site under IIS to deliver the web content. Two-Tier - You can also choose to install a two-tier application, wherein the DataHub Data Repository is installed on a separate database server machine. In this scenario, the repository must be installed on the database server before the DataHub Web Application is installed on the web server. Logi DataHub also relies on other web components.

What Gets Installed

If the following components don't exist in the target environment, the installation application will install them for you: the Logi DataHub web application; Microsoft .NET Framework 4.5; IIS Express 8.0 (if Standalone mode is selected); the InfoBrite-Postgres database server; Visual C++ Redistributable libraries.

Preparing to Install

Windows includes a number of security enhancements that impact software installation, and it is critical that installation and configuration be performed using the built-in "Administrator" account. Even if your account has been added to the local Administrators Group, it may not have sufficient privileges, so don't rely on it.

As shown at left, the correct practice when running the Logi installation program, or when using the Command Line to make configuration adjustments, is to start the tool by right-clicking its icon and selecting "Run as administrator" from the menu. This ensures that appropriate permissions are provided for the installed components. Don't see a "Run as administrator" option? If the system is in a network domain, your network admin may have created security policies that don't allow you to see this option, in which case you need to consult your IT staff for assistance.

Installation Process

This section describes the installation steps and the dialog boxes you'll see when installing DataHub. When you launch the installer, it will examine your system for the necessary components and, if necessary, install them. Depending on previously installed components, selected options, and operating system versions, some of the following dialogs may not be shown, or may appear slightly different than the examples shown. If you intend to use a Two-Tier configuration, with the database server and web server on separate computers, you'll need to install the DataHub Data Repository on the database server first. This will establish the database and the scheduler service. Once that's done, you'll install the DataHub Web Application on the web server. As usual, you can click Back at any time before the physical installation begins to go back to the previous screen.

1. To start the installation, right-click the Logi product installation program icon and select "Run as administrator" to launch the installer. Allow it to complete the installation preparation.

2. When the Welcome Screen appears, click Next.

3. License Agreement - Select the "I accept the terms..." radio button after reading the license agreement and click Next to continue.

4. Setup Type - If this installation is for testing or evaluation on a server where IIS is not already installed, select the Stand Alone option. The installer will install an IIS Express service, in addition to DataHub, for local use to administer the application. If this installation is for a production environment where IIS is already present, select the Workgroup option. The installer will configure a web site under the existing IIS server to deliver its web content. To install DataHub's Data Repository on a separate server, in order to support a Two-Tier configuration, select the Workgroup option. Click Next to continue.

If you selected the Workgroup option, you'll see the dialog box shown above. In it, you can select which of the two DataHub components to install. Make your selections and click Next to continue.

5. Destination Folder - Click Next to accept the default installation location and continue. Optional: click Change to specify an alternative installation location if you don't like the default.

6. Repository Account Password - Provide a password (minimum of 6 characters) to be used to administer the DataHub Repository. Click Next to continue.

7. DataHub Admin Account - Enter a user name, password, email address, and first and last name for an administrative user account. This account will be used to manage standard DataHub user accounts and will receive all admin-related notifications. Click Next to continue.

8. Ready to Install - The installer has all the necessary information. Click Install to begin the physical installation.

A status dialog box, like the one shown above, will be displayed to keep you informed of progress. During the process you may see other informative messages similar to those shown above. Once the installation is complete, the final dialog box, shown above, is displayed with an offer to launch DataHub immediately in your browser. If you'd like to complete the installation without launching Logi DataHub, uncheck the Launch Logi DataHub checkbox. Click Finish to close the installer.

The installation process will create a program group and shortcut in the Start Menu for DataHub, under Logi Analytics, and an optional shortcut on the Desktop, as shown above. If you need to manually create a shortcut, here's the path (assuming a default installation location): C:\Program Files\Logi Analytics\DataHub\datahub.url

Login to Complete the Installation

To complete the installation, use the Admin credentials from Installation Step #7 to log in:

Once you do, you can check your profile and customize your avatar, provide a mobile number for SMS notifications, and engage in other administrative tasks, like creating users.

Uninstalling DataHub

To uninstall DataHub and all of its components, do the following: 1. Run the installation program again (as an Administrator) and confirm the uninstallation. 2. Go to the C:\Program Files\Logi Analytics\DataHub\infobrite-postgres folder (adjust the path as needed if the default installation location was not used) and run uninst.exe to completely remove the database server. 3. Delete the entire installation folder: C:\Program Files\Logi Analytics\DataHub. 4. Reboot the computer. The .NET Framework 4.5 will not be uninstalled.

Getting Started with DataHub

Now that you've installed Logi DataHub, what should you do next? Typical first steps and links to more detailed documentation are presented here.

DataHub Interface Tour

When you log in to DataHub, you'll see the main menu, above the Dataviews page: The main menu options include: 1. Dataviews - This option lets you manage "dataviews", by creating them or selecting one from lists of Recent or All dataviews. A dataview is just what its name implies: a view of data, which can be from multiple sources, governed by relationships, and/or enriched. 2. Sources - This option lets you manage your data sources, by creating connections to applications and/or databases. 3. Repository - This option lets you manage the data objects stored in the DataHub Repository. The number in the icon indicates the percentage of storage that's in use. 4. Help - This option displays the "About" dialog box for DataHub, which displays the version number. 5. Bell - This icon includes an indicator when you have an event notification; you can hover your mouse cursor over it to see the message. Events that trigger a notification include successful dataview creation, dataview creation failure, scheduled dataview load status, storage limit approach, etc. 6. Your-User-Name - This option allows you to manage your user profile and to log out of DataHub. It also allows Administrator users to manage other users. The "hamburger" icon at the left edge of the menu returns you to the initial (Dataviews) page.

Managing Your Profile

If you click the main menu option with your user name, you'll see an option for managing your profile: In the related page, you can manage the data associated with your user account and change your password. You can also choose a specific avatar, from the examples shown above.

Working with Data

As mentioned earlier, DataHub connects with sources to retrieve data; that data can be manipulated and combined with other data to create a Dataview. The following chapters will guide you in accessing and using Dataviews.

Admins: Managing Users

Administrator users can add and manage other DataHub users. Their Your-User-Name menu option includes a Users item, as shown above, which displays the Users list: Users can be managed from the list, or their User Name can be clicked to view and edit their user account details and change their passwords. The list can be filtered by user role and status. New user accounts can be created by clicking Create New User and supplying appropriate details. When a new user account is created, they're assigned a default avatar and automatically sent an email with their temporary password. Users or groups of users can be activated or inactivated by checking their checkbox in the list, selecting the appropriate action, and clicking Apply. Other tasks for Administrators, such as managing the data repository and scheduling data refreshes, are described in later chapters.

Connecting to Databases

Logi DataHub is capable of connecting to a variety of commercial databases in order to retrieve data. This document presents an example of how to create a connection to data on a SQL database. The supported database servers include: Microsoft SQL Server, MySQL, ODBC database providers, OLEDB database providers, Oracle, PervasiveDB, PostgreSQL, Vertica. Connections to applications are discussed in the next chapter.

Create a New Data Source

In DataHub, a connection to data is called a "Source" and can be an application, database, or file connection. In this document, we'll create a new Source based on a SQL database. There are multiple paths to the Add Source dialog box, including: Sources menu option > Create New Source; Dataviews menu option > Create New Dataview > From Source tab > Add New Source. The Add Source dialog box will appear (the fields shown below will not be visible unless the Database option is selected):

Select or provide the required information, as follows: 1. Database - Select the Database radio button, making the fields shown above visible. 2. Data Provider - Select the desired database or provider type. If you select the generic ODBC or OLEDB providers, skip down to the next section. 3. New Source Name - Give the source an arbitrary name for easy recognition later. 4. Server Name - Enter the database server name. 5. User Name and Password - Enter the credentials required to access the database. 6. Database Name - Enter the target database name (or, for Oracle, the Service Name). For Microsoft SQL Server, MySQL, and PostgreSQL providers, you can use the Get List button to select the database name from a list. For MySQL, Oracle, PervasiveDB, and PostgreSQL providers, the Advanced Options link and Port Number

field will be shown. They are not shown for other providers. 7. Visibility - Select the security settings to be applied to the connection: Only Me or Everyone. The user who creates a connection is the manager of the connection and may choose the connection sharing options. 8. Test Source - This action will attempt to make the connection specified and provide a status message. In addition to indicating either success or failure, any existing Source with the exact same specifications will be identified, so you can decide whether to use it instead or proceed to save your new Source. Click Save to save your new Source. Repeat as necessary to create the Sources you need.

Generic Providers

If you select the generic ODBC or OLEDB providers, the dialog box will be reconfigured: Select or provide the required information, as follows: 1. Database - Select the Database radio button, making the fields shown above visible. 2. Data Provider - Select the ODBC or OLEDB provider type. 3. New Source Name - Give the source an arbitrary name for easy recognition later.

4. Connection String - Enter an appropriate connection string. If you need help, see the ConnectionStrings.com web site. 5. Visibility - Select the security settings to be applied to the connection: Only Me or Everyone. The user who creates a connection is the manager of the connection and may choose the connection sharing options. 6. Test Source - This action will attempt to make the connection specified and provide a status message. In addition to indicating either success or failure, any existing Source with the exact same specifications will be identified, so you can decide whether to use it instead or to proceed to save your new Source. Click Save to save your new Source. Repeat as necessary to create the Sources you need.

Managing Data Sources

As you create Sources, they're represented in the Sources page with graphic "pills": The collection of pills can be searched and filtered using the provided controls. Each pill displays the Source name and database type, as shown above, and includes a "gear" icon. Hovering your mouse cursor over the icon displays a menu of management actions.
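As a brief aside on the Connection String field above: ODBC and OLEDB connection strings follow a semicolon-delimited option=value format. The sketch below is purely illustrative (it is not part of DataHub, and the server, database, and credential values are hypothetical placeholders); it shows a typical string and a minimal sanity check you might run before pasting one into the dialog.

```python
# Minimal sanity check for an option=value;option=value connection string.
# Server, database, and credential values below are illustrative
# placeholders, not values taken from this guide.

def parse_connection_string(conn_str):
    """Split a semicolon-delimited connection string into a dict.
    Connection options are case insensitive, so keys are lower-cased."""
    options = {}
    for part in conn_str.split(";"):
        part = part.strip()
        if not part:
            continue
        key, _, value = part.partition("=")
        options[key.strip().lower()] = value.strip()
    return options

# A typical SQL Server ODBC string (placeholder values):
odbc = "Driver={SQL Server};Server=dbserver01;Database=Sales;Uid=report_user;Pwd=secret"
opts = parse_connection_string(odbc)
print(opts["server"])    # dbserver01
print(opts["database"])  # Sales
```

If the parse does not yield the options you expect, the string will likely fail DataHub's Test Source check as well.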

The Info menu option allows you to see, among other details, who created the Source, as shown above. This can be helpful if the Source has been shared with you by another user. The Delete option will only be included if you're the user who created the Source. Source pills are also visible in the Create a New Dataview page, under the From Source tab, as shown above. The gear icon, however, only displays an information option from this page.

Connecting to Applications

Logi DataHub is capable of connecting to and retrieving the data stored by a variety of commercial applications. This document presents examples of how to create connections to them. Connections to standalone databases are discussed in the previous chapter.

Create a New Data Source

In DataHub, a connection to data is called a "Source" and can be an application, database, or file connection. In this document, we'll explain how to create a new application Source. There are multiple paths to the Add Source dialog box, including: Sources menu option > Create New Source; Dataviews menu option > Create New Dataview > From Source tab > Add New Source. The Add Source dialog box will appear; its fields will vary depending on the data provider chosen:

Select or provide the required information, as follows: 1. Application - Select the Application radio button. 2. Data Provider - Select the desired application. The dialog box fields will change with each selection. The first nine application options have specific data providers associated with them, and these are shipped with DataHub. You must supply an ADO.NET connection string for those applications with an asterisk by their names. See the additional discussion of connection strings below. 3. New Source Name - Give the source an arbitrary name for easy recognition later. 4. User and Password - Enter the credentials required to access the application. 5. Company - Enter any other information required by the target application. 6. Visibility - Select the security settings to be applied to the connection: Only Me or Everyone. The user who creates a connection is the manager of the connection and may choose the connection sharing options. 7. Test Source - This action will attempt to make the connection specified and provide a status message. In addition to indicating either success or failure, any existing Source with the exact same specifications will be identified, so you can decide whether to use it instead or proceed to save your new Source. Click Save to save your new Source. Repeat as necessary to create the Sources you need.

Connection Strings

A connection string is a series of option=value strings separated by semicolons. If a connection string property value has special characters, such as semicolons, single quotes, or spaces, then you must enclose the value in either single or double quotes. Connection options are case insensitive unless otherwise noted. The ConnectionStrings.com web site is an excellent source of connection string examples.

Application Source Details

The following are detailed steps for establishing a connection to each of the named application data sources.
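Before the per-application details, the quoting rule described under Connection Strings can be sketched in code. This helper is purely illustrative (it is not part of DataHub, and the option names and values are hypothetical), but it shows when a value needs to be wrapped in quotes:

```python
def quote_value(value):
    """Wrap a value in quotes if it contains characters (semicolons,
    quotes, spaces) that would break a semicolon-delimited string."""
    if any(ch in value for ch in ";' \""):
        # Prefer double quotes, unless the value itself contains one.
        if '"' in value:
            return "'" + value + "'"
        return '"' + value + '"'
    return value

def build_connection_string(**options):
    """Assemble option=value pairs into a single connection string.
    Option names are case insensitive, so the casing here is cosmetic."""
    return ";".join(f"{name}={quote_value(val)}" for name, val in options.items())

# Placeholder values for illustration only:
conn = build_connection_string(User="admin", Password="p;w'd", Location="C:\\Data Files")
print(conn)  # User=admin;Password="p;w'd";Location="C:\Data Files"
```

Note that the Password and Location values are quoted (they contain a semicolon and a space, respectively), while the plain User value is not.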
Amazon Dynamo DB

The connection to Dynamo DB is made using your AccessKey, SecretKey, and optionally your Domain and Region.

Your AccessKey and SecretKey can be obtained on the security credentials page for your Amazon Web Services account. Your Region will be displayed in the upper left-hand corner of the page when you are logged into Dynamo DB. For additional information about the Amazon Dynamo DB provider, click here.

Eloqua

The Add Source dialog box configuration and fields for Eloqua are shown in the previous section.

Facebook

Selecting Facebook as the data provider will result in the display of these fields: Click Get Auth Token to log in to Facebook and copy your authorization token. You'll need to have Platform access turned on in Facebook to do this, and you'll be prompted to enable it during this process if it's not. The Page ID is the unique identifier of the Facebook page whose statistics you'd like to track. You may need to look it up via online tools or the page URL in Facebook. This web page explains how to find it, or a quick Google search for "Facebook Page ID" will lead you to several other sources.

Financial Accounts (OFX)

To specify a location for the database where the tables, views, and stored procedures are located, set the Location property in the connection string. In addition, you must also set the User and Password properties. For additional information about the Financial Accounts provider, click here.

FreshBooks

To specify a location for the database where the tables, views, and stored procedures are located, set the Location property in the connection string. In addition, you must also set the CompanyName property. For additional information about the FreshBooks provider, click here.

Google Analytics

Selecting Google Analytics as the data provider will result in the display of these fields: Click Get Auth Token and log into Google Analytics using your credentials. Copy and paste the token into the field provided. Once you've provided a token, you'll need to select the Web Profile that will provide the data you are interested in. To do that, click Get Profiles to populate a list of web profiles and select one from the list.

Google Spreadsheets

To specify a location for the database where the tables, views, and stored procedures are located, set the Location property in the connection string. In addition, you must also set the User and Password, or OAuthAccessToken, properties. For additional information about the Google Spreadsheets provider, click here.

HubSpot

Selecting HubSpot as the data provider will result in the display of these fields: You'll need to enter your HubID. This web page can help you find it. You'll also need to provide an Authorization Token. Click Get Auth Token and log into HubSpot. Copy and paste the token into the field provided.

Marketo

Selecting Marketo as the data provider will result in the display of these fields:

You'll need to enter User ID, Encryption Key, and SOAP Endpoint values in the fields provided in order to access the data.

Microsoft Dynamics CRM Online

Selecting Microsoft Dynamics CRM Online as the data provider will result in the display of these fields:

You'll need to enter the User ID and Password credentials that grant access to Microsoft Dynamics CRM Online. Enter the URL of the application that contains the data you want to work with.

Microsoft Dynamics GP

To specify a location for the database where the tables, views, and stored procedures are located, set the Location property in the connection string. You must also supply the required CompanyId, Url, User, and Password properties. For additional information about the Microsoft Dynamics GP provider, click here.

NetSuite

To specify where the tables, views, and stored procedures of the database are located, set the Location property in the connection string. In addition, you must also set the User, Password, and AccountId properties. For additional information about the NetSuite provider, click here.

OData

You must set the User, Password, and URL properties in the connection string. For additional information about the OData provider, click here.

QuickBooks on Premise

To specify a location for the database where the tables, views, and stored procedures are located, set the Location property in the connection string. In addition, if QuickBooks is not installed on the local machine and you're using the QuickBooks Remote Connector, you must set the User, Password, and URL properties. For additional information about the QuickBooks on Premise provider, click here.

QuickBooks Online

Selecting QuickBooks Online as the data provider will result in the display of these fields: You'll need to provide an Authorization Token. Click Get Auth Token and log into QuickBooks Online using your credentials. Copy and paste your authorization token into the field provided. Enter the Company ID for the organization that contains the data you want to use.

QuickBooks POS

To specify a location for the database where the tables, views, and stored procedures are located, set the Location property in the connection string. For additional information about the QuickBooks POS provider, click here.

Salesforce

Selecting Salesforce as the data provider will result in the display of these fields: Enter the Username and Password for the Salesforce account whose data you want to use.

SAP Netweaver

To specify a location for the database where the tables, views, and stored procedures are located, set the Location property in the connection string. In most cases, you'll also need to set the User, Password, ConnectionType, Client, and SystemNumber properties. For additional information about the SAP Netweaver provider, click here.

Twitter

Selecting Twitter as the data provider will result in the display of these fields: You'll need to provide an Authorization Token. Click Get Auth Token and log into Twitter using your credentials. Copy and paste the authorization token into the field provided.

Managing Data Sources

As you create Sources, they're represented in the Sources page with graphic "pills":

The collection of pills can be searched and filtered using the provided controls. Each pill displays the Source name and application type, as shown above, and includes a "gear" icon. Hovering your mouse cursor over the icon displays a menu of management actions. The Info menu option allows you to see, among other details, who created the Source, as shown above. This can be helpful if the Source has been shared with you by another user. The Delete option will only be included if you're the user who created the Source.

Source pills are also visible in the Create a New Dataview page, under the From Source tab, as shown above. The gear icon, however, only displays an information option from this page.

Using Excel and CSV Data

Logi DataHub Dataviews can be created using data from Excel spreadsheet (.xls and .xlsx) or comma-separated values (.csv) files. This chapter discusses preparing and importing this kind of data.

Preparing the Data

Before using Excel data with DataHub, it's important to understand it. Many Excel files are basically formatted reports. The raw data may have been annotated, aggregated, sorted, formatted, and grouped. Unfortunately, it can be difficult to extract meaningful data from this kind of file. In these cases, the data in the Excel file will need to be "massaged" before it can be used. DataHub consumes only raw data. In order for an Excel file to be imported cleanly into DataHub, it must follow these three rules: The first row in a worksheet must only contain column header information. All subsequent rows must be completely filled with data in each column. Each worksheet in an Excel file is considered a different data object in DataHub. The first two rules also apply to comma-separated values (CSV) data files for use with DataHub.

Cleaning Up the Data

Excel data to be loaded into DataHub should be "de-normalized", raw data. The worksheet should be edited to: Remove rows and cells used for annotation. Remove aggregations (e.g. totals, averages, counts) from the rows and columns. Remove compound header rows or spanned column headers. Remove or fill empty or Null cells. Provide any missing column headers. Make data in a column homogeneous - don't mix text, numeric, or date values within a column. The following two images are examples of Excel worksheets:
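The first two rules (a header-only first row, and completely filled data rows) can be checked programmatically before uploading a CSV file. This is a minimal standard-library sketch, not a DataHub feature; the function name and sample data are hypothetical:

```python
import csv
import io

def check_csv_for_datahub(text):
    """Return a list of problems that would prevent a clean import:
    a missing or blank header cell, or any data row with an empty cell."""
    rows = list(csv.reader(io.StringIO(text)))
    problems = []
    header = rows[0] if rows else []
    if not header or any(not h.strip() for h in header):
        problems.append("first row must contain only non-empty column headers")
    # Data rows are numbered from 2, matching their position in the file.
    for i, row in enumerate(rows[1:], start=2):
        if len(row) != len(header) or any(not cell.strip() for cell in row):
            problems.append(f"row {i} is not completely filled")
    return problems

sample = "OrderID,Company,Total\n1001,Acme,250\n1002,,90\n"
print(check_csv_for_datahub(sample))  # ['row 3 is not completely filled']
```

An empty result means the file satisfies both rules; each reported row should be fixed (or filled, as described below) before upload.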

The incorrect example shown above is an Excel worksheet that cannot be used with DataHub. It violates many of the requirements previously outlined. The same data is shown above after being cleaned up for use with DataHub. Let's examine each of the clean-up steps in more detail.

Remove Annotations

Many Excel files contain annotations describing details of the data, such as its origin, when it was generated and by whom, copyright notices, disclaimers, etc. For analysis purposes, this information is generally useless. From the DataHub perspective, annotations should be removed so that they don't corrupt the data. For example, the rows containing the title and date from the worksheet shown earlier should be removed.

Remove Aggregations

Since DataHub is expecting raw data, any aggregated data in a column in the worksheet would be incorrectly treated as raw data. So, the aggregated data should be removed before uploading it to DataHub. If you compare the two example Excel worksheets shown earlier, you'll notice that the invoice aggregations, as well as the Line Total column, have been removed during the clean-up process.

Remove Multi-Row Column Headers

Excel allows you to create multi-row headers in which the text spans multiple columns (this is not shown in the example images). These should be reduced to a single header row, the first row in the worksheet. This may require a change in naming conventions for the column headers. For example, you may have 2015 and 2016 data broken into Revenue, Expense, and Net columns. You could set the final column headers as 2015 Revenue, 2015 Expense, 2015 Net, 2016 Revenue, 2016 Expense, and 2016 Net. A better choice might be to add a Year column containing the 2015/2016 values and keep the Revenue, Expense, and Net columns.

Remove or Fill Empty or Null Cells

Empty cells and cells with Null values are considered data once imported into DataHub. For example, in the incorrect example worksheet shown earlier, all of the cells that are subordinate to the grouped data are empty. These cells must be filled to correlate the data values to the proper group.
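This "fill down" step can be sketched in a few lines. The helper below is illustrative only (not DataHub functionality), and the column names are taken from the worksheet examples in this chapter:

```python
# A minimal forward-fill ("fill down") sketch: empty cells in the listed
# columns inherit the value from the nearest filled row above, as they
# would after cleaning up a grouped worksheet.
def fill_down(rows, columns):
    """rows: list of dicts; columns: names whose empty cells inherit
    the previous row's value."""
    last = {}
    for row in rows:
        for col in columns:
            if row[col]:
                last[col] = row[col]
            else:
                row[col] = last.get(col, "")
    return rows

rows = [
    {"Company Name": "Alfreds Futterkiste", "Order ID": "10643", "Product": "Spegesild"},
    {"Company Name": "", "Order ID": "", "Product": "Chartreuse verte"},
    {"Company Name": "", "Order ID": "", "Product": "Rossle Sauerkraut"},
]
filled = fill_down(rows, ["Company Name", "Order ID"])
print(filled[2]["Company Name"])   # Alfreds Futterkiste
```

After a pass like this, every data row is completely filled and each record carries its own group values.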

For example, the cells in pink shown above should be filled with their parent data. The Company Name should be filled all the way down, along with the respective Order ID and Order Date cells.

Provide Missing Column Headers

Every column must have header text. This text will be used to identify the column data in DataHub. After formatting your Excel worksheet, make sure that each column is properly identified in the first row of the worksheet.

Make the Data Homogeneous

Within each column, all data should be of one data type. DataHub examines the data during import and sets the data type (dimension, measure, date, identity) according to column contents, so it's important that all of the data be of the same type. Excel worksheet column formatting can be tricky here. For example, in the worksheet shown earlier, the Order ID column appears to be numeric and would therefore be assumed to be a "measure" in DataHub. However, the Order ID is actually text data (as you can tell by the small, green triangle in the upper left corner of each Order ID cell) and would therefore be imported into DataHub as a "dimension", not a "measure".

Importing Multiple Worksheets

When an Excel file is identified in DataHub as a data source, each of the worksheets found in the Excel file will be presented as a different data object. Consider the following Excel file:

The file includes two worksheets, Invoices and Expenses. When imported into DataHub, they will be presented as two separate data objects, shown highlighted in yellow above.

Uploading Data Files

In DataHub's Create a New Dataview page, there are two tabs: From File and From Source.

In the From File tab, a drop zone and Browse button are displayed. You can either drag-and-drop an Excel or CSV data file into the area, or browse to and select the desired file. Once the file has been selected, you will be presented with a Dataview Configuration page that allows you to select the worksheets and columns to be included in the new Dataview for analysis.

1. Sources in Use - This panel identifies all of the data sources being used in the current Dataview. In the example above, the Excel file called "InvoiceSample" is the data source.

2. Objects - This panel identifies all of the data objects created from the worksheets in the imported Excel file. Their names are the worksheet names. The Excel file name will become the initial Dataview name once data configuration has been completed.

3. Columns - This panel displays the data columns from the currently selected data object (worksheet). Click columns to include them in the Dataview (or click the All link in the upper right-hand corner). As you select columns, they'll be added to a row of "column pills" along the bottom of the page, as shown above. Repeat the process if you need columns from a different data object or a different source until your Dataview is complete.

To remove a previously selected column, click it again in the Columns panel. Click the None link to remove all of the columns related to the selected object.

Click the Save icon to save your Dataview; the data from the file will then be loaded. During that process, the Dataview Status page will be displayed and an initial "sample" analysis table will be presented. Click the Reset icon to start the column selection process over again, or click the Cancel icon to cancel the operation entirely.

If you're having trouble importing a CSV file and you suspect that it may be due to special characters in the data, try opening the file in Excel first, saving it as an Excel file, and then importing the Excel file.

DataHub allows you to blend data from multiple, disparate data sources. This can be done by clicking the Add Data icon, circled above. Your current data sources will appear in a panel on the right and a new Add Data panel will open on the left. Click the appropriate tab to begin the data selection process again. Click the icon (now shown as a minus "-" sign) again to close the panels.

Creating Data Relationships

This chapter discusses how to define relationships between data objects in DataHub. The relationship definition determines what data will be loaded into the Dataview when there are multiple data objects specified as sources.

Data objects usually contain records related to a specific data entity (e.g. Customer information, Product information, Order information). As part of the database design, the different data objects might contain common data so that the content of one data object can be used to augment or extend the information from a second data object. For example, if we wanted to know the order information for a customer, there would have to be some common data to relate the order records to the customer records.

In a very simplistic sense, a relationship is defined with three basic pieces of information: the object and column from the "parent" object, the object and column from the "related" object, and the type of relationship. Using the above example for customer orders, the three pieces of information would be the Customer ID from the Customer object, the related Customer ID included in the Orders object, and an Inner Join relationship between the two objects' columns. DataHub provides a convenient interface for specifying the entire relationship. Using relationship terminology, in the same example, the Customer object would be called the "left" object and the Orders object, the "right" object.

Supported Relationship Types

Here are the relationship types DataHub allows you to use:

A Left Join will return all of the data from the "left" object and the data from the "right" object where the data in the related columns match. Where there isn't matching data, Null information is returned for the "right" object's columns.

An Inner Join will return all of the data from the "left" and "right" objects where the data in the related columns match. Data from both objects is excluded if the data in the related columns doesn't match.

A Right Join will return all of the data from the "right" object and the data from the "left" object where the data in the related columns match. Where there isn't matching data, Null information is returned for the "left" object's columns.

An Outer Join will return all of the data from both the "left" and "right" objects. Where there are matches in the related columns, the data from both objects is returned. Where a match can't be found, Null information is returned. All records from both data objects will be represented in the final Dataview.

A Union All will return all of the data from both the "left" and "right" objects where there are matches in column names and data types. The non-matching columns will be appended to the rows. Note: No column specification is required for a Union All relationship. DataHub will analyze the objects to determine the resulting recordset.

Creating Relationships

For the following examples, we'll use the venerable Northwind database as the primary data source and walk through the process of creating a Dataview. Our goal for this exercise is to create a Dataview that encompasses Customer Orders. In this practical exercise we will cover the creation of relationships. Once the user interface basics are understood, the pattern used to select the objects and columns and establish relationships is relatively simple.

We'll begin by navigating to DataHub's Create a New Dataview page, selecting or creating a Northwind data source (not shown here), and displaying the Dataview Configuration tab. We're going to need Customer information and Order information, so let's start by selecting the Customers data object and identifying the customer data that we need.
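Before walking through the user interface, the join behavior described above can be sketched in code. This is an illustration of the relationship semantics using Northwind-style records, not DataHub's implementation:

```python
# Illustrative implementations of two of the relationship types:
# Inner Join keeps only matching rows; Left Join keeps every "left"
# row and supplies Nulls (None) for unmatched "right" columns.
def inner_join(left, right, key):
    index = {}
    for r in right:
        index.setdefault(r[key], []).append(r)
    return [{**l, **r} for l in left for r in index.get(l[key], [])]

def left_join(left, right, key, right_cols):
    index = {}
    for r in right:
        index.setdefault(r[key], []).append(r)
    nulls = {c: None for c in right_cols}   # Nulls for unmatched rows
    out = []
    for l in left:
        matches = index.get(l[key])
        if matches:
            out.extend({**l, **r} for r in matches)
        else:
            out.append({**l, **nulls})
    return out

customers = [{"CustomerID": "ALFKI", "Company": "Alfreds"},
             {"CustomerID": "BONAP", "Company": "Bon app'"}]
orders = [{"CustomerID": "ALFKI", "OrderID": 10643}]

inner = inner_join(customers, orders, "CustomerID")
left = left_join(customers, orders, "CustomerID", ["OrderID"])
print(len(inner), len(left))   # 1 2
```

Note how the customer with no orders disappears from the Inner Join result but survives the Left Join with a Null OrderID - exactly the distinction the definitions above describe.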

We clicked on the Customers data object and clicked on the Company Name, Region, and Country columns, as shown above. Notice at the bottom-left of the tab, the current definition of the columns in the Dataview is taking shape. Our next step is to identify the Order information we need in the Dataview. If we click on the Orders object, the Dataview Configuration tab will look like:

Notice that the Customers columns are still in the Dataview, the Orders columns are available for selection, but no relationship between them has been defined yet. If we click Create Relation in the Columns panel, the Create Relationship dialog box is displayed:

In order to create a relationship, do the following:

1. Relationship Type - Click the type of relationship you want to use; in our example, it's a Left Join, which is selected by default.
2. Relate To Object - Select the object we want to relate the Orders object to; we want to relate to Customers.
3. Select "Left" Key Column - Select the column in Customers that we want to relate to one in Orders; we'll use CustomerID.
4. Select "Right" Key Column - Select the column in Orders that we want to relate to the one we selected in Customers; we'll use CustomerID.

It's not necessary for this example, but we can create multiple key column pairs by clicking the icon. Click Save to save the relationship.

The relationship description has been added to the top of the Columns panel, as shown above. To modify it, click the Join icon in the middle. It's possible to create additional relationships between these and other objects by repeating the process described above. Click the Save icon in the upper right-hand corner of the tab to save the Dataview with its relationship.

Creating Dataviews

The fundamental purpose of DataHub is to create "Dataviews", views of cached data which can be consumed by other Logi products. This chapter discusses the creation and management of Dataviews.

About Dataviews

Logi DataHub is a data virtualization product that connects to multiple sources, retrieves and caches data for high performance, and prepares data in intuitive ways, so you can deliver efficient reporting and analysis that doesn't affect transactional systems. DataHub includes DataSmart profiling, which automatically identifies the types and formats of your data and provides easy ways of enriching it through new calculations and manipulation of multi-part data. It uses its own repository database to store prepared data, and that database can be accessed by a Logi Info application just like any other datasource. That data is available as a "Dataview", which can be accessed by other Logi products.

A Dataview is defined using a Dataview definition or "DVD". It describes the data sources to connect to, the data objects and columns to retrieve, and the relationships between objects. You create a Dataview by specifying a DVD.

Creating a Dataview

The following discussion assumes you've already created one or more data Sources, by connecting to a database or connecting to an application. Creating a Dataview directly from data in an Excel or CSV data file is discussed in the chapter Using Excel and CSV Data.

There are multiple paths in DataHub to the Create a New Dataview page, including:

- Dataviews menu option > Create a New Dataview
- Sources menu option > "Gear" icon in Source pill > Create Dataview

Once you arrive at the Create a New Dataview page, click the From Source tab:

Your data Sources will be displayed as "pills", as shown above. Click one to create a Dataview that uses it, and the Dataview Configuration tab will appear:

Here are the important features of this tab, keyed to the image shown above:

1. Sources in Use - This panel displays the data Sources in use by this Dataview. Click the Add Data icon to add more Sources ("blending data" from multiple sources is discussed in a separate chapter).

2. Data Objects - This panel displays a list of the data objects (tables and views) available in the selected Source. The list can be searched and filtered using the included controls. Click an object to select it. When you do, its columns appear in the Columns panel. The Filter icon allows you to filter the data, and its use is discussed in a separate chapter. If you select any of an object's columns, the object will be placed in the "Objects in Use" list.

3. Available Objects - This is a list of the data objects that are available for use but haven't been used yet.

4. Action Icons - Click these icons to Cancel definition creation, Reset all selections to their defaults, or Save your definition.

5. Columns - This panel displays a list of the columns available in the selected data object. The list can be searched and filtered using the included controls, and the All and None links can be used for bulk selection. Click a column to add or remove it from the Dataview.

6. Column Pills - When you select a column in the Columns panel, it will be represented by a "pill" at the bottom of the page. These provide a representation of the data, in tabular form, included in the Dataview.

You can also define relationships (joins) between data objects in your Dataview. For information about how to do this, see the previous chapter.

When you have selected your Source(s), Objects, and Columns, click the Save icon and you'll be prompted to provide a name for your Dataview. It will be saved and then immediately loaded.

Dataview Loading

Data is loaded for the first time as soon as you save a new Dataview definition.
You can also "refresh" the data manually at any time and schedule reloading to occur at regular intervals. Loading progress and actions are seen in the Dataview Status tab:

Several status indicators will show you the progress of the load. The process includes four steps:

1. DataSmart initializing - DataHub is waiting for the scheduler and connection code to initialize.
2. Importing data - A connection is made to the Source, and the data is requested and retrieved.
3. Updating dataview - The retrieved data overwrites, or is appended to, the existing cached data.
4. Profiling data - DataHub identifies column types, designates look-up columns, and sets the correct column order and sorting.

When loading completes, the Object Details will be updated and you'll see a notification indicator in the menu bar, as shown above. Hover your mouse cursor over it to read the message.

If, in the course of loading the data, DataHub detects the right conditions, the button shown above will be displayed. An optimized Dataview consolidates all the underlying objects into a single table in the DataHub repository database, considerably improving performance. Click Optimize Dataview to begin the optimization process if you want to do this.

Refreshing the Data

Clicking Refresh Now will display a prompt to select the type of refresh. If you select:

- Replace Data - All data in the cache will be replaced by new data retrieved from the data source.
- Append Data - New data retrieved from the data source will be appended to the existing data. Existing data will not be updated.

Click the Refresh Now icon to start the data loading operation immediately. Clicking Add Schedule will let you define a scheduled refresh; this is discussed in a separate chapter.

Data Enrichment

DataHub's Data Enrichment feature allows you to specify a number of properties for Dataview data columns, including their order, data type, display format, and more, and to create custom calculated columns. You can also create custom data conversion and "multi-part" columns. This feature is discussed in a separate chapter.

Managing Dataviews

As you create Dataviews, they're represented in the Dataview page with graphic "pills":

The collection of pills can be searched and filtered using the provided controls. Each pill displays the Dataview name, its status, and its loaded record count, as shown above, and includes a "gear" icon. Hovering your mouse cursor over the icon displays a menu of management actions. The Delete option will only be included if you're the user who created the Dataview.

Blending Multi-Source Data

Blending data from multiple sources lets you get unique datasets in DataHub. This chapter presents an example of how to configure a Dataview to make use of this feature.

About Data Blending

The term "data blending" describes the retrieval of data from disparate data Sources for use in your DataHub Dataview. For example, you could bring together data from a database, an Excel spreadsheet, and an application, all in one Dataview.

Just as relational databases use "foreign key" relationships between tables, all of the data objects you want to blend in a Dataview must share at least one column of similar content. There must be a way to relate records in the primary data source with records in the secondary data source. Part of the process of creating a Dataview using disparate data is the specification of the relationship between the data sources.

Starting at the Create New Dataview page, the steps you need to take to blend data are:

1. Identify the primary objects and columns to be included in the Dataview.
2. Select a different data source.
3. Select an object from the secondary source.
4. Build a relationship between the selected object and objects/columns already included in the Dataview.
5. Select columns from the selected data source to be included in the Dataview.
6. Repeat the process until you've identified all of the information required for the Dataview.

Let's see how this is done.

Data Blending Example

This example will demonstrate how to blend data from a relational database with information from an Excel spreadsheet. In our example data, we have a list of Invoices in an Excel worksheet and we want to blend that information with customer Order information from a SQL database.

To save time, we're not going to show you the steps required to create the SQL Server data Source or the selection of the desired database objects. We'll just drop into the example at the point where we're ready to blend data. The Dataview, as configured so far, is shown above. Notice the column pills at the bottom of the page, for the columns we've already selected. The next step is to blend information from an Excel spreadsheet into the Dataview. We'll start by clicking the Add Data icon at the left edge of the page:

The Sources in Use panel slid to the right and continues to identify the Northwind data source we used to start the Dataview. The Sources page appeared, with its From File tab selected, to let us select another Source. Now we'll either drag-and-drop the Excel file we want to work with onto the tab or use the Browse button to locate and select it. DataHub will automatically return to the Dataview Configuration tab:

The Excel data source is now selected in the Sources in Use panel, and we'll select the Invoices data object. The next step is to create the relationship between the Invoices object and the previously selected Northwind objects. Click the Create Relation button in the Columns panel and the Create Relationship dialog box will appear:

To configure the correct relationship:

1. Select the desired relationship, which is Left Join (the default).
2. Select the object to relate, which is the Customers table.
3. Select the "left" column, which is the ID column.
4. Select the "right" column, which is the Code column.

Note that the two related columns do not have to have the same column name.

Click Save to return to the Dataview Configuration page and select the column from the Invoices object that we'd like included in the Dataview.

The relationship now appears at the top of the column list in the Columns panel (you can click the Join icon to modify the relationship definition), and the selected Excel column is now part of the column pill collection at the bottom of the page. Click the Save icon to save and then load the Dataview.

And that's all you need to do to create a Dataview with multi-source blended data. However, you may find that you have to rename columns in the Dataview itself for clarity. For example, both the Customer and Supplier tables contained Region and Country columns. DataHub will automatically and intelligently shorten the column names; however, as the Dataview designer, you have the option of overriding the derived names.

Our example showed the use of data blending when creating a Dataview, but you have the option of extending existing Dataviews at any time by revisiting the Dataview Configuration tab and adding related content from other sources.
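The column-name collision just described can be sketched in code. This is a hypothetical disambiguation scheme (the real derived names DataHub produces may differ), shown only to illustrate why a Dataview needs distinct names when blended objects share column names:

```python
# Hypothetical helper: when two blended objects share column names
# (e.g. both have Region and Country), tag the duplicates with the
# source object so every Dataview column name is unique.
def disambiguate(left_cols, right_cols, left_tag, right_tag):
    dup = set(left_cols) & set(right_cols)
    left_out = [f"{c} ({left_tag})" if c in dup else c for c in left_cols]
    right_out = [f"{c} ({right_tag})" if c in dup else c for c in right_cols]
    return left_out + right_out

cols = disambiguate(
    ["Company Name", "Region", "Country"],    # Customers columns
    ["Supplier Name", "Region", "Country"],   # Suppliers columns
    "Customer", "Supplier")
print(cols)
```

Only the colliding names are tagged; unique names pass through unchanged, which keeps the Dataview readable.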

Filtering Data on Load

Logi DataHub lets you filter the data loaded into Dataviews by specifying filtering criteria for data sources. This chapter discusses the filtering criteria configuration.

About Filters and Data Caching

When using filtering, questions sometimes arise about the load times and statuses indicated in DataHub. Here's an explanation of the data caching scheme and how filtering affects it:

When a Dataview ("A") is created, the data necessary for it is downloaded and cached by DataHub. If another Dataview ("B") is created that happens to use the exact same data, then it's pointed to the existing cached data downloaded for "A". This improves performance and saves storage space. However, when a Dataview ("C") is subsequently created with a filter, the filtered data is downloaded and cached separately because it does not exactly match the existing cached data for "A". Then, if another Dataview is created that uses filtering identical to that of "C", it will use the cached, filtered data downloaded for "C".

Create a Filter Expression

Filtering essentially adds a standard ANSI SQL-92 "WHERE" clause to the query used to download data. Therefore, filtering is only available for the supported database and Salesforce data providers. Access to the filtering options is available on a Dataview's page, in its Dataview Configuration tab:

Click the Filter icon, shown circled above, to specify the filter criteria for that Object. If a filter already exists for an object, the filter icon will appear "filled in", as shown above for the Orders object. If there is no icon, then filtering is not supported. Clicking the icon will display the Filter Criteria dialog box: If the filter already exists, its filter expression will be loaded into the dialog box, as shown above. Do not surround your expression with double-quotes, as suggested by the example in the dialog box. If you try to modify the filter expression for an object that's already been used in a Dataview, a warning message will be displayed.

Filter Syntax Usage Notes

The Filter Criteria dialog box lets you specify the expression that will be part of the WHERE clause in the query DataHub issues to the data source.
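The caching behavior described under About Filters and Data Caching can be modeled in a few lines. This is a toy illustration of the idea (not DataHub's internal design): a download is reused only when both the object and the filter expression match exactly.

```python
# Toy model of the documented caching scheme: the cache key combines
# the data object and the filter expression, so an unfiltered Dataview
# and a filtered one never share a download.
cache = {}
downloads = []

def fetch(obj, filter_expr):
    downloads.append((obj, filter_expr))   # count real downloads
    return f"data:{obj}:{filter_expr}"

def load(obj, filter_expr=None):
    key = (obj, filter_expr)               # filter is part of the key
    if key not in cache:
        cache[key] = fetch(obj, filter_expr)
    return cache[key]

load("Orders")                     # Dataview "A": downloads
load("Orders")                     # Dataview "B": reuses A's cache
load("Orders", "[Freight] > 100")  # Dataview "C": separate download
print(len(downloads))              # 2
```

This is why two Dataviews with identical filters load quickly after the first one, while changing the filter even slightly triggers a fresh download.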

Do not include the SQL reserved word "WHERE" in your filter expression. The query syntax conforms to ANSI-92 PostgreSQL syntax. There are numerous references on the Internet for PostgreSQL-style WHERE clauses. Here's the general syntax for the WHERE clause issued by DataHub:

WHERE exp [AND|OR] exp [AND|OR] exp ...

where exp can be one of the following:

[column] = value
[column] = value OR [column] = value
[column] = 'stringvalue'
[column] = 'datevalue'
[column] > value
[column] >= value
[column] < value
[column] <= value
[column] BETWEEN value1 AND value2
[column] IN (value1,value2,...)
[column] NOT IN (value1,value2,...)
[column] LIKE value
[column] NOT LIKE value
[column] IS NULL / [column] IS NOT NULL

In addition:

- Column names must be enclosed in square brackets, e.g., [OrderDate].
- Column names are case-sensitive.
- Column names can only reference a column in the current data object.
- Column names can be either the "friendly" name found in the Columns panel or the actual column name in the data source.
- Conditions may be nested.
- String and date values must be enclosed in single quotes. Numeric values must not be enclosed in quotes.

The filter expression only applies to the associated data object. In a typical Dataview, data objects are often related via Left Outer Joins, and this may result in Null values for filtered information. Furthermore, it's possible to create a filter on a data object to remove Null values but, due to the behavior of a Left Outer Join, continue to see Null values in the resulting Dataview.
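To see a filter expression in the documented syntax doing real work, the sketch below runs one against an in-memory SQLite table (the table and values are invented for illustration; SQLite happens to accept the same bracketed column names and quoting rules):

```python
# The filter text becomes the body of a WHERE clause. This expression
# uses the documented syntax: bracketed column names, quoted strings,
# an IN list, and a numeric comparison.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE Orders (OrderID INTEGER, Country TEXT, Freight REAL)")
con.executemany("INSERT INTO Orders VALUES (?, ?, ?)",
                [(10248, "France", 32.38),
                 (10249, "Germany", 11.61),
                 (10250, "Brazil", 65.83)])

filter_expr = "[Country] IN ('France', 'Germany') AND [Freight] >= 20"
rows = con.execute(f"SELECT OrderID FROM Orders WHERE {filter_expr}").fetchall()
print(rows)   # [(10248,)]
```

Only the French order survives: the German order fails the freight test and the Brazilian order fails the IN list, matching how DataHub would restrict the download.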

Apply Data Enrichment

Data that's retrieved into DataHub can be enriched in multiple ways to make it easier to use and more understandable. Enrichment is automatically applied, based on the Dataview definition, and this chapter discusses how to configure data enrichment.

About Data Enrichment

"Data Enrichment" is the term for altering the Dataview definition for data in DataHub's data repository in order to enhance it. Some enrichment actions are passed through in metadata to applications consuming the Dataview, while others also actually cause new columns and data to be added to the data repository. Enrichment consists of the following possible actions:

- Set a column's "physical" name (the name displayed in DataHub)
- Categorize a column as a Dimension or a Metric value
- Set a column's data type Sub-Category
- Set a column's display Format
- Create a new Calculated column
- Create a new Multi-Part column
- Create a new Conversion column

All of these actions are undertaken in a Dataview's Data Enrichment tab:

These actions are discussed individually in the following sections. The image above shows the selected column properties, identifies the column selection indicator (simply click a column to select it), and identifies the column "Gear" icon used to display a drop-down menu of options.

Click the Save icon to save changes.

Click the Reset icon to restore all the properties for a column to their original values.

Click the Preview icon to preview the effects of formatting changes. Note that this requires that the Physical Name property be the same as the original name of the column in the data (the Source Name). Use the Reset icon to undo any previewed changes.

Set Column Properties

After selecting a Dataview and clicking its Data Enrichment tab, select a column by clicking it.

The column properties that can be set are shown above and include:

1. Physical Name - This property is the column name that will be displayed in the column header and elsewhere, sometimes called the "friendly" name. The Source Name (the name of the column in the data repository) is shown below it for reference.

2. Category - This property lets you specify if the column should be treated as a Dimension (an independent variable - usually descriptive or categorical) or a Metric (a dependent variable - usually numeric). DataHub makes an initial assignment based on its analysis of the data, but you may wish to change it.

3. Sub-Category - This property further categorizes the column. If the main category is set to Dimension, then options here include Boolean, Identity, Location, Numeric, Temporal, and Text. If set to Metric, then options include Currency, Decimal, and Integer. DataHub makes an initial assignment based on its analysis of the data, but you may wish to change it.

4. Format - This property determines how the data values in this column will be formatted for display. Options displayed here are related to the Sub-Category property setting and include a wide variety of formats.

5. Special Properties - Depending on the Sub-Category property value, other special properties may be displayed.

Remember to click the Save icon to save your changes.

Create a Calculated Column

The ability to create calculated columns is a very useful enrichment activity, and DataHub makes it easy to do. A calculated column adds a new column in the data repository and is usually the result of operations on one or more existing columns. Once defined, a calculated column is created automatically when the Dataview is loaded.

In the following example, we'll create a new column that contains the results of multiplying the Quantity column by the UnitPrice column. To get started, we'll hover the mouse cursor over the gear icon of one of the existing columns, Quantity, as shown above, and select the Calculation drop-down menu item.

The Calculation panel, shown above, will appear, with these keyed elements:

1. Built-in Functions - Hover your mouse cursor over one of these built-in functions to see its description; click to add it to the expression area. See the section below about using special SQL functions.

2. Expression Area - This is where you build your expression for generating the calculated column values. You can type directly into this field, if desired. Field names must be enclosed in [square brackets] and, as soon as you type an opening square bracket, a list of available columns will be shown for you to choose from.

3. Operators - Click to add one of these standard operators to the expression.

4. Test Area - Click Test to run the expression and see the results in this area.

5. New Column - The new column will be inserted into the columns at the bottom of the page, to the right of the column whose gear icon menu you used, and marked as the selected column.

Click the Save icon to save the new calculated column.

Click the Reset icon to clear the expression and results before previewing the new column.

Click the Preview icon to preview the calculated column, which will be populated with values based on the expression.

Click the Delete icon to delete the calculated column altogether and start over.

To exit the Calculation panel, select a different column by clicking on it. You can then re-select the new calculated column and set its properties (for example, you may want to change its name).

Handling Null Values

Null values in a column pose special challenges. In an application consuming a Dataview, they may be considered when grouping on the column but may not be included in aggregates and charts. While this may be the desired behavior, there are situations where a Null value should be counted. In those cases, the Null must be converted to a real value (e.g. "N/A", 0, -1, "Unknown"). This calculated column expression can be used to convert the Null values to a real value:

IFNULL([column], 'N/A')

Special SQL Functions

For Dataviews that use database sources, you can use SQL functions if the built-in functions aren't sufficient. You can wrap an ANSI SQL-92-compliant function in a "SQL_FUNCTION( )" structure in your expression and it will be understood and executed as part of the query used in the data retrieval process. DataHub column names must still be enclosed in square brackets. Complex SQL statements, like this, are supported:

SQL_FUNCTION("CASE WHEN [Country] = 'Argentina' THEN 1 ELSE 0 END")
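The two expressions used in this section - the Quantity * UnitPrice calculation and the IFNULL Null-handling pattern - can be tried out against an in-memory SQLite table. The table and values below are invented for illustration; SQLite's IFNULL behaves like the expression shown above:

```python
# Sketch of the calculated-column expressions from this section:
# a Quantity * UnitPrice product, and IFNULL turning a Null into a
# countable 'N/A' value.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE Details (Quantity INTEGER, UnitPrice REAL, Region TEXT)")
con.executemany("INSERT INTO Details VALUES (?, ?, ?)",
                [(12, 14.0, "WA"),
                 (10, 3.6, None)])     # a Null Region to convert

result = con.execute(
    "SELECT Quantity * UnitPrice, IFNULL(Region, 'N/A') FROM Details"
).fetchall()
print(result)
```

The second row's Null Region comes back as 'N/A', so a consuming application can group and count it like any other value.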

Page 74

Create a Multi-Part Column

A multi-part column is a special column type that combines the values from two or more existing columns. It generally behaves in DataHub as a dimension (text) column. Once defined, a multi-part column is created automatically when the Dataview is loaded.

Common examples include complete addresses and full names. A complete address column might be created by combining values from street address, city, state, and zip code columns. A full name column might combine first name, middle initial, and last name column values. You could accomplish the same result with a calculated column; however, you wouldn't be able to manage the parts as you can with a multi-part column.

In the following example, we'll create a new column that combines the values of the First_Name and Last_Name columns. To get started, hover the mouse cursor over the gear icon of one of the existing columns, First_Name, as shown above, and select the Multipart drop-down menu item.

Page 75

The Multi-part panel, shown above, will appear, with these keyed elements:

1. Drop-Zone - This area is where you drag-and-drop the other columns that you want to combine with the original column.
2. Preview - Shows the columns with the selected separator between them.
3. Separation - Select one of the standard separator characters: None, Space, Comma, Slash, or Dash.
4. New Column - The new column will be inserted into the columns at the bottom of the page, to the right of the column whose gear icon menu you used, and marked as the selected column.

Now you're ready to add the next part of the new column:

Page 76

To combine columns, drag the desired column pill header from the bottom of the page and drop it into the drop zone, as shown above.
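The combining behavior can be sketched as illustrative Python (not DataHub code); the column names follow the First_Name/Last_Name example, and the separator set matches the panel's options:

```python
# Illustrative sketch of a multi-part column: join the values of the
# chosen columns with the selected separator (None, Space, Comma,
# Slash, or Dash), producing a single dimension (text) value per row.
SEPARATORS = {"None": "", "Space": " ", "Comma": ",", "Slash": "/", "Dash": "-"}

def multipart(row, parts, separator="Space"):
    """Combine the values of the listed columns into one text value."""
    return SEPARATORS[separator].join(str(row[p]) for p in parts)

row = {"First_Name": "Ada", "Last_Name": "Lovelace"}
full_name = multipart(row, ["First_Name", "Last_Name"], "Space")
print(full_name)  # Ada Lovelace
```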

Page 77

The panel will look like the example above and, if you click the Preview icon, the new column at the bottom of the page will populate with preview data. If you're combining more columns, just repeat the process.

Click the Save icon to save the new multi-part column. To exit the Multi-Part panel, select a different column by clicking on it. You can then re-select the new multi-part column and set its properties (for example, you may want to change its name).

Click the Reset icon to clear the dropped columns.

Click the Delete icon to delete the multi-part column altogether and start over.

Create a Conversion Column

Conversion columns let you convert measurements: DataHub makes it easy to convert column values, including time, length, temperature, and weight measurements. For example, a column containing data in ounces can be used to create a column containing the same data converted to pounds.

Page 78

This feature is similar to calculated columns but is limited to a fixed set of standard data conversions, so you don't have to find or recall the conversion formulae, or the SQL syntax you would need to accomplish the same thing with a calculated column.

Here are the categories and units that can be converted with a Conversion column:

Time - Year, Month, Week, Day, Hour, Minute, Second, Millisecond
Length - Kilometer, Meter, Centimeter, Millimeter, Mile, Yard, Foot, Inch, Nautical Mile
Temperature - Celsius, Fahrenheit, Kelvin
Weight - Pounds, Ounces, Kilograms, Grams, Milligrams, Long Ton, Short Ton, Metric Ton

In the following example, our dataview contains a column named MinTemp_C that contains Celsius scale data values. Our goal is to create a new Conversion column that will contain the same values converted to the Fahrenheit scale. To get started, hover the mouse cursor over the gear icon of the MinTemp_C column, as shown above, and select the Conversion drop-down menu item.
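The conversions themselves are standard formulae; as an illustrative Python sketch (not DataHub code), here are the two conversions mentioned on this page, Celsius to Fahrenheit and ounces to pounds:

```python
# Illustrative sketch of two standard conversions that a Conversion
# column performs automatically, so you don't have to recall the math.

def celsius_to_fahrenheit(c):
    """Temperature category: Celsius -> Fahrenheit."""
    return c * 9 / 5 + 32

def ounces_to_pounds(oz):
    """Weight category: Ounces -> Pounds (16 oz per lb)."""
    return oz / 16

print(celsius_to_fahrenheit(100.0))  # 212.0
print(ounces_to_pounds(32))          # 2.0
```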

Page 79

The Conversion panel, shown above, will appear, with these keyed elements:

1. Column - The name of the column containing the values to be converted.
2. Category - The category of measurement units being converted: Time, Length, Temperature, or Weight.
3. From Value & Units - The source measurement unit and an example source value from the first record.
4. To Value & Units - The target measurement unit and an example converted value from the first record.
5. New Column - The new column will be inserted into the columns at the bottom of the page, to the right of the column whose gear icon menu you used, and marked as the selected column.

Click the Save icon to save the new conversion column. To exit the Conversion panel, select a different column by clicking on it. You can then re-select the new conversion column and set its properties (for example, you may want to change its name).

Click the Preview icon to preview the conversion column, which will be populated with values based on the conversion.

Click the Delete icon to delete the conversion column altogether and start over.

Page 80

Data Repository Management

DataHub's data repository is self-tuning and self-optimizing, resulting in excellent performance. This chapter discusses the data repository and how administrators can manage it.

The DataHub data repository is a self-tuning columnar data store that offers high performance, even for very large (250M+ records) data sets. Generally, it requires very little attention and works autonomously. However, there are some administrative tasks associated with it, and they're discussed in the following sections.

Managing Usage

It's easy for Administrators to see the amount of repository storage space being used - the usage icon is part of the main menu. It provides both a percentage value and a visual indicator for the amount of storage being used. The standard DataHub installation has a 1 TB data storage ceiling.

Click Manage Objects to display the Manage Repository Objects page:

Page 81

The purely informational page, shown above, shows the details of the Dataviews and data objects currently loaded into the repository. The information displayed includes:

- The number of records loaded for each data Source
- The data object name and number of records
- The last time the data was refreshed
- The current state (whether it's in use or shared)
- The Dataviews currently using the object
- The amount of storage being used

The display can be searched and sorted using the controls provided.

Deleting Data Objects

Data objects that are not in use can be deleted. Click the Unassociated Objects option to filter the list of objects down to those that are not in use. Click the red Delete icon in the right-most column of the table to delete an object. If a Padlock icon is displayed in the right-most column instead, you'll need to remove the object from its associated Dataviews before you can delete it.

Page 82

Changing the Repository Password

You may need to change the repository password at some time. To do so, click Change Repository Password, highlighted in the image above. A dialog box will appear with appropriate input fields. The repository Admin user name appears at the top of this dialog box.

Backing Up the Repository

There are two primary components of the DataHub repository: the metadata and the data cache. The metadata is the schema structure of the repository. It identifies the connections, data sources, data objects and columns, relationships, and other non-data constructs that allow DataHub to function. The data cache is all of the data that has been loaded into the repository.

While you may refresh the data cache regularly and consider a backup of it unnecessary, the metadata is possibly the result of a lot of work and should be backed up, especially after any changes to Sources and Dataviews have been made. Backup is not automatic, so it should be scheduled or become a regular maintenance task for an admin user.

Backup

DataHub's Backup tool lets you make a full backup of the repository metadata and/or data cache. Select the Manage Repository Objects page's Backup and Restore tab to display the tool's controls:

Page 83

Take the following steps to make a backup:

1. Backup Type - Select the Full Backup or Metadata And/Or Data Cache Backup option. If you select the latter, additional controls will appear for selecting and identifying which items to back up.
2. Location - The read-only location of the backup folder, based on the repository installation location, is displayed here. This location can be changed in C:\Program Files\Logi Analytics\DataHub\web.config if desired; search for "backup location" and restart DataHub after changing the value.
3. Friendly Name - (Optional) Enter an arbitrary identifying name if it will help you distinguish backups, but keep in mind that the date and time are automatically appended (see below).
4. Backup Now - Click to start the backup.
5. Add Schedule - Click to create a scheduled backup - see the section below for more details.

The information from #2 and #3 above provides the format of the resulting backup file path and name. In the case of the example above, it would be:

Page 84

%LOGI_DB_SERVER%\Infobright-postgres\Backup\fb-Q1Backup zip

The "fb" highlighted in the name indicates a full backup; if you decided to back up the metadata or data cache separately, the name would include "md" or "dc" instead, respectively.

You'll be asked to confirm starting the backup in the dialog box shown above. Since a backup is intended to take a snapshot in time, the process will check to see if there are other pending updates to the repository and will queue the backup if necessary. When the backup runs, it will put the repository in Maintenance Mode, making it unavailable to other users until it completes. Click Start to begin the backup.

During the backup, users will see the warning shown above.
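Reading the type code out of a backup file name is simple string handling; here is an illustrative Python sketch (not part of DataHub) of the "fb"/"md"/"dc" convention described above:

```python
# Illustrative sketch: determine the nature of a DataHub backup file
# from the two-letter code at the front of its file name.
BACKUP_TYPES = {
    "fb": "full backup",
    "md": "metadata only",
    "dc": "data cache only",
}

def backup_type(file_name):
    """Return the backup type implied by the file name's prefix code."""
    return BACKUP_TYPES.get(file_name[:2].lower(), "unknown")

print(backup_type("fb-Q1Backup.zip"))  # full backup
print(backup_type("dc-Q1Backup.zip"))  # data cache only
```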

Page 85

Scheduling a Backup

The most convenient way to make backups is to schedule them to occur at regular intervals. If you click Add Schedule in the page, these controls will be displayed:

Use these steps to create a scheduled backup:

1. Existing Backup - This indicates that a scheduled backup has already been created. Click the link to modify or delete it.
2. Backup Type - Select the Full Backup or Metadata And/Or Data Cache Backup option. If you select the latter, additional controls will appear for selecting and identifying which items to back up.
3. Location - The read-only location of the backup folder, based on the repository installation location, is displayed here. This location can be changed in C:\Program Files\Logi Analytics\DataHub\web.config if desired; search for "backup location" and restart DataHub after changing the value.
4. Friendly Name - (Optional) Enter an arbitrary identifying name if it will help you distinguish backups, but keep in mind that the date and time are automatically appended (see below).
5. Recurs - Select the frequency (Daily, Weekly, Monthly), interval, and start time for the backup.

Page 86

6. Timezone - Select the appropriate time zone.
7. Start and End Date - Enter the Start Date (required) and End Date (optional) for the backup. Leaving the End Date blank will cause the backup to continue to occur at the specified frequency and interval until the schedule is modified or deleted.

Remember that, when the scheduled backup occurs, the repository will be put into Maintenance Mode, making it unavailable to other users until the operation completes.

Click the Save icon to save the schedule.
Click the Reset icon to clear any changes in the controls during this sequence.
Click the Delete icon to delete this schedule.
Click the Cancel icon to cancel this operation and hide the controls.

Restoring a Repository Backup

If necessary, you can restore the repository from an existing backup file, overwriting its contents.

Page 87

Follow these steps to restore a backup file:

1. Backup Type - Select the Full Restore or Metadata And/Or Data Cache Restore option, depending on the nature of the backup file. If you select the latter, additional controls will appear for selecting and identifying which items to restore. You can determine the nature of the backup file by looking for the code embedded in its file name: fb-q1backup zip, where "fb" = full backup, "md" = metadata only, "dc" = data cache only.
2. Location - The read-only location of the backup folder, based on the repository installation location, is displayed here. This location can be changed in C:\Program Files\Logi Analytics\DataHub\web.config if desired; search for "backup location" and restart DataHub after changing the value.
3. File Name - Select an existing backup file.
4. Restore Now - Click to start the restore operation.

You'll be asked to confirm starting the restore:

Page 88

When the restore runs, it will put the repository in Maintenance Mode, making it unavailable to other users until it completes. Click Start to begin the restore. During the restore, users will see the warning shown above.

Page 89

Scheduling Data Refreshes

Logi DataHub allows you to create Dataviews using data from databases, applications, and files. As a "snapshot" of this data, the DataHub repository will likely require periodic updating in order to "refresh" the data. This chapter discusses that operation.

DataHub makes it easy to refresh the data in your repository on a scheduled basis or manually. Essentially, you will need to schedule a "data reload" for each data object. From the Dataview Status page, you also have the option of manually triggering an immediate reload.

During a refresh operation, DataHub will compare the schema from the data source with the column information previously loaded. If they match, the scheduled reload will continue. If the schema no longer matches, however, the reload will stop and a warning will be displayed.

Immediate Refresh

To immediately refresh data, navigate to the page for the desired Dataview and click its Dataview Status tab. If there are multiple objects, select the desired one, then click Refresh Now, as shown above.

Page 90

A prompt will appear requesting that you select the type of refresh to execute:

Replace Data - All data in the cache will be replaced by new data retrieved from the data source.
Append Data - New data retrieved from the data source will be appended to the existing data. Existing data will not be updated.

Click the Refresh Now icon to start the data loading operation immediately. Remember that, when the refresh occurs, the Dataview will be unavailable to other users until the operation completes.

Schedule a Refresh

The most convenient way to refresh data is to schedule it to occur at regular intervals. If there are multiple objects, select the desired one, then click Add Schedule, and these controls will be displayed:
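The difference between the two refresh types can be sketched in illustrative Python (not DataHub code; it only models the cache semantics described above):

```python
# Illustrative sketch: Replace Data discards the cached rows and keeps
# only the newly retrieved ones; Append Data adds the new rows to the
# cache and never updates existing rows.
def refresh(cache, retrieved, mode):
    if mode == "Replace Data":
        return list(retrieved)
    if mode == "Append Data":
        return cache + list(retrieved)
    raise ValueError(f"unknown refresh type: {mode}")

cache = [{"id": 1, "qty": 5}]
new_rows = [{"id": 2, "qty": 7}]

replaced = refresh(cache, new_rows, "Replace Data")
appended = refresh(cache, new_rows, "Append Data")

print(len(replaced))  # 1
print(len(appended))  # 2
```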

Page 91

Use these steps to create a scheduled refresh:

1. Refresh Type - Select Replace Data to replace all data with new data retrieved from the data source. Select Append Data to have new data retrieved from the data source appended to the existing data - existing data will not be updated.
2. Location - (File-type data sources only) Enter a fully-qualified file name with path. Click Test to ensure that the server and scheduler can access the file.
3. Recurs - Select the frequency (Daily, Weekly, Monthly), interval, and start time for the refresh.
4. Timezone - Select the appropriate time zone.
5. Start and End Date - Enter the Start Date (required) and End Date (optional) for the refresh. Leaving the End Date blank will cause the refresh to continue to occur at the specified frequency and interval until the schedule is modified or deleted.

Page 92

Remember that, when the refresh occurs, the Dataview will be unavailable to other users until the operation completes.

Click the Save icon to save the schedule.
Click the Refresh Now icon to refresh the data immediately.
Click the Reset icon to clear any changes in the controls during this sequence.
Click the Delete icon to delete this schedule.
Click the Cancel icon to cancel this operation and hide the controls.

Once the schedule has been saved, you'll notice some changes in the Dataview Status tab. As shown above, a special "scheduled" icon (circled above) appears in the selected object graphic, the scheduled time appears with a link (highlighted above; click to edit) in the object details, and an Edit Schedule button is now visible.

Page 93

Using DataHub with Logi Info

Logi DataHub is a separate Logi product that is used to retrieve and prepare data, which it stores in its own repository database. That data is available as "Dataviews", which can be accessed as a datasource in Logi Info applications. This chapter discusses how to access and use Dataviews in Logi Info applications.

Logi DataHub is a data virtualization product that connects to multiple sources, caches data for high performance, and prepares data in intuitive ways, so you can deliver efficient reporting and analysis that doesn't affect transactional systems. DataHub includes DataSmart profiling, which automatically identifies the types and formats of your data and provides easy ways of enriching it through new calculations and manipulation of multi-part data. It uses its own repository database to store prepared data, and that database can be accessed by a Logi Info application just like any other datasource.

Connecting to Dataviews

Logi Info applications use a special Connection element, Connection.DataHub, in the _Settings definition to connect to Dataviews. This Connection element is available in Logi Info v SP2 and later. It provides a single connection point to all of the Dataviews available in the DataHub repository database. The element's attributes are:

Page 94

ID - (Required) Specifies an element identifier, unique within the application.
Server - (Required) Specifies the name of the server hosting the DataHub repository database.
User - (Required) Specifies a user name for accessing the DataHub repository database. The standard installed database user name is postgres.
Command Timeout - Specifies the amount of time, in seconds, before the request to connect to the DataHub repository is presumed to have failed. The default value is 60 seconds.
Connection String - Specifies a full connection string to the DataHub repository database. If a value is defined here, it will override all other attributes for this element and be used for the connection to the database.
Database - Specifies the name of the DataHub repository database. The standard installed database name (the default value) is logidatahub.
Password - Specifies a password for accessing the DataHub repository database.
Port - Specifies the port address of the DataHub repository database. The standard installed database port number is the default value.
SQL Syntax - Specifies the type of database server. The value is used by ActiveSQL datalayers, which must know the database type to generate correct SQL statements.

Once the connection has been made, Dataviews are available using the standard Logi Studio data retrieval elements and tools. In the example shown above, Studio's Database Browser tool is being used to explore the DataHub repository.
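Putting the attributes above together, a minimal Connection.DataHub element in _Settings might look like the sketch below. This is an assumption-laden illustration, not a verified definition: the Server and Password values are placeholders, the User and Database values are the standard installed defaults from the table above, and the exact attribute spellings should be confirmed in Logi Studio's attribute panel.

```xml
<!-- Sketch only: placeholder Server and Password values; User and
     Database show the standard installed defaults described above. -->
<Connection.DataHub
    ID="connDataHub"
    Server="myDataHubServer"
    User="postgres"
    Password="myPassword"
    Database="logidatahub" />
```

With this element in place, the connection's ID can be referenced by the standard data retrieval elements, and the Database Browser tool can be used to explore the repository as shown above.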

Page 95

Dataviews in the Metadata Builder Wizard

Logi Info installations that include the Self-Service Reporting Module add-on enable the Active Query Builder and Metadata elements in Logi Studio. The Active Query Builder allows Analysis Grid users to determine what dataset to work with by giving them a way to interactively select tables and joins at runtime.

The tables, views, columns, and relationships that are available for use with the Active Query Builder are determined by building a "metadata file", associated with a Connection element, which enumerates all of the database objects that will be available to users for selection in the Analysis Grid. The Metadata element provides a tool, the Metadata Builder wizard, which is used during development to create the file.

The Metadata Builder wizard can create metadata files for DataHub dataviews. To do this, the Metadata element is added as a child of the Connection.DataHub element in _Settings, as shown above.

Page 96

The Metadata Builder wizard is then used, as shown above, to create the metadata file for the DataHub dataview. The Active Query Builder is then able to use the DataHub metadata and display it in the Analysis Grid interface, as shown above.

Page 97

Advanced Repository Configurations

DataHub's columnar data repository generally requires no configuration after installation. However, there may be special circumstances where a tweak is desirable, and related configurations are discussed in this chapter.

Logi DataHub uses the Infobright columnar relational database as its data repository. It's a self-tuning, high-performance storage system that generally requires nothing from administrators after installation. However, there are special circumstances we've identified wherein tweaking the database's configuration or schema may be desirable.

Allocating Memory

The optimum amount of main memory (the "heap") available to the Infobright server is automatically determined and set by the DataHub installer, based on the physical memory available in the server. Infobright's suggestion is that 60-80% of memory be allocated to the database server.

However, there may be situations that warrant changing the setting. For example, the initial memory setting assumes that there are no other services consuming significant memory on the machine. If other services do consume significant memory, a lower memory setting for Infobright might be better. You don't want to allocate so much memory to Infobright that you "starve" other applications. Or, perhaps you decided to physically add more memory to the server after the DataHub installation and want to change the Infobright allocation to take advantage of it (in which case, do the 60-80% math on the new memory size to get the correct setting value).

In any case, assuming a standard installation location, to change the memory allocation:

1. Navigate to, and make a safety copy of, this file: C:\Program Files\Logi Analytics\DataHub\Infobright-postgres\ib_data\infobright.cnf
2. Use a text editor to open the original file and then search in it for "ServerMainHeapSize".
3. Set the ServerMainHeapSize value to a new value (in MB) and save the file.
4. Use the Windows Admin Tools Services tool to stop and restart the infobright-iee-postgres service.

Page 98

Configuring Look-up Columns

Infobright supports a column designation of "Lookup", which causes columns to be stored uncompressed, in memory, for very fast access. It improves performance when applied to columns that will be used for grouping, filtering, and aggregating data. When data is loaded, DataHub applies this designation automatically to Dimension (text only) columns, based on certain criteria, and, under most circumstances, it results in very fast query performance. However, if you have a very large table with a very large number of distinct values, you may consume significant amounts of memory using lookups, adversely impacting performance.

The criteria DataHub uses to designate lookup columns is determined by their "cardinality" - in this case, the ratio of the number of distinct values to the total number of records. This process is illustrated in the table shown above. The default cardinality threshold is 20%.

To change that threshold, assuming a standard installation location, navigate to and open this text file:

C:\Program Files\Logi Analytics\DataHub\Web.config

and search for this line:

<add key="cardinalitythreshold" value="0.2" />

Change the value to the new desired cardinality threshold percentage, save the file, and restart DataHub. Remember that changes to this value will not affect currently loaded data - you must reload the data to have it take effect.
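The cardinality test can be sketched in illustrative Python (not DataHub code; whether DataHub compares with "below" or "at or below" the threshold is an assumption here - this sketch treats values at or below the threshold as lookup candidates):

```python
# Illustrative sketch: a Dimension column qualifies as a "Lookup"
# candidate when its cardinality ratio (distinct values / total
# records) falls at or below the configured threshold (default 0.2).
def is_lookup_candidate(values, threshold=0.2):
    total = len(values)
    if total == 0:
        return False
    return len(set(values)) / total <= threshold

# 2 distinct values across 10 records -> ratio 0.2 -> candidate.
states = ["VA", "MD"] * 5
# 10 distinct values across 10 records -> ratio 1.0 -> not a candidate.
ids = list(range(10))

print(is_lookup_candidate(states))  # True
print(is_lookup_candidate(ids))     # False
```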


More information

Oracle Big Data Cloud Service, Oracle Storage Cloud Service, Oracle Database Cloud Service

Oracle Big Data Cloud Service, Oracle Storage Cloud Service, Oracle Database Cloud Service Demo Introduction Keywords: Oracle Big Data Cloud Service, Oracle Storage Cloud Service, Oracle Database Cloud Service Goal of Demo: Oracle Big Data Preparation Cloud Services can ingest data from various

More information

Install and upgrade Qlik Sense. Qlik Sense 3.0 Copyright QlikTech International AB. All rights reserved.

Install and upgrade Qlik Sense. Qlik Sense 3.0 Copyright QlikTech International AB. All rights reserved. Install and upgrade Qlik Sense Qlik Sense 3.0 Copyright 1993-2016 QlikTech International AB. All rights reserved. Copyright 1993-2016 QlikTech International AB. All rights reserved. Qlik, QlikTech, Qlik

More information

ActivityTimeline User Guide

ActivityTimeline User Guide ActivityTimeline User Guide https://activitytimeline.com Copyright 2018 ActivityTimeline Contents 1. Getting Started... 3 1.1 Overview... 3 1.2 Logging In and Out... 3 1.3 Dashboard Overview... 5 1.4 Header

More information

SharePoint: Fundamentals

SharePoint: Fundamentals SharePoint: Fundamentals This class will introduce you to SharePoint and cover components available to end users in a typical SharePoint site. To access SharePoint, you will need to log into Office 365.

More information

BBVA Compass Spend Net Payables

BBVA Compass Spend Net Payables User Guide BBVA Compass Spend Net Payables User Guide Vault Services Table of Contents Introduction 2 Technical Requirements 2 Getting started 3 Sign In 3 General Navigation 4 Upload/Create Payment 5

More information

Administrator s Guide

Administrator s Guide Administrator s Guide 1995 2011 Open Systems Holdings Corp. All rights reserved. No part of this manual may be reproduced by any means without the written permission of Open Systems, Inc. OPEN SYSTEMS

More information

SAP BusinessObjects Profitability and Cost Management Upgrade Guide

SAP BusinessObjects Profitability and Cost Management Upgrade Guide PUBLIC SAP BusinessObjects Profitability and Cost Management Document Version: 10.0 2019-04-09 SAP BusinessObjects Profitability and Cost Management Upgrade Guide 2019 SAP SE or an SAP affiliate company.

More information

Logi Ad Hoc Management Console Overview

Logi Ad Hoc Management Console Overview Logi Ad Hoc Management Console Overview Version 10 Last Updated: July 2010 Page 2 Table of Contents INTRODUCTION... 3 System Requirements... 4 Management Console Overview... 5 Configuration Wizard Overview...

More information

MicroStrategy Analytics Desktop

MicroStrategy Analytics Desktop MicroStrategy Analytics Desktop Quick Start Guide MicroStrategy Analytics Desktop is designed to enable business professionals like you to explore data, simply and without needing direct support from IT.

More information

EDAConnect-Dashboard User s Guide Version 3.4.0

EDAConnect-Dashboard User s Guide Version 3.4.0 EDAConnect-Dashboard User s Guide Version 3.4.0 Oracle Part Number: E61758-02 Perception Software Company Confidential Copyright 2015 Perception Software All Rights Reserved This document contains information

More information

AT&T IP Flexible Reach Group Administrator Guide

AT&T IP Flexible Reach Group Administrator Guide AT&T IP Flexible Reach Group Administrator Guide 1 Get Started... 7 Log In... 8 What a Group Administrator Can Do... 10 About Premier... 13 Use Premier... 14 Use the AT&T IP Flexible Reach Customer Portal...

More information

How to Use Google. Sign in to your Chromebook. Let s get started: The sign-in screen. https://www.youtube.com/watch?v=ncnswv70qgg

How to Use Google. Sign in to your Chromebook. Let s get started: The sign-in screen. https://www.youtube.com/watch?v=ncnswv70qgg How to Use Google Sign in to your Chromebook https://www.youtube.com/watch?v=ncnswv70qgg Use a Google Account to sign in to your Chromebook. A Google Account lets you access all of Google s web services

More information

Logi Ad Hoc Reporting System Administration Guide

Logi Ad Hoc Reporting System Administration Guide Logi Ad Hoc Reporting System Administration Guide Version 12 July 2016 Page 2 Table of Contents INTRODUCTION... 4 APPLICATION ARCHITECTURE... 5 DOCUMENT OVERVIEW... 6 GENERAL USER INTERFACE... 7 CONTROLS...

More information

Ektron Advanced. Learning Objectives. Getting Started

Ektron Advanced. Learning Objectives. Getting Started Ektron Advanced 1 Learning Objectives This workshop introduces you beyond the basics of Ektron, the USF web content management system that is being used to modify department web pages. This workshop focuses

More information

ControlPoint. Advanced Installation Guide. September 07,

ControlPoint. Advanced Installation Guide. September 07, ControlPoint Advanced Installation Guide September 07, 2017 www.metalogix.com info@metalogix.com 202.609.9100 Copyright International GmbH., 2008-2017 All rights reserved. No part or section of the contents

More information

Product Documentation. ER/Studio Portal. User Guide. Version Published February 21, 2012

Product Documentation. ER/Studio Portal. User Guide. Version Published February 21, 2012 Product Documentation ER/Studio Portal User Guide Version 1.6.3 Published February 21, 2012 2012 Embarcadero Technologies, Inc. Embarcadero, the Embarcadero Technologies logos, and all other Embarcadero

More information

Logi Ad Hoc Reporting Management Console Usage Guide

Logi Ad Hoc Reporting Management Console Usage Guide Logi Ad Hoc Reporting Management Console Usage Guide Version 12.1 July 2016 Page 2 Contents Introduction... 5 Target Audience... 5 System Requirements... 6 Components... 6 Supported Reporting Databases...

More information

Getting Started with Soonr

Getting Started with Soonr WWW.SOONR.COM Getting Started with Soonr A Quick Start Guide for New Users Soonr Inc. 12/19/2012 Revision 1.1 Copyright 2012, Soonr Inc., all rights reserved. Table of Contents 1 How Soonr Workplace Works...

More information

LiveNX Upgrade Guide from v5.1.2 to v Windows

LiveNX Upgrade Guide from v5.1.2 to v Windows LIVEACTION, INC. LiveNX Upgrade Guide from v5.1.2 to v5.1.3 - Windows UPGRADE LiveAction, Inc. 3500 Copyright WEST BAYSHORE 2016 LiveAction, ROAD Inc. All rights reserved. LiveAction, LiveNX, LiveUX, the

More information

Visual Workflow Implementation Guide

Visual Workflow Implementation Guide Version 30.0: Spring 14 Visual Workflow Implementation Guide Note: Any unreleased services or features referenced in this or other press releases or public statements are not currently available and may

More information

Armatus 2.0 Administrator Procedures

Armatus 2.0 Administrator Procedures 2015 Armatus 2.0 Administrator Procedures Praesidium 2015. All rights reserved. Armatus 2.0 Administrator Procedures Overview Introduction This guide shows you how to perform tasks in Armatus 2.0 Administrator

More information

Alfresco Content Services 5.2. Getting Started Guide

Alfresco Content Services 5.2. Getting Started Guide Alfresco Content Services 5.2 Getting Started Guide Contents Contents Getting started with Alfresco Share... 3 Signing in...3 Personalizing Alfresco Share... 4 Setting up your dashboard... 4 Updating your

More information

Workstation Configuration

Workstation Configuration Workstation Configuration September 22, 2015 - Version 9 & 9.1 - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

More information

Talend Open Studio for Data Quality. User Guide 5.5.2

Talend Open Studio for Data Quality. User Guide 5.5.2 Talend Open Studio for Data Quality User Guide 5.5.2 Talend Open Studio for Data Quality Adapted for v5.5. Supersedes previous releases. Publication date: January 29, 2015 Copyleft This documentation is

More information

TABLE OF CONTENTS PAGE

TABLE OF CONTENTS PAGE Alchemex for SAP Business One Getting Started Guide Sept 2010 TABLE OF CONTENTS PAGE Getting Started Guide Introduction... 1 What is Alchemex for SAP Business One?... 1 System Requirements... 2 Recommended

More information

NTP Software File Auditor for Windows Edition

NTP Software File Auditor for Windows Edition NTP Software File Auditor for Windows Edition An NTP Software Installation Guide Abstract This guide provides a short introduction to installation and initial configuration of NTP Software File Auditor

More information

Installing and Setting Up the Snap-on EPC. Rev.1.10 (10 Oct 2013) PN EN

Installing and Setting Up the Snap-on EPC. Rev.1.10 (10 Oct 2013) PN EN Installing and Setting Up the Snap-on EPC Rev.1.10 (10 Oct 2013) PN 275-0800-EN Table of Contents 1. Introduction... 3 2. Minimum Requirements... 4 3. Installing the Snap-on EPC... 6 4. Licensing the Snap-on

More information

This guide details the deployment and initial configuration necessary to maximize the value of JetAdvantage Insights.

This guide details the deployment and initial configuration necessary to maximize the value of JetAdvantage Insights. HP JetAdvantage Insights Deployment Guide This guide details the deployment and initial configuration necessary to maximize the value of JetAdvantage Insights. 1. Overview HP JetAdvantage Insights provides

More information

AT&T IP Flexible Reach Department Administrator Guide

AT&T IP Flexible Reach Department Administrator Guide AT&T IP Flexible Reach Department Administrator Guide 1 Contents Get Started... 5 Log In... 6 What a Department Administrator Can Do... 8 About Premier... 11 Use Premier... 12 Use the AT&T IP Flexible

More information

Netwrix Auditor for Active Directory

Netwrix Auditor for Active Directory Netwrix Auditor for Active Directory Quick-Start Guide Version: 8.0 4/22/2016 Legal Notice The information in this publication is furnished for information use only, and does not constitute a commitment

More information

OU EDUCATE TRAINING MANUAL

OU EDUCATE TRAINING MANUAL OU EDUCATE TRAINING MANUAL OmniUpdate Web Content Management System El Camino College Staff Development 310-660-3868 Course Topics: Section 1: OU Educate Overview and Login Section 2: The OmniUpdate Interface

More information

Sage 50 U.S. Edition Intelligence Reporting Getting Started Guide

Sage 50 U.S. Edition Intelligence Reporting Getting Started Guide Sage Intelligence Reporting Sage 50 U.S. Edition Intelligence Reporting Getting Started Guide Table of Contents Introduction... 2 System requirements... 3 How it works... 4 Getting started guide... 5 Running

More information

Griffin Training Manual Grif-WebI Introduction (For Analysts)

Griffin Training Manual Grif-WebI Introduction (For Analysts) Griffin Training Manual Grif-WebI Introduction (For Analysts) Alumni Relations and Development The University of Chicago Table of Contents Chapter 1: Defining WebIntelligence... 1 Chapter 2: Working with

More information

Troubleshooting. Participants List Displays Multiple Entries for the Same User

Troubleshooting. Participants List Displays Multiple Entries for the Same User Participants List Displays Multiple Entries for the Same User, page 1 Internet Explorer Browser Not Supported, page 2 "404 Page Not Found" Error Encountered, page 2 Cannot Start or Join Meeting, page 2

More information

Spotfire Advanced Data Services. Lunch & Learn Tuesday, 21 November 2017

Spotfire Advanced Data Services. Lunch & Learn Tuesday, 21 November 2017 Spotfire Advanced Data Services Lunch & Learn Tuesday, 21 November 2017 CONFIDENTIALITY The following information is confidential information of TIBCO Software Inc. Use, duplication, transmission, or republication

More information

TABLE OF CONTENTS PAGE

TABLE OF CONTENTS PAGE Alchemex 7 for Sage 50 Getting Started Guide Oct 2010 1 TABLE OF CONTENTS PAGE Getting Started Guide Introduction... 5 What is Alchemex 7 for Sage 50?... 5 System Requirements... 6 Recommended System Requirements...

More information

KENDLE CLINICAL TRIALS PORTAL USER GUIDE

KENDLE CLINICAL TRIALS PORTAL USER GUIDE KENDLE CLINICAL TRIALS PORTAL USER GUIDE Notes to Users Copyright Copyright by Kendle International Inc., 2010. All rights reserved. No part of this document may be downloaded, reproduced, transmitted,

More information

Introduction & Navigation

Introduction & Navigation Introduction & Navigation Logging In to Marketing Cloud 1. Open a tab in either the Chrome or Firefox web browser. 2. Place your cursor in the URL bar then type mc.exacttarget.com. 3. Strike the Enter

More information

Enterprise Vault.cloud CloudLink Google Account Synchronization Guide. CloudLink to 4.0.3

Enterprise Vault.cloud CloudLink Google Account Synchronization Guide. CloudLink to 4.0.3 Enterprise Vault.cloud CloudLink Google Account Synchronization Guide CloudLink 4.0.1 to 4.0.3 Enterprise Vault.cloud: CloudLink Google Account Synchronization Guide Last updated: 2018-06-08. Legal Notice

More information

Overview. Top. Welcome to SysTools MailXaminer

Overview. Top. Welcome to SysTools MailXaminer Table of Contents Overview... 2 System Requirements... 3 Installation of SysTools MailXaminer... 4 Uninstall Software... 6 Software Menu Option... 8 Software Navigation Option... 10 Complete Steps to Recover,

More information

Global Software, Inc.'s Spreadsheet Writeback User Manual. Release V14 R1 M2

Global Software, Inc.'s Spreadsheet Writeback User Manual. Release V14 R1 M2 Global Software, Inc.'s Spreadsheet Writeback User Manual Release V14 R1 M2 Worldwide Headquarters 3201 Beechleaf Court, Suite 170 Raleigh, NC 27604 USA +1.919.872.7800 www.glbsoft.com EMEA/APAC Headquarters

More information

www.insightsoftware.com for JD Edwards World and EnterpriseOne Version: 3.3 Last Updated: September 2, 2011 Contents 1. Architecture... 3 Overview... 3 Deployment... 4 Database Space... 4 Using This Guide...

More information

IMPLEMENTING DATA.COM CLEAN FOR ACCOUNTS, CONTACTS, AND LEADS

IMPLEMENTING DATA.COM CLEAN FOR ACCOUNTS, CONTACTS, AND LEADS IMPLEMENTING DATA.COM CLEAN FOR ACCOUNTS, CONTACTS, AND LEADS Data.com Clean Overview In addition to finding and adding new accounts, contacts, and leads, Data.com cleans your existing Salesforce data

More information

SCHULICH MEDICINE & DENTISTRY Website Updates August 30, Administrative Web Editor Guide v6

SCHULICH MEDICINE & DENTISTRY Website Updates August 30, Administrative Web Editor Guide v6 SCHULICH MEDICINE & DENTISTRY Website Updates August 30, 2012 Administrative Web Editor Guide v6 Table of Contents Chapter 1 Web Anatomy... 1 1.1 What You Need To Know First... 1 1.2 Anatomy of a Home

More information

Informatica Cloud Data Integration Winter 2017 December. What's New

Informatica Cloud Data Integration Winter 2017 December. What's New Informatica Cloud Data Integration Winter 2017 December What's New Informatica Cloud Data Integration What's New Winter 2017 December January 2018 Copyright Informatica LLC 2016, 2018 This software and

More information

Contents Release Notes System Requirements Using Jive for Office

Contents Release Notes System Requirements Using Jive for Office Jive for Office TOC 2 Contents Release Notes...3 System Requirements... 4 Using Jive for Office... 5 What is Jive for Office?...5 Working with Shared Office Documents... 5 Get set up...6 Get connected

More information

Doc. Version 1.0 Updated:

Doc. Version 1.0 Updated: OneStop Reporting Report Composer 3.5 User Guide Doc. Version 1.0 Updated: 2012-01-02 Table of Contents Introduction... 2 Who should read this manual... 2 What s included in this manual... 2 Symbols and

More information

RONA e-billing User Guide

RONA e-billing User Guide RONA e-billing Contractor Self-Service Portal User Guide RONA e-billing User Guide 2015-03-10 Table of Contents Welcome to RONA e-billing What is RONA e-billing?... i RONA e-billing system requirements...

More information

Group Administrator Guide

Group Administrator Guide Get Started... 4 What a Group Administrator Can Do... 7 About Premier... 10 Use Premier... 11 Use the AT&T IP Flexible Reach Customer Portal... 14 Search and Listing Overview... 17 What s New in the Group

More information

FORK Xchange Suite User Manual Version 3.0

FORK Xchange Suite User Manual Version 3.0 FORK Xchange Suite User Manual Version 3.0 Primestream Corporation Copyright 2014 Primestream Corp. All rights reserved. Your rights to the software are governed by the accompanying software license agreement.

More information

CalPlan. Creating a Unit Plan Navigating CalPlan Workbook 1/25/18

CalPlan. Creating a Unit Plan Navigating CalPlan Workbook 1/25/18 CalPlan Creating a Unit Plan Workbook 1/25/18 Table of Contents Exercise 1: Log into the Workspace & Run a CalPlan Report... 3 Exercise 2: Launching CalPlan and Setting Your Entity... 10 Exercise 3: Actualized

More information

CleanMyPC User Guide

CleanMyPC User Guide CleanMyPC User Guide Copyright 2017 MacPaw Inc. All rights reserved. macpaw.com CONTENTS Overview 3 About CleanMyPC... 3 System requirements... 3 Download and installation 4 Activation and license reset

More information

Web Console Setup & User Guide. Version 7.1

Web Console Setup & User Guide. Version 7.1 Web Console Setup & User Guide Version 7.1 1 Contents Page Number Chapter 1 - Installation and Access 3 Server Setup Client Setup Windows Client Setup Mac Client Setup Linux Client Setup Interoperation

More information

8.0 Help for Community Managers About Jive for Google Docs...4. System Requirements & Best Practices... 5

8.0 Help for Community Managers About Jive for Google Docs...4. System Requirements & Best Practices... 5 for Google Docs Contents 2 Contents 8.0 Help for Community Managers... 3 About Jive for Google Docs...4 System Requirements & Best Practices... 5 Administering Jive for Google Docs... 6 Understanding Permissions...6

More information

BE Share. Microsoft Office SharePoint Server 2010 Basic Training Guide

BE Share. Microsoft Office SharePoint Server 2010 Basic Training Guide BE Share Microsoft Office SharePoint Server 2010 Basic Training Guide Site Contributor Table of Contents Table of Contents Connecting From Home... 2 Introduction to BE Share Sites... 3 Navigating SharePoint

More information

Introduction...5. Chapter 1. Installing System Installing Server and ELMA Designer... 7

Introduction...5. Chapter 1. Installing System Installing Server and ELMA Designer... 7 Chapter 1 Contents Installing System Contents Introduction...5 Chapter 1. Installing System... 6 1.1. Installing Server and ELMA Designer... 7 1.2. Verifying ELMA Server and ELMA Designer Installation...

More information

TopView SQL Configuration

TopView SQL Configuration TopView SQL Configuration Copyright 2013 EXELE Information Systems, Inc. EXELE Information Systems (585) 385-9740 Web: http://www.exele.com Support: support@exele.com Sales: sales@exele.com Table of Contents

More information

End User s Guide Release 5.0

End User s Guide Release 5.0 [1]Oracle Application Express End User s Guide Release 5.0 E39146-04 August 2015 Oracle Application Express End User's Guide, Release 5.0 E39146-04 Copyright 2012, 2015, Oracle and/or its affiliates. All

More information

Tableau Server - 101

Tableau Server - 101 Tableau Server - 101 Prepared By: Ojoswi Basu Certified Tableau Consultant LinkedIn: https://ca.linkedin.com/in/ojoswibasu Introduction Tableau Software was founded on the idea that data analysis and subsequent

More information

Multi-Sponsor Environment. SAS Clinical Trial Data Transparency User Guide

Multi-Sponsor Environment. SAS Clinical Trial Data Transparency User Guide Multi-Sponsor Environment SAS Clinical Trial Data Transparency User Guide Version 6.0 01 December 2017 Contents Contents 1 Overview...1 2 Setting up Your Account...3 2.1 Completing the Initial Email and

More information

Aspera Connect Windows XP, 2003, Vista, 2008, 7. Document Version: 1

Aspera Connect Windows XP, 2003, Vista, 2008, 7. Document Version: 1 Aspera Connect 2.6.3 Windows XP, 2003, Vista, 2008, 7 Document Version: 1 2 Contents Contents Introduction... 3 Setting Up... 4 Upgrading from a Previous Version...4 Installation... 4 Set Up Network Environment...

More information

USING MICROSOFT OUTLOOK 2016

USING MICROSOFT OUTLOOK 2016 U N I V E R S I T Y O F S O U T H E R N C A L I F O R N I A USING MICROSOFT OUTLOOK 2016 USER S GUIDE FEBRUARY 2016 U N I V E R S I T Y O F S O U T H E R N C A L I F O R N I A 2016 UNIVERSITY OF SOUTHERN

More information

Lab 7 Macros, Modules, Data Access Pages and Internet Summary Macros: How to Create and Run Modules vs. Macros 1. Jumping to Internet

Lab 7 Macros, Modules, Data Access Pages and Internet Summary Macros: How to Create and Run Modules vs. Macros 1. Jumping to Internet Lab 7 Macros, Modules, Data Access Pages and Internet Summary Macros: How to Create and Run Modules vs. Macros 1. Jumping to Internet 1. Macros 1.1 What is a macro? A macro is a set of one or more actions

More information

Deploying a System Center 2012 R2 Configuration Manager Hierarchy

Deploying a System Center 2012 R2 Configuration Manager Hierarchy Deploying a System Center 2012 R2 Configuration Manager Hierarchy This document is for informational purposes only. MICROSOFT MAKES NO WARRANTIES, EXPRESS, IMPLIED, OR STATUTORY, AS TO THE INFORMATION

More information

Style Report Enterprise Edition

Style Report Enterprise Edition INTRODUCTION Style Report Enterprise Edition Welcome to Style Report Enterprise Edition! Style Report is a report design and interactive analysis package that allows you to explore, analyze, monitor, report,

More information

VERSION 7 JUNE Union Benefits. Employer User Guide Data Collection Tool

VERSION 7 JUNE Union Benefits. Employer User Guide Data Collection Tool VERSION 7 JUNE 2018 Union Benefits Employer User Guide Data Collection Tool About this guide This document is intended to provide an overview of the main sections of the Data Collection Tool ( DCT ) for

More information

Business Intelligence on Dell Quickstart Data Warehouse Appliance Using Toad Business Intelligence Suite

Business Intelligence on Dell Quickstart Data Warehouse Appliance Using Toad Business Intelligence Suite Business Intelligence on Dell Quickstart Data Warehouse Appliance Using Toad Business Intelligence Suite This Dell technical white paper explains how to connect Toad Business Intelligence Suite to Quickstart

More information