IBM Operations Analytics Log Analysis Version User's Guide

Note
Before using this information and the product it supports, read the information in Notices on page 81.

Edition notice
This edition applies to IBM Operations Analytics Log Analysis and to all subsequent releases and modifications until otherwise indicated in new editions.

References in content to IBM products, software, programs, services, or associated technologies do not imply that they will be available in all countries in which IBM operates. Content, including any plans contained in content, may change at any time at IBM's sole discretion, based on market opportunities or other factors, and is not intended to be a commitment to future content, including product or feature availability, in any way. Statements regarding IBM's future direction or intent are subject to change or withdrawal without notice and represent goals and objectives only. Refer to the developerWorks terms of use for more information.

Copyright IBM Corporation
US Government Users Restricted Rights: Use, duplication or disclosure restricted by GSA ADP Schedule Contract with IBM Corp.

Contents

About this publication
  Audience
  Publications
  Accessing terminology online
  Accessibility
  Tivoli technical training
  Providing feedback
  Conventions used in this publication
    Typeface conventions

Searching and visualizing data
  Search UI overview
    Tabs
    Side bar icons
  Search UI reference
  Changing the search time zone
  Searching data
    Query syntax
    Search results timeline
    List and Grid views
    Refining search results
  Saving a search
  Saved searches
  Visualizing data
    Creating charts and graphs
    Percentile statistical functions
    Dashboards
  Aggregated Search
    Configuring Aggregated Search
    Datacenter Topology UI
    Searching aggregated data
    Manage Aggregations UI reference
    Managing aggregation templates
    Turning Aggregated Search on and off
  Search dashboards
    Alerts dashboard
  Custom Search Dashboards
    Creating a Custom Search Dashboard
    Defining a Custom Search Dashboard
    Steps to create a Custom Search Dashboard
    Defining a search filter app
    Adding a shortcut to a Custom Search Dashboard to the Table view toolbar

Notices
  Trademarks
  Terms and conditions for product documentation
  IBM Online Privacy Statement
  Trademarks


About this publication
This guide contains information about how to use IBM Operations Analytics Log Analysis.

Audience
This publication is for users of the IBM Operations Analytics Log Analysis product.

Publications
This section provides information about the IBM Operations Analytics Log Analysis publications. It describes how to access and order publications.

Accessing terminology online
The IBM Terminology Web site consolidates the terminology from IBM product libraries in one convenient location. You can access the Terminology Web site at the following Web address:

Accessibility
Accessibility features help users with a physical disability, such as restricted mobility or limited vision, to use software products successfully. In this release, the IBM Operations Analytics Log Analysis user interface does not meet all accessibility requirements.

Accessibility features
This information center, and its related publications, are accessibility-enabled. To meet this requirement, the user documentation in this information center is provided in HTML and PDF format, and descriptive text is provided for all documentation images.

Related accessibility information
You can view the publications for IBM Operations Analytics Log Analysis in Adobe Portable Document Format (PDF) using the Adobe Reader.

IBM and accessibility
For more information about the commitment that IBM has to accessibility, see the IBM Human Ability and Accessibility Center. The IBM Human Ability and Accessibility Center is at the following web address: (opens in a new browser window or tab)

Tivoli technical training
For Tivoli technical training information, refer to the following IBM Tivoli Education Web site at

Providing feedback
We appreciate your comments and ask you to submit your feedback to the IBM Operations Analytics Log Analysis community.

Conventions used in this publication
This publication uses several conventions for special terms and actions, operating system-dependent commands and paths, and margin graphics.

Typeface conventions
This publication uses the following typeface conventions:

Bold
- Lowercase commands and mixed case commands that are otherwise difficult to distinguish from surrounding text
- Interface controls (check boxes, push buttons, radio buttons, spin buttons, fields, folders, icons, list boxes, items inside list boxes, multicolumn lists, containers, menu choices, menu names, tabs, property sheets) and labels (such as Tip: and Operating system considerations:)
- Keywords and parameters in text

Italic
- Citations (examples: titles of publications, diskettes, and CDs)
- Words defined in text (example: a nonswitched line is called a point-to-point line)
- Emphasis of words and letters (words as words example: "Use the word that to introduce a restrictive clause."; letters as letters example: "The LUN address must start with the letter L.")
- New terms in text (except in a definition list): a view is a frame in a workspace that contains data.
- Variables and values you must provide: ... where myname represents ...

Monospace
- Examples and code examples
- File names, programming keywords, and other elements that are difficult to distinguish from surrounding text
- Message text and prompts addressed to the user
- Text that the user must type
- Values for arguments or command options

Searching and visualizing data

Search UI overview
This section outlines how to use the IBM Operations Analytics Log Analysis Search workspace to search your indexed data and to display this data in charts and dashboards.

To find the root cause of a problem experienced by users, such as slowness or a failure, you can search through data such as log files, traces, configuration information, and utilization data. This type of search is iterative because the results of one search might lead to a set of other searches. An example of an iterative search is finding a connection timeout in the error logs, which could lead you to find the connection pool utilization details.

Use this topic to help you to get started with the Search UI. The following screen shot shows the capabilities of the Search UI:
1. Sidebar: Use the icons on the side bar to open the Getting Started UI, a saved search, a search dashboard, or the Administrative Settings UI.
2. Search Patterns pane: Lists the fields that are found in your search. To filter the search for a field, click the field and click Search.
3. Discovered Patterns pane: Lists fields and values. To display discovered patterns, you must configure the source types in the data source.
4. Timeline pane: Displays the current search results filtered by time. To drill down to a specific time, click a bar in the graph.
5. Timeline slider: Use the timeline slider icon to narrow and broaden the time period.

6. Search box: Enter search queries in the Search field. When you click a field in the Search Patterns or Discovered Patterns pane, the query is displayed in this field.
7. Time filter list: Use the time filter list to filter search results for a specified time range.
8. Data source filter: Use the data source filter icon to filter search results for a specific data source.
9. List view / Grid view: Use the List View / Grid View icon to switch between the two views. Use the List view to identify search results quickly. Use the Grid view to display search results in a tabular format.

Tabs
The user interface consists of seven tabs. Read this topic to get an overview of the functions available under each one.

Getting Started: Use this tab to help you to get started with Log Analysis. You can view demonstrations and use sample data to help you to familiarize yourself with search and the other functions.
Data Types: Use this tab to create, edit, and delete collections, source types, rule sets, and file sets.
Data Sources: Use this tab to create, edit, and delete data sources and database connections.
Hadoop Integration: Configure your Hadoop integration on the Configuration subtab and check the status of your current processes on the Integration Health subtab.
Roles: Create, edit, and delete roles.
Users: Create, edit, and delete users.
Server Statistics: Display the average and peak average data ingestion.

Side bar icons
Use the side bar to quickly navigate the user interface. The following table explains the available icons.

Table 1. Side bar icons
Getting Started icon: Use guided demonstrations and find links to useful information.
Saved Searches icon: Run saved and example searches.

Table 1. Side bar icons (continued)
Search Dashboards icon: Run custom and sample search dashboards to view charts based on search results.
Administrative Settings icon: Create and administrate the data model, users, and roles. Display server statistics.
Manage Alerts icon: Create and administrate alerts. This feature is only available in the Standard Edition.
Datacenter Topology icon: View the hierarchy of multiple instances of Log Analysis that represent the topology of your data centers. The instance that you are logged in to is highlighted in red.

Search UI reference
Use this reference topic to help you to navigate the Search user interface (UI).

Buttons and fields on the Search UI
Table 2. Buttons and fields on the Search UI
Search button and field: Search for a keyword or enter a search query.
Data Sources icon: Filter the search for specific data sources or groups of data sources.
Time filter icon: Specify a relative or custom time filter.
Save Quick Search icon: Save the search query and results for later use. To view the saved searches, click the Saved Searches icon on the side bar.
Columns to be displayed icon: Filter the columns that are displayed in the Grid and Table views.
Plot chart icon: Select the columns that you want to graph and click the Plot Chart icon.
List View icon: To open the List View, click this icon.

Table 2. Buttons and fields on the Search UI (continued)
Grid View icon: To open the Grid View, click this icon.
Launch icon: Open the Search UI for the selected aggregated record. This feature is only available to Standard Edition users.

Time line graph
Table 3. Time line graph
Time line slider icon: Filter the time that is displayed in the timeline graph.
Y-axis: Shows the number of log events for each bar.
Bar: Shows the number of log events for the specific time. To view more details, click a bar to drill down to more details about that time range.
Time zone button: To change the time zone, click the time zone button.
Log Events Granularity: Displays the granularity of the log records that are displayed. The level depends on the time that you filtered for. You can drill down from years to months to days to hours to minutes and seconds.
Time Range: Describes the time range that is displayed in the timeline.

Plot Chart editor
Table 4. Buttons and fields on the Plot Chart editor
Generate Count check box: To generate a count of the selected columns, ensure that this check box is selected.
Selected Columns: A list of the columns that you selected in the Grid view.
Plot Chart (Current Page Data): To create a chart of the data on the current page, click the Plot Chart (Current Page Data) button.
Plot Chart (All Data): To create a chart of all the data in the results, click the Plot Chart (All Data) button.

Render Chart editor
Table 5. Render Chart editor
Clear All: To clear the graph and close the window, click Clear All.
Create New Dashboard: To create a new dashboard based on the data in the graph, click the Create New Dashboard icon.
Add Charts to Existing Dashboard: To add the chart data to a dashboard, click the Add Charts to Existing Dashboard icon.
Hide Portlet icon: To hide the graph, click the Hide Portlet icon.
Settings icon: To change the type of chart or the values that are displayed on the axes, click the Settings icon.
Close Portlet icon: To close the graph and delete the chart, click the Close Portlet icon.

Chart Settings editor
Table 6. Chart Settings editor
Visualization tab
Render: To create a chart, click Render.

Table 6. Chart Settings editor (continued)
Title: Enter a name for the chart.
Chart Type: Select the kind of chart that you want to use.
Parameters: x-axis: Select the value that you want to display on the x-axis.
Parameters: y-axis: Select the value that you want to display on the y-axis.
Query tab
Query String: The query string that is used by the search that generated the results.
Time Filters: Select a time filter for the chart.
Datasource Filters: Filter the chart data for specific data sources.
Selected Columns: The columns that the graph is based on.
Generate Count: Indicates whether a count was generated when the chart was plotted.

Change Time Zone dialog box
Table 7. Change Time Zone dialog box
Time zone: Select a time zone from the list, or type an entry in the field.
Default time zone selection check box: Select this check box to use the specified time zone for all future searches.

Changing the search time zone
By default, Log Analysis converts all times to the browser time zone. The browser time might not match the time that is displayed in Log Analysis if there are issues with Daylight Saving Time. For more information about this issue, see the Search time zone does not match browser topic in the Troubleshooting guide.

If you do not want to use the default time zone, you can change it. To change the time zone:
1. Click the time zone.

2. Select the city or region that represents the time zone that you want to use. For example, if you are in Ireland and you want to set the time zone as Greenwich Mean Time (GMT), you can select Europe/Dublin (Greenwich Mean Time).
3. Optional: If you want to use this time zone in subsequent searches, select the Use this time zone as the default for all searches check box. When you select a default time zone, choose the time zone that is most commonly used by Log Analysis users; this helps Log Analysis to process log records more quickly and efficiently.
4. To save your changes, click OK.

Searching data
You can search ingested data such as log files for keywords. Search results are displayed in a timeline and a table format.

Before you begin
Before you can search, you must first define a data source and ensure that the log file is configured for consumption, or you can load the sample data.

Procedure
1. From the Search workspace, click the New Search or Add Search tab to open a new search tab, and enter the search query.
2. Optional: You can filter data sources by name, description, host name, log path, or tags, or enter * to do a wildcard search. To limit the extent of the search to an individual data source and any descendant data sources, select a leaf node from the Data Sources tree.
3. In the Time Filter pane, click the Time Filter list and select the time period for which you want to search. Select Custom to specify a start time and date, and an end time and date for your search.
4. In the Search field, type the string for which you want to search in the log files. To view distribution information for all logs, in the Search field, type the wildcard character (*). To search for a partial string, type an asterisk (*) at the start and end of your search string. For example, to search for strings that contain the phrase hostname, type *hostname*.
To narrow your search based on a service topology, type the service topology component on which you want to base your search, followed by a colon (:), followed by your search string. For example, service:daytrader.
5. Click Search. The first search that you perform after the IBM Operations Analytics Log Analysis processes are restarted might take longer to complete than subsequent searches. The user interface refreshes every 10 seconds. The updated results are displayed in the progress bar.

Maximum search results: The search returns a maximum of 1000 records by default. This limit applies only to raw searches and not facet queries. You can configure this limit in the unitysetup.properties file with the property MAX_SEARCH_RESULTS=1000. Do not use a high value for the MAX_SEARCH_RESULTS parameter; when a large number of results are returned, search performance degrades.
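For example, the relevant line in unitysetup.properties looks like the following fragment. The value shown is the default described above; the surrounding comment lines are illustrative, not copied from the product file.

```properties
# Maximum number of records returned by a raw search (default 1000).
# High values degrade search performance; facet queries are not affected.
MAX_SEARCH_RESULTS=1000
```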

Results
A graph that displays the distribution of matching events in the log is displayed. Log records that contain a match for your search term are also displayed in Table view. When you search for a specific term, the term is highlighted within the individual log records to facilitate faster analysis. If you search for a partial term, each term that contains the search phrase is highlighted.

Fields that contain only tagged values, in other words values that are contained within angle brackets (<>), are not highlighted. If a field contains values that are tagged and values that are not tagged, the tagged terms are removed and the remaining terms are highlighted as appropriate.

If your search spans data that is stored in the archive, IBM Operations Analytics Log Analysis displays the initial results while it retrieves the rest of the data. You can interact with the initial search results while IBM Operations Analytics Log Analysis generates the search results. The progress bar displays the search progress. To display the latest results during the search, click We have more results for you. To stop the search, close the tab. To start another search while you are waiting for the first search to complete, click the Add Search tab.

What to do next
If you want to load data that contains tags and want to keep the tagging, you can disable highlighting. To disable highlighting:
1. Open the unitysetup.properties file.
2. Locate the ENABLE_KEYWORD_HIGHLIGHTING property and set it to false.
3. Save the file.
4. To restart IBM Operations Analytics Log Analysis, enter the following command:
<HOME>/IBM/LogAnalysis/utilities/unity.sh -restart

Related concepts:
Loading and streaming data: Before you can perform a search on log or other data, you must first load the data into IBM Operations Analytics Log Analysis. When the file is loaded, the data is indexed and is then available to be searched.
Data Source creation: You create data sources to ingest data from a specific source.
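The procedure above for disabling highlighting can be sketched in the shell. This is a sketch only: it edits a throwaway copy of the file rather than a live installation, the property name is taken from the procedure text, and <HOME> is the installation-directory placeholder used throughout this guide.

```shell
# Demonstrate the edit on a throwaway copy, not the live file.
printf 'ENABLE_KEYWORD_HIGHLIGHTING=true\n' > /tmp/unitysetup.properties.demo

# Step 2: set the highlighting property to false.
sed -i 's/^ENABLE_KEYWORD_HIGHLIGHTING=.*/ENABLE_KEYWORD_HIGHLIGHTING=false/' \
    /tmp/unitysetup.properties.demo

cat /tmp/unitysetup.properties.demo
# ENABLE_KEYWORD_HIGHLIGHTING=false

# Step 4 (against a real installation, not run here):
#   <HOME>/IBM/LogAnalysis/utilities/unity.sh -restart
```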
Query syntax
This section describes the combination of words, keywords, and symbols that you can use when searching for phrases using IBM Operations Analytics Log Analysis. The query syntax is based on the Indexing Engine query syntax. For more information, see:

Indexing Engines use a number of different query parser plug-ins. Log Analysis supports the Lucene query parser plug-in. For more information about the Lucene query syntax, see: queryparser/classic/package-summary.html

Standard keywords and operators
This topic lists the keywords and operators that you can use when searching in IBM Operations Analytics Log Analysis.

Note: The operators such as AND and OR, which are part of this query syntax, are case sensitive. You must use capitals for these operators.

OR: This is the default operator. Either term or expression must be matched in the results. A variation of this keyword is or. For example, to search for a specific severity or message classifier, enter severity:W OR msgclassifier:"WLTC0032W".
+: To get AND-like behavior, use the + operator. You must add + as a prefix to these queries. For example, to search for a specific severity and message classifier, enter +severity:W +msgclassifier:"WLTC0032W".
AND: As an alternative to the + operator, you can use the AND operator. For example, to search for a specific severity and message classifier, enter severity:W AND msgclassifier:"WLTC0032W".
" ": Enables you to group individual terms into phrases that are searched for as a unit. For example, "document clustering".
(): Enables you to group expressions to guarantee precedence. For example, document AND (cluster OR clustering).
*: Wildcard operator that can be replaced in the returned value with a number of characters. This can be either passed as an operator to the sources or expanded when the meta.wildcard-expand option is turned on. For example, test* might return test, tests, or tester. You can also use the wildcard in the middle of the search term. For example, t*est. Note: You cannot use this wildcard as the first character in a search. For example, you cannot use *test.
?: Wildcard operator that can be replaced in the returned value with a single character.
This can be either passed as an operator to the sources or expanded when the meta.wildcard-expand option is turned on. For example, te?t might return text or test. Note: You cannot use this wildcard as the first character in a search. For example, you cannot use ?test.
+ (must operator): Forces the use of a keyword. For example, WAS +and DB2 searches for strings that contain the keyword and.
field:: Enables you to restrict your query to a specific field. For example, ID:123A or msgclassifier:WLTC0032W. These operators are activated for every field that is defined in your syntax. By default, the search engine supports the title field. When you are creating a search collection, you can extract any number of contents for each document and relate these contents to searchable fields. This is specified in the form of the source that is associated with each collection.
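The * and ? wildcard semantics described above can be illustrated with Python's fnmatch module. This is an analogy only, not the product's search engine; note that, unlike Log Analysis, fnmatch also accepts a leading wildcard.

```python
from fnmatch import fnmatch

candidates = ["test", "tests", "tester", "text"]

# '*' matches any run of characters: test* finds test, tests, and tester.
star_hits = [w for w in candidates if fnmatch(w, "test*")]

# '?' matches exactly one character: te?t finds test and text, but not tests.
qmark_hits = [w for w in candidates if fnmatch(w, "te?t")]

print(star_hits)   # ['test', 'tests', 'tester']
print(qmark_hits)  # ['test', 'text']
```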

NOT: The specified term or expression must not be matched in the search results. Variations of this keyword are ! and -. For example, to search for log records that contain WAS ID but that do not contain DB2 ID, enter "WAS ID" NOT "DB2 ID". Note: You cannot use this operator for a single term.

Additional keywords and operators
This topic lists additional keywords that are more specific to the search and indexing operations performed by the search engine.

Range searches
To search for records in a range, use a range query. Range queries can include the terms in the range or they can exclude them.
To include the query range terms, use brackets, for example:
[<search term> TO <search term>]
To exclude the query range terms, use braces, for example:
{<search term> TO <search term>}
Results are returned in lexicographical order. For example, to search for all the log records that were modified on or between two dates, enter:
mod_date:[ TO ]
The search returns all the log records that were modified in 2003, that is, all the log records where the mod_date field is within the specified range.
You can also use range queries to search for fields that are not dates. For example, to search for all the log records that contain an ID between A and D, but that do not include A or D, enter:
title:{A TO D}

DateMath queries
To help you to implement more efficient filter queries for dates, you can use DateMath queries. For example, here are 4 possible DateMath queries:
- timestamp:[* TO NOW]
- timestamp:[ t23:59:59.999z TO *]
- timestamp:[ t23:59:59.999z TO T00:00:00Z]
- timestamp:[NOW-1YEAR/DAY TO NOW/DAY+1DAY]
For more information, see the DateMathParser topic in the Lucene documentation at: DateMathParser.html

Escaping special characters
Regular expression (regex) queries are supported by the query syntax. Log Analysis supports escaping for the following special characters:
+ - && ! ( ) [ ] ^ " ~ * ? : \ /
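Each of these characters is escaped by prefixing it with a backslash, as this section goes on to describe. The following Python sketch applies that rule; the character set is hand-copied from the list above (the double && is approximated by escaping every single &), so verify it against your Log Analysis version before relying on it.

```python
# Special characters from the list in this section.
# NOTE: '&&' is handled here by escaping each '&' (an assumption).
SPECIAL_CHARACTERS = set('+-&!()[]^"~*?:\\/')

def escape_query(term: str) -> str:
    """Prefix each special character with a backslash."""
    return ''.join('\\' + ch if ch in SPECIAL_CHARACTERS else ch
                   for ch in term)

print(escape_query("(1+1):2"))  # \(1\+1\)\:2
```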

To escape a special character, use a backslash (\) before the special character. For example, to search for the query (1+1):2, enter:
\(1\+1\)\:2
To find multiple terms, use brackets. For example, to search for moat and boat, enter:
/[mb]oat/

Example query: search for a keyword in a specified range
This example search query searches for a keyword in a specified range. You want to search for all the instances of error code 6789 in the SUMMARY field with a response time greater than 5 seconds. For example:
fieldname: SUMMARY
fieldcontents: "Transaction has failed with response time of 10 seconds and error code of 6789."
You enter the following query. It specifies the SUMMARY field and a query for the range:
"query" : "SUMMARY:/.*\\s([6-9]|\\d\\d+)\\ssecond.*6789\\./"

Search results timeline
The search results timeline displays a graph that shows the distribution of log events over a time period. You can use the timeline slider to view the search results timeline for a specific duration.
If there are a large number of dates in the log file, the timeline might display them as ### rather than displaying the dates. Use the timeline scroller to zoom in and display the appropriate date information.
The Timeline does not display data at the seconds level for data that was ingested by using IBM Operations Analytics Log Analysis Version 1.1. A message is displayed that indicates that the data was indexed by a previous version of IBM Operations Analytics Log Analysis. For this data, the Timeline cannot display information for multiple events that occur in time periods of less than one minute.

List and Grid views
Log records are displayed in both a List view and a Grid view. The default view is the List view. Log records that are displayed in the Grid view can be ordered by column for easy analysis.
This view can be customized and used to display information in a range of ways:

Sorting in Grid view
You can sort the information in the table columns by clicking the column header. Not all columns are sortable; the Index configuration determines the fields that can be sorted. The _datasource field is an internal field and cannot be sorted or used for charting purposes. If you want to sort your data by data source or to create a chart, create a field in the Index configuration for this purpose. This field can be used for sorting and in charts.
The order in which the fields are displayed is governed by the associated index configuration and source type. You can use the Index Configuration editor to add new fields or adjust the order of existing fields.

For more information about editing the index configuration, see the Editing an index configuration topic in the Administrating and Installing Guide.

Toggle views
Click the List View or Grid View button to toggle between views. In each of these views, the button that is displayed allows you to toggle to the alternative view.

Customizing the columns displayed
To configure the Grid view to display only the columns that you require, click the Select Columns icon on the Grid view toolbar, remove the columns that you do not want to display, and click OK.

Display a chart of your results
You can display the distribution of values for a number of the columns as a chart. To display the chart, select the column and click the Plot Column icon on the Grid view toolbar. The distinct values that are used to plot the chart can be viewed as hover help for each chart sector. The Plot feature is available for some values only. Where available, the Plot Column button is active when the columns are selected.
If the chart title contains a loading icon, the chart is loading data from the archive. The chart is automatically updated when all the searches are complete. If you log out before the search is completed, the search stops.

Running a Custom Search Dashboard from the Grid view toolbar
If you have configured a shortcut to a Custom Search Dashboard, click the icon on the toolbar to launch the Custom Search Dashboard.

Using a Custom Search Dashboard to display selected data from a column or cells
If you have created the required Custom Search Dashboard, you can select and display the contents of a column or individual cells. To display the data, select a column or individual cell in Grid view and then launch the application. If you select a column, only data from the currently displayed page is displayed by the application.

Related concepts:
Custom Search Dashboards: Custom Search Dashboards allow you to create and execute custom scripts and display the output of those scripts in a dashboard.
Refining search results
You can refine the search results by adding extra criteria in the search field. For example, the string severity:E returns log lines that contain errors. Alternatively, you can perform a free-text search for a value in a column. All of the log lines that contain that text are returned. If more than 100 log lines are returned, click the arrows to view more log lines.
Note: If a host file contains the character sequence ::1 next to the host name, ::1 might be displayed as the value in the sourceip column.
You can also refine your search in these ways:

Search Patterns
To refine your search, use the values in the Search Patterns pane. For each new search, the list of fields with which you can filter your search is updated and listed

in the Search Patterns pane. The number of occurrences of each value that is found is displayed, with each unique keyword added as a child node. Click a keyword to add it to the Search field. The keyword is added in the format field:"value". You can add multiple keywords to refine your search. If you want to run an OR query, type the word OR between each added keyword search string. When you add all of the search criteria, click Search to return log lines that contain the values that you specified.

Discovered Patterns
When you search a data source that is configured with a Source Type that uses the Generic annotator, the results of the search are listed in the Discovered Patterns pane. For each new search, the list of fields with which you can filter your search is updated and listed. The counts in the Discovered Patterns pane indicate the number of records that contain a specific key or key-value pair. A key-value pair might occur multiple times in a record, but the total reflects the number of records in which the key-value pair occurs. The count of the value nodes in a key-value pair tree might exceed the key count when multiple values occur for the same key in a single record. Click a keyword to add it to the Search field. The keyword is added in the format field:"value". You can add multiple keywords to refine your search. If you want to run an OR query, type the word OR between each added keyword search string. When you add all of the search criteria, click Search to return log lines that contain the values that you specified.

Data Source filtering
Refine your search by selecting a Data Sources leaf node. When you select a leaf node in the Data Sources tree, your search is refined to search only that data source and any descendant data sources. The Data Sources tree is defined by selecting a service topology node when you configure your data source. For more information, see Editing groups in the service topology JSON file.
Time Filters
Use the Time Filters list to refine your search based on a selected time period. Select a value from the list to limit the search period. The time period that you choose limits the search time period based on the log entries. For example, choosing Last Hour limits the search to the final 60 minutes of log file entries.

Selecting a timeline value
Click a value in the timeline to refine your search based on that value. Log events can be visualized up to second-level granularity.

Selecting a time zone
To change the time zone that is used in one or all of your searches, click the Browser time link in the timeline chart. For more information about changing the time zone, see Changing the search time zone in the Searching and visualizing data guide.

Saved searches

Saving a search
After you search for a keyword or series of keywords, you can save your search so that you can run it again at a later time. The searches that you save are added to the Quick Searches pane.

About this task
Any directories that you created to organize your Quick Searches cannot be deleted. The directory structure is maintained.

Procedure
To save a search:
1. In the Search workspace, click the Save Quick Search icon. The Save Quick Search dialog box is displayed.
2. Enter a value in the Name and Tag fields. Adding a tag allows you to group similar searches within a folder.
3. (Optional) Specify a time range as an absolute or relative time. The default option is relative time.
4. Click OK. The search is saved to the Quick Searches pane.

What to do next
To use a saved search pattern, browse to the saved search in the Quick Searches pane and double-click the search pattern that you want to launch. You can also edit and delete the search from the right-click menu. To display a list of saved searches, click the Saved Searches icon.

The following saved searches are available by default after you install the sample data:

sample WAS System Out
Example search that displays results from WebSphere Application Server.

sample DB2 db2diag
Example search that displays results from DB2.

sample MQ amqerr
Example search that displays results from IBM MQ.

sample Oracle alert
Example search based on alerts for the Oracle sample data.

sample App transaction log
Example search based on the sample application's transaction log.

sample Omnibus events
Example search based on the sample Omnibus events.

sample Windows OS events
Example search based on the sample Windows OS events.

Visualizing data

You can create charts and graphs that help users to process information quickly and efficiently.

Creating charts and graphs
After you select one or more columns in the grid view, you can use the Plot Column button to create charts to display the results.

About this task
To create a chart that is based on a specific field, you can create a search query and plot a chart based on the results. If your query returns blank fields, the chart might not display. To ensure that this does not happen, use queries that return data for the specific field. For example, to search for the severity field, enter severity:*to*.

Procedure
1. In the Search workspace, select Grid view.
2. Select one or more columns.
3. Click the Plot Column icon in the Grid view toolbar. The Plot Chart UI is displayed.
4. To display counts for the selected columns, select the Generate Counts check box.
5. If the columns that you selected contain dates or numeric values, you can use the Granularity field to specify the granularity. You can use this setting only for columns that contain dates or numeric values and can be filtered.
6. If the columns that you selected contain numeric values, you can also apply statistical functions on the values.
v If you are using the Entry Edition, you can use the sum, min, max, avg, and count functions.
v If you use the Standard Edition, you can use the missing, sumofsquares, stddev, and percentile functions.
7. To plot the chart on 100 or fewer of the records that you selected, click Plot Chart (Current Page Data).
8. To plot the chart on all the indexed data, click Plot Chart (All Data). If one or more of the fields in the selected columns is not filterable, the charts are only plotted if the total number of records is less than the limit that is set in the MAX_DATA_FACETS_IN_CHART property. To change this limit, modify the MAX_DATA_FACETS_IN_CHART property in the unitysetup.properties file.

Results
The graph is rendered. To change the graph type, use the Edit icon. If the chart title contains a loading icon, the chart is loading data from the archive. The chart is automatically updated when all the searches are complete. If you log out before the search is completed, the search stops.

Percentile statistical functions
You can use the percentile statistical function to return values for specified percentiles.
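Before looking at the request and response formats, it can help to see what a percentile facet contains. The following Python sketch mimics the shape of the facet response; the nearest-rank calculation that it uses is an assumption, not necessarily the method that the product implements.

```python
import math

def percentile_facet(values, percentiles):
    """Return a facet-style dict with min, max, and the requested
    percentiles, computed with the nearest-rank method (an assumption)."""
    ordered = sorted(values)
    facet = {"min": ordered[0], "max": ordered[-1], "percentile": {}}
    for p in percentiles:
        # nearest-rank index for the p-th percentile
        rank = max(1, int(math.ceil(p / 100.0 * len(ordered))))
        facet["percentile"][str(p)] = ordered[rank - 1]
    return facet

# 100 sample response-time values: 10, 20, ..., 1000
sample = list(range(10, 1010, 10))
print(percentile_facet(sample, [50, 95, 99]))
```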

You can use the Search REST API or the user interface to create percentile statistical functions. For example, to query the maximum and minimum results for the 50th, 95th, and 99th percentiles with the Search REST API, you enter "stats": ["min", "max", "percentile,50,95,99"]. The facet response is:

"min": 10,
"max": 1000,
"percentile": {
    "50": 100,
    "95": 200,
    "99": 225
}

Percentile queries are not calculated incrementally, unlike the other queries that are used in Log Analysis. This fact means that the query needs to run over the entire time range before it can return any results. Log Analysis limits the number of asynchronous windows that can run simultaneously for this function. This limit is set in the MAX_NON_INCREMENTAL_WINDOWS property in the unitysetup.properties file. The default value is 2. For example, if you specify a percentile query based on a time range from August to August and MAX_NON_INCREMENTAL_WINDOWS=5 and COLLECTION_ASYNC_WINDOW=1d, only the most recent 5 days of data that is returned by the query are considered for percentile evaluation.

Dashboards

You can use dashboards to collate multiple charts, which are created during problem diagnosis, on a single user interface (UI). For example, imagine an organization that uses IBM Operations Analytics Log Analysis to monitor all the server logs that it generates. The system administrator wants to be able to view the most critical errors, the highest severity errors, and the total number of errors on a single UI. To facilitate this scenario, you create a dashboard that is called System Admin and you add the charts that show the required information to it.

The data that is displayed on the dashboards is based on charts. For more information, see the Charts topic under Custom Search Dashboard > Steps to create a Custom Search Dashboard > Application files in the Extending IBM Operations Analytics Log Analysis section.
Sample dashboards
Sample dashboards are included as part of the sample content for the following Custom Search Dashboard samples:
v Sample_EventInsightpack_v
v Sample_AppTransInsightpack_v
v Sample_weblogInsightpack_v
v WASInsightPack_v

Creating dashboards
You can create a dashboard to visualize data from multiple sources on a single UI.

About this task
This procedure describes how to create a dashboard and chart. You can also add a chart to an existing dashboard. To add a chart to an existing dashboard, click Add Charts to an Existing Dashboard and select a dashboard from the list.

Use the drill-down feature to search the data for records that correspond to the area of the chart that you select. When you drill down on a field in a chart, a new search is created that is based on the field that you selected. The drill-down feature is only supported for dashboards that are created in the UI. The drill-down feature is not supported in some cases where the underlying chart query does not support it, for example, if the query searches two date fields.

Procedure
1. Open an existing search UI or click Add New Search to create a new search UI.
2. Switch to the Grid View and plot the charts that you would like to include in the dashboard. You cannot use more than eight charts in a single dashboard. For more information, see Creating charts and graphs.
3. To plot the charts that you would like to include in the dashboard, select the columns that you are interested in and click the Plot column icon. Click Plot Chart (All Data). If you want to use the drill-down feature, you must click Plot Chart (All Data). You cannot use the Plot Chart (Current Page Data) button. The drill-down function is not supported for this option. You can also run a number of statistical operations on the selected columns. Select one of the following functions from the Summary Function drop-down list.

min
The minimum values in a field.
max
The maximum values in a field.
sum
The sum of the values in a field.
avg
The average of the values in a field.
count
The count of the values in a field.
missing
The number of records for which the value for a field is missing.
sumofsquares
Sum of the squares of the values in a field.
stddev
The standard deviation of the values in a field.

4.
To create the dashboard, click the Create New Dashboard button. Enter a name and a tag. The tags are used to define groupings.
5. Save the dashboard.

What to do next
After the dashboard is created, a Custom Search Dashboard is automatically generated that represents the dashboard. You can view the Custom Search Dashboard and dashboard on the Search UI under Search Dashboard > Dashboards.

To view the visualization and query settings for the charts that you added to a custom dashboard, click Search Dashboard > Dashboards > Dashboard name. Select the required chart and click the Settings icon. There are two tabs, the Query tab and the Visualization tab.

Information about the query that provides the information that is visualized in the chart is displayed on the Query tab. This query is saved when the chart is created. To change the time filter from the default relative setting to match the absolute time that is used by the Search UI, open the Query tab and select the setting from the list. To view the effect of changing this setting without changing the dashboard, click Render.

The chart type and parameters are detailed on the Visualization tab. These settings are the settings that you made when you created the chart. For more information about these settings, see the Application files and Charts topics in the Custom Search Dashboards section of the Extending IBM Operations Analytics Log Analysis guide.

To ensure that your dashboard displays current information, you can use the auto-refresh feature to regularly refresh a dashboard at scheduled intervals. For more information about configuring automatic dashboard refreshes, see the Configuring automatic refreshes for new dashboards topic in the Administering IBM Operations Analytics Log Analysis guide.

Deleting search dashboards
If you no longer need a dashboard, you can delete it.

About this task
There are two types of dashboard that are displayed in the Search Dashboards list on the UI: dynamic dashboards and Custom Search Dashboards. Dynamic dashboards do not contain any custom logic, and the type value that is defined in the associated JSON file is DynamicDashboard. Custom Search Dashboards are customized dashboards that are installed as part of an Insight Pack. They do contain custom logic. The delete function works differently for each type.
When you delete a dynamic dashboard, the dashboard and all related data are deleted. When you delete a Custom Search Dashboard, the Custom Search Dashboard file extension is changed to .deleted for deletion by the IBM Operations Analytics Log Analysis administrator.

Procedure
1. Open the Search UI.
2. Open the Search Dashboards list and select the dashboard that you want to delete.
3. Right-click the dashboard and click Delete. Confirm that you want to delete it when prompted.

Results
If the dashboard is a dynamic dashboard, the dashboard and associated data are deleted. If the dashboard is a Custom Search Dashboard, the Custom Search Dashboard file extension is changed to .deleted. You can contact the IBM Operations Analytics Log Analysis administrator and ask them to delete the Custom Search Dashboard if appropriate.

Aggregated Search

Use Aggregated Search to search aggregated data from multiple instances of Log Analysis.

You can install multiple instances of Log Analysis. For example, your environment might consist of data centers in different regions. You install an instance of Log Analysis in each region. The Aggregated Search feature helps you to view, search, and drill down into the aggregated data from these instances of Log Analysis.

Before you can use Aggregated Search, you must configure it, including how you want to aggregate data that is sent from the children to the parent node. For more information, see Configuring Aggregated Search.

After you configure Aggregated Search, you can use it to help you to get an operational overview of your data centers. You can view and search aggregated data in your parent node. You can also drill down from this node to the child nodes to view the data there. For more information, see Searching aggregated data on page 30.

Only Standard Edition users can use this feature.

Configuring Aggregated Search

To configure Aggregated Search, complete the following steps.
1. To specify the topology of your cluster, create the required JSON file on each Log Analysis server in your cluster. For more information, see Defining the topology for Aggregated Search.
2. Configure Aggregated Search automatically or manually.
You can also configure Lightweight Directory Access Protocol (LDAP) for user authentication as part of this step. For more information about how to configure it automatically, see Configuring Aggregated Search automatically on page 25. For more information about how to configure it manually, see Configuring Aggregated Search manually on page 27.
3. Create a data aggregation template to specify how data is aggregated. For more information, see Creating aggregation templates on page 32.
4. Review the information about how you need to set up your users, roles, and access permissions to ensure that records are visible. For more information, see Role-based Access Management for Aggregated Search on page 28.

Defining the topology for Aggregated Search

Before you use the Aggregated Search feature, you must define the topology.

About this task
Create the JSON file on each Log Analysis server in your cluster.

Procedure
1. Create a JSON file and specify the topology of the data center in it. Specify the parameters described in the Topology parameters table.

Table 8. Topology parameters

OS_user
The operating system user name. Specify the name of the user who installed Log Analysis on the server.

OS_password
The password for the operating system user. This parameter is optional. If you do not specify it, the script prompts you for it when you configure Aggregated Search automatically.

LA_home
The directory where Log Analysis is installed on the node.

LA_host
The fully qualified host name for the Log Analysis server. To facilitate single sign-on (SSO), use a fully qualified host name.

LA_port
The port number that is used by Log Analysis. This value must be an integer. You cannot use a string.

LA_user
The Log Analysis user name.

LA_password
The Log Analysis user password. If you are automatically configuring Aggregated Search, you can specify a plain text password or you can leave this parameter blank. The utility provided encrypts the password when it copies the topology. If you do not specify a password, the configuration script prompts you for it when you configure Aggregated Search automatically. If you are manually configuring it, you can use plain text or encrypted passwords. You can use the unity_securityutility.sh to encrypt the password on each server. For more information, see unity_securityutility.sh command.

children
The names of the immediate descendants of the topology node.

connected
The names of the nodes that can be accessed from the node that is defined in this section of the JSON. If you do not specify any values or specify a blank array, all nodes can be accessed from this instance of Log Analysis.

sshport
The Operating System port that is used for Secure Shell Authentication (SSH). If you do not specify a value, the default value of 22 is used.

For an example, see Topology file for Aggregated Search.
2. Save your changes.
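Because every node in the topology file must follow the parameter rules above, a small local sanity check can catch mistakes before you copy the file to the servers. The following Python sketch is illustrative only and is not a product utility; the function and variable names are invented.

```python
REQUIRED_KEYS = ("OS_user", "LA_home", "LA_host", "LA_port", "LA_user", "children")

def check_topology(topology):
    """Return a list of problems in a topology dict, based on the
    parameter rules above (passwords, connected, and sshport are optional)."""
    problems = []
    for name, node in topology.items():
        for key in REQUIRED_KEYS:
            if key not in node:
                problems.append("%s: missing %s" % (name, key))
        # LA_port must be an integer, not a string
        if "LA_port" in node and not isinstance(node["LA_port"], int):
            problems.append("%s: LA_port must be an integer" % name)
        # every child must itself be defined in the topology
        for child in node.get("children", []):
            if child not in topology:
                problems.append("%s: unknown child %s" % (name, child))
    return problems
```

For example, a node that lists a child that is not defined elsewhere in the file, or that quotes its LA_port value as a string, is reported.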

Example

The following example displays the structure of the input JSON object.

{
  "AP": { "OS_user": "unity", "sshport": 22, "LA_home": "/home/unity/ibm/loganalysis/",
          "LA_host": "server1.example.com", "LA_port": 9987, "LA_user": "unityadmin",
          "LA_password": "unityadmin", "children": ["India", "Australia"], "connected": [] },
  "Europe": { "OS_user": "unity", "sshport": 22, "LA_home": "/home/unity/ibm/loganalysis/",
          "LA_host": "server2.example.com", "LA_port": 9987, "LA_user": "unityadmin",
          "LA_password": "unityadmin", "children": ["UK", "France"] },
  "India": { "OS_user": "unity", "sshport": 22, "LA_home": "/home/unity/ibm/loganalysis/",
          "LA_host": "server3.example.com", "LA_port": 9987, "LA_user": "unityadmin",
          "LA_password": "unityadmin", "children": ["Pune", "Bangalore"],
          "connected": ["AP", "Pune", "Bangalore"] },
  "Australia": { "OS_user": "unity", "sshport": 22, "LA_home": "/home/unity/ibm/loganalysis/",
          "LA_host": "server4.example.com", "LA_port": 9987, "LA_user": "unityadmin",
          "LA_password": "unityadmin", "children": ["Sydney", "Perth"] },
  "UK": { "OS_user": "unity", "sshport": 22, "LA_home": "/home/unity/ibm/loganalysis/",
          "LA_host": "server5.example.com", "LA_port": 9987, "LA_user": "unityadmin",
          "LA_password": "unityadmin", "children": ["London"] },
  "France": { "OS_user": "unity", "sshport": 22, "LA_home": "/home/unity/ibm/loganalysis/",
          "LA_host": "server6.example.com", "LA_port": 9987, "LA_user": "unityadmin",
          "LA_password": "unityadmin", "children": ["Paris"] },
  "Pune": { "OS_user": "unity", "sshport": 22, "LA_home": "/home/unity/ibm/loganalysis/",
          "LA_host": "server7.example.com", "LA_port": 9987, "LA_user": "unityadmin",
          "LA_password": "unityadmin", "children": [] },
  "Bangalore": { "OS_user": "unity", "sshport": 22, "LA_home": "/home/unity/ibm/loganalysis/",
          "LA_host": "server8.example.com", "LA_port": 9987, "LA_user": "unityadmin",
          "LA_password": "unityadmin", "children": [] },
  "Sydney": { "OS_user": "unity", "sshport": 22, "LA_home": "/home/unity/ibm/loganalysis/",
          "LA_host": "server9.example.com", "LA_port": 9987, "LA_user": "unityadmin",
          "LA_password": "unityadmin", "children": [] },
  "Perth": { "OS_user": "unity", "sshport": 22, "LA_home": "/home/unity/ibm/loganalysis/",
          "LA_host": "server10.example.com", "LA_port": 9987, "LA_user": "unityadmin",
          "LA_password": "unityadmin", "children": [] },
  "London": { "OS_user": "unity", "sshport": 22, "LA_home": "/home/unity/ibm/loganalysis/",
          "LA_host": "server11.example.com", "LA_port": 9987, "LA_user": "unityadmin",
          "LA_password": "unityadmin", "children": [] },
  "Global": { "OS_user": "unity", "sshport": 22, "LA_home": "/home/unity/ibm/loganalysis/",
          "LA_host": "server12.example.com", "LA_port": 9987, "LA_user": "unityadmin",
          "LA_password": "unityadmin", "children": ["AP", "Europe"], "connected": [] },
  "Paris": { "OS_user": "unity", "sshport": 22, "LA_home": "/home/unity/ibm/loganalysis/",
          "LA_host": "server13.example.com", "LA_port": 9987, "LA_user": "unityadmin",
          "LA_password": "unityadmin", "children": [] }
}
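The level numbering that is used for aggregation templates follows directly from the children lists in a topology file such as the one above. The following Python sketch is illustrative only; it derives each node's level, with root nodes at level 0.

```python
def node_levels(topology):
    """Compute the level of every node from the children lists:
    nodes that are nobody's child are roots at level 0, their
    children are level 1, and so on."""
    all_children = {child for node in topology.values() for child in node["children"]}
    levels = {}

    def assign(name, level):
        levels[name] = level
        for child in topology[name]["children"]:
            assign(child, level + 1)

    for root in (name for name in topology if name not in all_children):
        assign(root, 0)
    return levels

# A cut-down branch of the example topology above:
topology = {
    "Global": {"children": ["AP", "Europe"]},
    "AP": {"children": ["India"]},
    "Europe": {"children": ["UK"]},
    "India": {"children": ["Pune"]},
    "UK": {"children": ["London"]},
    "Pune": {"children": []},
    "London": {"children": []},
}
print(node_levels(topology))
# Global is level 0, AP and Europe are level 1, India and UK are level 2,
# and the cities Pune and London are level 3, so a template exported from
# Pune can be imported into London but not into India.
```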

The topology forms a four-level hierarchy. When you export an aggregation template, the level is specified in the JSON file. The top or root node is the Global node and it is designated level 0 in an exported aggregation template. The regions, AP and Europe, are on the second level and are designated level 1 in the template. The countries are defined on the third level and are assigned level 2. The bottom level contains the cities and is designated as level 3. You can only import templates from the same level. You cannot import templates from another level. For example, if you export a template from the Pune node on the bottom level, you cannot import it into the India node.

What to do next
After you create the topology, you need to enable Aggregated Search. If you are automatically configuring Aggregated Search, run the utility. For more information, see Configuring Aggregated Search automatically. If you are configuring Aggregated Search manually, copy the topology file to each Log Analysis server. Note the directories where you save these files, because you must specify them in the unitysetup.properties file when you enable the Aggregated Search feature. For more information, see Configuring Aggregated Search manually on page 27.

Configuring Aggregated Search automatically

To automatically configure Aggregated Search, complete the following steps.

Before you begin
Create a topology before you start. To specify the topology, create a JSON file with the required parameters. For more information about these parameters, see Topology file for Aggregated Search. Use plain text passwords in the topology. The aggregationconfigutil.sh script encrypts these passwords when it copies the topology to the nodes. The password is optional. If you do not specify it, users are required to enter the password when they run the script.

About this task
You use the <HOME>/IBM/LogAnalysis/utilities/aggregation/aggregationconfigutil.sh script to configure Aggregated Search. For more information, see aggregationconfigutil.sh.

Procedure
1. Go to the <HOME>/IBM/LogAnalysis/utilities/aggregation directory.
2. To copy the topology file and enable Aggregated Search in the unitysetup.properties file, enter the following command:
./aggregationconfigutil.sh -tf ~/<topology_file_name> -n all -o setuptopology
where <topology_file_name> is the name of the topology file. For example, example.json.
3. To set up the Java certificates, enter the following command:
./aggregationconfigutil.sh -tf ~/<topology_file_name> -n all -o setupjavakeystores
4. You can also configure Lightweight Directory Access Protocol (LDAP) for user authentication. This step is optional and is only required if you want to use LDAP. To configure LDAP, enter the following command:
./aggregationconfigutil.sh -tf ~/<topology_file_name> -n all -o configldap -lp ~/ldapregistryhelper.properties
5. You can also configure single sign-on (SSO) to facilitate the drill-down feature. To configure SSO, you must set up LDAP as described in step 4. This step is optional and is only required if you want to use SSO when you drill down.
To configure SSO and update the server.xml file with the domain name, enter the following command, specifying the path to the keys file:
./aggregationconfigutil.sh -tf ~/<topology_file_name> -n all -o configsso -kf ~/<path_ltpa_file>
where <path_ltpa_file> is the path to the key file that is copied to all nodes in the cluster. For example, /home/la/ltpa.keys.
6. Restart each instance of Log Analysis. To restart Log Analysis, log in to each server and enter the following command:
<HOME>/IBM/LogAnalysis/utilities/unity.sh -restart

What to do next
To verify that SSO is configured correctly, enter the following command:
./aggregationconfigutil.sh -tf ~/<topology_file_name> -n all -o checksso

If any certificate errors occur when you use the checksso option, enter the following command:
./aggregationconfigutil.sh -tf ~/<topology_file_name> -n all -o prepsso

You can also configure Aggregated Search with a single command. You can omit any options that you do not want to use, such as configsso. For example:
./aggregationconfigutil.sh -o setuptopology,setupjavakeystores,configldap,configsso -tf ~/<topology_file_name> -n all -lp ~/ldapregistryhelper.properties -kf ~/<path_ltpa_file>

After you configure Aggregated Search, you can create an aggregation template to specify how the data is aggregated and is sent from the child nodes to the parent nodes. For more information, see Creating aggregation templates on page 32.

Configuring Aggregated Search manually

To manually configure the Aggregated Search feature, complete these steps.

Before you begin
Create the topology file and copy the file to a directory on each Log Analysis server in your cluster. For more information, see Defining the topology for Aggregated Search on page 21. The Log Analysis password that you specify can be in plain text or it can be encrypted. You can use the unity_securityutility.sh utility to encrypt passwords on each node. To encrypt the password, use the utility on the server that you are copying the topology file to. For more information, see unity_securityutility.sh command.

About this task
Repeat the following steps for each Log Analysis server that is specified in your topology file.

Procedure
1. To stop the Log Analysis server, enter the following command:
<HOME>/IBM/LogAnalysis/utilities/unity.sh -stop
2. Open the <HOME>/IBM/LogAnalysis/wlp/usr/servers/Unity/apps/Unity.war/WEB-INF/unitysetup.properties file.
3. To enable the Aggregated Search feature, change the following property value to true. The default value is false. For example:
ENABLE_AGGREGATION = true
4. Specify the path to the JSON file that contains the topology information.
For example:
DC_TOPOLOGY_PATH=<path_topology_json>
where <path_topology_json> is the path to the file that contains the topology information. For example, /home/la/datacenter_topology.json.
5. Specify the name of the local node. This name must correspond to the name that is specified in the topology JSON file. For example:
DC_TOPOLOGY_NODE_NAME=<node_name>
where <node_name> is the name of the node.

6. Save your changes.
7. To import the Java client certificates, complete the following steps:
a. Copy the <HOME>/IBM/LogAnalysis/wlp/usr/servers/Unity/resources/security/client.crt file from the current node to a temporary directory. If your current node has one parent node and two child nodes that are specified in the topology, copy 3 client.crt files, one from each node, to the temporary directory.
b. After you copy the file, enter the following command to import it:
<HOME>/IBM/LogAnalysis/ibm-java/bin/keytool -import -file <crt_file_path> -keystore <HOME>/IBM/LogAnalysis/wlp/usr/servers/Unity/resources/security/key.jks -alias <alias_name> -storepass loganalytics
where <crt_file_path> is the full path to the client.crt file that you copied to the temporary directory. For example, /tmp/ap_cert/client.crt. <alias_name> is the name of each certificate that you want to import for the parent and child nodes. Use a unique alias for each certificate that you import.
8. If you want to use LDAP for authentication, configure it. This step is optional and is only required for authentication. For more information, see Manually configuring LDAP authentication.
9. You can also configure single sign-on (SSO). This step is optional. It is required to allow users to drill down without having to sign in to each node. For more information, see Configuring Log Analysis single sign-on (SSO) on the Liberty profile for WebSphere Application Server.
10. To start the Log Analysis server, enter the following command:
<HOME>/IBM/LogAnalysis/utilities/unity.sh -restart

What to do next
After you configure Aggregated Search, you can create an aggregation template to specify how data that is sent from the child nodes is aggregated. For more information, see Creating aggregation templates on page 32.
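Taken together, steps 3 to 5 amount to a fragment like the following in the unitysetup.properties file. The path and node name shown here are examples only and must match your own topology file.

```properties
# Enable the Aggregated Search feature (the default is false)
ENABLE_AGGREGATION = true

# Path to the topology JSON file that was copied to this server (example path)
DC_TOPOLOGY_PATH=/home/la/datacenter_topology.json

# Name of this node, as defined in the topology JSON file (example name)
DC_TOPOLOGY_NODE_NAME=India
```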
Role-based Access Management for Aggregated Search

To view aggregated data, the user must have read access to the data sources whose data was aggregated.

Aggregated data flows upwards from child nodes to the parent node. The drill-down feature allows users in the parent node to view data that is generated in the child nodes. Users in the parent node can search all aggregated data but they can drill down only to the records that they have access to. Access is defined in the data source that collected the data.

For example, consider the following situation. Assume that this example has three instances of Log Analysis:
v Asia Pacific (AP). It represents the region and is the global node that is at the top of the hierarchy.
v India. It represents the country and is a child node that is in the middle of the hierarchy.

v Bangalore. It represents the city and is a child node that is at the bottom of the hierarchy.

The Bangalore node contains two data sources. Different users have access to these data sources, as listed in the following table.

Table 9. Bangalore data sources and users
DS1: read access for user1 and user3
DS2: read access for user2 and user3

DS1 contains 10 records and DS2 contains 5. user1 can view records from DS1 only. user2 can view records from DS2 only. user3 can view all 15 records.

You create two aggregation templates in the Bangalore node. The templates are called Aggregation1 and Aggregation2. The data that is aggregated by Aggregation1 is collected by DS1. The data that is aggregated by Aggregation2 is collected by DS2. Aggregation1 uses the COUNT function to count the number of records that are collected by DS1, in this case 10. Aggregation2 uses the COUNT function to count the number of records that are collected by DS2, in this case 5. Both templates send the aggregated data to the India node, where it is loaded by the _aggregations data source. Aggregation1 sends one aggregation record with an aggregated value of 10 to the India node. Aggregation2 sends one aggregation record with an aggregated value of 5 to the India node. Both aggregation records are loaded by the _aggregations data source.

In the India node, you create an aggregation template, Aggregation3, which uses the SUM function to add the aggregated values in the _aggregations data source. Aggregation3 creates one aggregation record with an aggregated value of 15 and sends it to the AP node, where it is loaded by the _aggregations data source.

When users log in to the AP node and search the _aggregations data source, one aggregation record with an aggregated value of 15 is displayed. When they drill down to the India node, 1 or 2 aggregation records with aggregated values of 10 and 5 are displayed.
This discrepancy depends on the user's access, as assigned in the data sources in the child nodes, in this case DS1 and DS2. When user1 logs in to the AP node and searches the _aggregations data source, one aggregation record with an aggregated value of 15 is displayed. When user1 drills down, one aggregation record with a value of 10 is displayed. This aggregation record is generated from the Aggregation1 template that is specified in the Bangalore node. user1 cannot view the aggregation record with a value of 5 that is generated by the Aggregation2 template, because user1 does not have access to view records from DS2. When user1 or user3 logs in to the Bangalore node, they can view the 10 records that are loaded by DS1. When user2 or user3 logs in, they can view the 5 records that are loaded by DS2. These records are the basis for the aggregation records in the other nodes.

When user2 logs in to the AP node, one aggregation record is displayed with an aggregated value of 15. When user2 drills down to the India node, one aggregation record is displayed with an aggregated value of 5. user2 can also drill down to view the five records in the Bangalore node that are associated with DS2. user2 cannot view the records that are associated with DS1.

When user3 logs in to the AP node, one aggregation record with a value of 15 is displayed. When user3 drills down to the India node, two aggregation records are displayed with aggregated values of 10 and 5. user3 can also drill down to view the 15 records in the Bangalore node that are associated with DS1 and DS2.

This example is summarized in the following table:

Table 10. RBAM for Aggregated Search example
AP node: user1, user2, and user3 each see 1 aggregation record with an aggregated value of 15.
India node: user1 sees 1 aggregation record with an aggregated value of 10; user2 sees 1 aggregation record with an aggregated value of 5; user3 sees 2 aggregation records with aggregated values of 10 and 5.
Bangalore node: user1 sees the 10 records from DS1; user2 sees the 5 records from DS2; user3 sees all 15 log file records, 10 from DS1 and 5 from DS2.

Datacenter Topology UI

Use the Datacenter Topology UI to view the hierarchy of the Log Analysis servers in your data center and to navigate from the search page of one Log Analysis server to another connected Log Analysis server. To open the Datacenter Topology UI, click the Datacenter Topology icon. The Datacenter Topology window displays all of the data center nodes. The current node is highlighted in red. You cannot access Log Analysis server nodes that are highlighted in grey. The nodes that can be accessed are those that are specified in the connected parameter in the topology JSON file. For more information, see Defining the topology for Aggregated Search on page 21.

Searching aggregated data

After you configure Aggregated Search, you can search the aggregated data.

About this task
To search aggregated data, you filter for the _aggregations data source as described in this task. You can also use the standard search features. For more information, see Searching data on page 9.

Procedure
1. Click the New Search or Add Search tab to open a new search tab.
2. Select the _aggregations data source from the Data Sources tree.
3. Click Search.
4. To view the aggregated search details, click Grid.
5. Select the log record for which you want to drill down to a specific aggregation.

6. Click the Launch icon. A new window displays the next-level Log Analysis server UI with the corresponding log file records.

Manage Aggregations UI reference

Use the Manage Aggregations UI to create, edit, delete, import, and export aggregation templates. The following table lists the icons on the Manage Aggregations UI:

Table 11. Manage Aggregations UI
Name             Description
Add template     Create a template.
Delete template  Delete the selected templates.
Import template  Import the selected templates.
Export template  Export the selected templates.

The following table lists the fields on the Add Aggregation Template UI:

Table 12. Fields on the Manage Aggregations UI
Name - The name of the aggregation template.
Label - Enter any further descriptions or labels that you want to add.
Description - The aggregation template description.
Associate with - Select the source type with which the template is associated.
Query - Specify the query that you want to use to filter log records for aggregation. You can use the same query syntax that you use for any search. For more information, see Query syntax on page 10.
Query Builder icon - Click this icon to open the Query Builder UI. Use it to create a query based on the regions, aggregation templates, and source types that you want to search. The Query Builder UI is available only on regional nodes; it is not available on the top or bottom nodes in your topology.
Aggregation Function - Select the aggregation function from the drop-down list. Possible values are COUNT, MIN, MAX, and SUM. If you use the COUNT value, you do not need to use the Aggregation Attribute field.

Table 12. Fields on the Manage Aggregations UI (continued)
Aggregation Attribute - Select the attribute that the aggregation function is run against.
Aggregation Window - Select the aggregation time interval. You can select 1, 2, 3, 4, 6, 8, 12, or 24 hours. An aggregation record is created after half this time elapses. For example, if you select 2 hours, a new aggregation record is created every 1 hour that queries for data that was created in the last 2 hours.
Threshold Operator - Select the threshold operator from the drop-down list. You can use less than, greater than, equal to, less than or equal to, and greater than or equal to.
Threshold - Specify the value that triggers the threshold operator.
Datasource Type - Select the data source type from the drop-down list. The type can be Physical or Logical.

Managing aggregation templates

You can import, export, create, edit, and delete aggregation templates.

Creating aggregation templates

Before you can use the Aggregated Search feature, create an aggregation template. Aggregation templates define the data that is aggregated and how it is aggregated before it is sent to the parent node.

Procedure
1. Click Administrative Settings.
2. Click the Manage Aggregations tab.
3. To create a new template, click the Add Template icon.
4. Specify the required parameters as outlined in the following table.

Table 13. Fields on the Manage Aggregations UI
Name - The name of the aggregation template.
Label - Enter any further descriptions or labels that you want to add.
Description - The aggregation template description.
Associate with - Select the source type with which the template is associated.

Table 13. Fields on the Manage Aggregations UI (continued)
Query - Specify the query that you want to use to filter log records for aggregation. You can use the same query syntax that you use for any search. For more information, see Query syntax on page 10.
Query Builder icon - Click this icon to open the Query Builder UI. Use it to create a query based on the regions, aggregation templates, and source types that you want to search. The Query Builder UI is available only on regional nodes; it is not available on the top or bottom nodes in your topology.
Aggregation Function - Select the aggregation function from the drop-down list. Possible values are COUNT, MIN, MAX, and SUM. If you use the COUNT value, you do not need to use the Aggregation Attribute field.
Aggregation Attribute - Select the attribute that the aggregation function is run against.
Aggregation Window - Select the aggregation time interval. You can select 1, 2, 3, 4, 6, 8, 12, or 24 hours. An aggregation record is created after half this time elapses. For example, if you select 2 hours, a new aggregation record is created every 1 hour that queries for data that was created in the last 2 hours.
Threshold Operator - Select the threshold operator from the drop-down list. You can use less than, greater than, equal to, less than or equal to, and greater than or equal to.
Threshold - Specify the value that triggers the threshold operator.
Datasource Type - Select the data source type from the drop-down list. The type can be Physical or Logical.

5. To save the template, click OK.
6. To confirm that you want to save the template, click OK.

What to do next
You can import and export your templates. You can also edit and delete them.
Exporting aggregation templates

You can export aggregation templates to back up your data and to import them into another Log Analysis server.

Procedure
1. Click Administrative Settings.
2. Click the Manage Aggregations tab.
3. Select one or more templates that you want to export, and click the Export Templates icon.
4. In the File Selector field, select the path to the directory where you want to export the templates, and click OK.

What to do next
The aggregation templates that you selected are exported in the JSON format. You can import these templates into other instances of Log Analysis.

Importing aggregation templates

To import aggregation templates, complete these steps.

Before you begin
Ensure that the file that you are importing is in the JSON format and was exported from Log Analysis. When you import an aggregation template, ensure that the required source types exist on the server that you are importing the files into. The source type is specified in the data source configuration. For example, assume that you want to import aggregated data from the DS1 data source, which uses the WASSystemOut source type, and from DS2, which uses the WASSystemOutExtended source type, into another Log Analysis server. However, only the WASSystemOut source type exists on the server. When you try to import the templates, only the template that is associated with the WASSystemOut source type is imported. To avoid this issue, create the missing source type.

You must import an aggregation template into a node that is on the same level in the hierarchy. The level is defined by the children that are specified in the topology file. For more information, see Defining the topology for Aggregated Search on page 21.

Procedure
1.
Export the aggregation template from the source server. For more information, see Exporting aggregation templates.
2. Click Administrative Settings.
3. Click the Manage Aggregations tab.
4. Click the Import Templates icon. Select the file that you want to import, and import the file.

Results
The aggregation templates are imported.

Editing aggregation templates
To edit an aggregation template, complete these steps.

Procedure
1. Click Administrative Settings.
2. Click the Manage Aggregations tab.
3. Select the template that you want to edit.
4. To edit the template, click the Edit icon.
5. Edit the template values as required. For more information about the template values, see Manage Aggregations UI reference.
6. To save the changes to the template, click OK.

Deleting aggregation templates

To delete an aggregation template, complete these steps.

Procedure
1. Click Administrative Settings.
2. Click the Manage Aggregations tab.
3. Select one or more templates that you want to delete.
4. To delete the templates, click the Delete Template icon.
5. To confirm that you want to delete the templates, click OK.

Results
The selected aggregation templates are deleted.

Turning Aggregated Search on and off

To turn the Aggregated Search feature on or off, complete the following steps.

About this task
If you choose to turn off the feature, do so for each instance of Log Analysis. If you turn off the feature for some nodes and not others, it can disrupt the flow of data in your specified topology.

Procedure
1. To stop the Log Analysis server, enter the following command:
<HOME>/IBM/LogAnalysis/utilities/unity.sh -stop
2. To turn the feature on or off, edit the <HOME>/IBM/LogAnalysis/wlp/usr/servers/Unity/apps/Unity.war/WEB-INF/unitysetup.properties file.
To turn the Aggregated Search feature on, change the following property value to true:
ENABLE_AGGREGATION = true
To turn the Aggregated Search feature off, change the following property value to false:

ENABLE_AGGREGATION = false
3. To start the Log Analysis server, enter the following command:
<HOME>/IBM/LogAnalysis/utilities/unity.sh -start

Search dashboards

To display a list of search dashboards, click the Search Dashboards icon on the side bar. The Dashboards group contains the following search dashboards:
sample-events-hotspots
Displays example hot spot reports for the sample events.
WAS Errors and Warnings Dashboard
Displays example reports for errors and warnings that are generated by the sample WebSphere Application Server application.
Sample-Web-App
Displays example reports for errors and warnings that are generated by the sample web application.
The DB2AppInsightPack group contains the following search dashboards:
DB2 Information Links
Displays useful links for more information about DB2.
DB2 Troubleshooting
Displays example reports for DB2.
The ExpertAdvice group contains the following search dashboard:
IBMSupportPortal-ExpertAdvice
Displays search results based on your searches.
The WASAppInsightPack group contains the following search dashboards:
WAS Information Links
Displays useful links for more information about WebSphere Application Server.
WAS Errors and Warnings
Displays example reports based on errors and warnings in WebSphere Application Server.
The WindowsOSEventsInsightPack group contains the following search dashboard:
Windows Events Log Dashboard
Displays example reports that are based on event data from the Windows operating system.
Fix Pack 1 The alertsinsightpack group contains the following search dashboard:
Alerts Dashboard
Displays alert data based on the alerts that are created on the Manage Alerts UI. The charts display the alert information grouped by type and data source, and over time.

Alerts dashboard

Fix Pack 1 Use the Alerts dashboard to view information based on the alerts that you create in Log Analysis. This feature is not available in the Entry Edition.

The data that is displayed on the dashboard is based on the last time that data from one of the data sources that are used for alerts was loaded into Log Analysis. This means that there is a delay between changes to your alerts and the information that is displayed on the dashboard. In this case, you need to wait until data is loaded into Log Analysis again to see the latest data. If the dashboard does not display any data, ensure that the time that you specified matches a period when alerts were created.

Prerequisites
You must install IBM Operations Analytics Log Analysis Fix Pack 1. You must create alerts in Log Analysis.

Using the dashboard
The dashboard displays information on four graphs:
TOTAL ALERTS BY TYPE
Displays the total number of alerts grouped by type.
TOTAL ALERTS BY TYPE OVER TIME
Displays the total number of alerts grouped by type for the chosen time range.
TOTAL ALERTS BY DATASOURCE
Displays the total number of alerts grouped by data source.
TOTAL ALERTS BY DATASOURCE OVER TIME
Displays the total number of alerts grouped by data source for the chosen time range.

Custom Search Dashboards

Custom Search Dashboards allow you to create and run custom scripts and display the output of those scripts in a dashboard. To generate a Custom Search Dashboard, you create a JSON application file in the <HOME>/IBM/LogAnalysis/AppFramework/Apps directory. You can create subdirectories in this directory to organize your applications. After you create your Custom Search Dashboard, the Custom Search Dashboards pane in the Search workspace displays the list of Custom Search Dashboards in the folder structure that you specified. To run a Custom Search Dashboard, locate it in the Custom Search Dashboards pane, right-click it, and click Execute.

For information about developing and customizing Custom Search Dashboards, see the Extending section of the documentation on IBM Knowledge Center.

Expert advice

Expert advice is a Custom Search Dashboard that provides links to contextually relevant information so that you can quickly resolve problems. Using the Expert advice Custom Search Dashboard, you can select any column or cells in the Grid view and launch a search of the IBM support portal (IBMSupportPortal-ExpertAdvice.app). The Custom Search Dashboard searches for matches to unique terms that are contained in the column that you selected. This Custom Search Dashboard can be launched from the Custom Search Dashboards panel in the left navigation pane of the Search workspace.

To increase the likelihood of success, the Custom Search Dashboard removes data that is specific to your environment from each search term. For example, a log message that contains the search string unable to access jarfile /mymachine/foo/bar/foobar.jar is not likely to return a specific match because the server path is likely to be specific to a user. The string is abbreviated to unable to access jarfile to ensure better search results. The criteria that are used to exclude data can be configured. For more information about configuring the Expert advice Custom Search Dashboard, see the Configuring IBM Operations Analytics Log Analysis section of IBM Knowledge Center.

To launch the Expert advice Custom Search Dashboard for the data that is returned by a search, select a column or cell of data in the Grid view, right-click, and click Execute to launch the IBMSupportPortal-ExpertAdvice.app Custom Search Dashboard.

Creating a Custom Search Dashboard

You can use Custom Search Dashboards to extract data from IBM Operations Analytics Log Analysis or from any external source and present that data in a useful format, such as a chart.
A Custom Search Dashboard must include a script to generate data, any parameters that are required by the script, and a specification that defines how the data is displayed. Custom Search Dashboards generate two types of output:
Dashboard data
Create an IBM Operations Analytics Log Analysis dashboard to display charts based on the data that is generated by a Custom Search Dashboard. You can also use a dashboard to display HTML content.
Search filters
Generate keywords and populate the Configured Patterns widget to drill down through a search.

When you use the Insight Pack tooling in Eclipse, the Insight Pack project contains a folder called src-files/unity_apps, specifically for Custom Search Dashboard files. It contains subfolders for apps, chartspecs, and templates, which hold application definitions, chart specifications, and template definitions, respectively.

Defining a Custom Search Dashboard

To create a Custom Search Dashboard, you must create an application file and a script that extracts the data to be displayed in your Custom Search Dashboard. Where appropriate, you can also create a template file that contains an array of JSON objects. Templates define the structure for Custom Search Dashboards and also allow you to share common components across Custom Search Dashboards.

Prerequisite knowledge
Creating a Custom Search Dashboard requires knowledge of these areas:
v Coding in one of the accepted formats: Python, shell scripting, or Java programming
v JSON development

Custom Search Dashboard components
These components are required for a Custom Search Dashboard:

Application
The application file is a JSON file that contains the configuration for your Custom Search Dashboard. The JSON application file is saved with a .app file extension and must be located in the <HOME>/AppFramework/Apps/ directory or in a directory within this directory. Any directory structure that you create is reflected in the Custom Search Dashboards pane in the Search workspace. The application file contains:
1. A reference to the script that defines your Custom Search Dashboard.
2. Parameters that must be passed to the script to generate data, such as user credentials and host name.
3. Specifications that define the layout of your Custom Search Dashboard and the charts that are displayed, including the chart type and/or the HTML to be displayed.

Script
The script that you run to define your Custom Search Dashboard can be a shell script, a Python script, or a Java program. If you are running a Java program, you must bundle it as an executable JAR file. The script performs these actions:
1. Generates an HTTP POST request to an IBM Operations Analytics Log Analysis URL.
2. Extracts the JSON that contains the data generated by the search in the HTTP POST request.
3.
Generates a JSON string that represents HTML and/or data that the charts can process.

Scripts can also be written to extract data from an external source instead of indexed data. The script must be saved to the same folder as the application file and is called by the application file.

Note: In current versions of IBM Operations Analytics Log Analysis, the query syntax is based on the Indexing Engine query syntax rather than the IBM Data Explorer query syntax, which was used in earlier releases. If your Custom Search

Dashboard uses query syntax from an older version of IBM Operations Analytics Log Analysis, you must update the query syntax that is used in your script to match the Indexing Engine based query syntax. For more information, see Appendix A: Query syntax in the Searching Guide.

Templates
Templates contain JSON objects that are similar to the JSON used for the application file. The template JSON objects contain an additional key named template, with the value set to true. Each template can reference one or more custom scripts. Templates are saved with a file extension of .tmpl to the <HOME>/AppFramework/Templates/ directory or a directory within this directory. You must store the script files that are referenced by the template in the same folder as the template. If an application file specifies a template, the script that is located in the same folder as the template is executed. Templates can be used to create a common structure for a group of applications and also to share common configuration elements across applications. Script parameters can be marked as mandatory by adding the parameter required and setting its value to true. You can also set a default value for any parameter.

Steps to create a Custom Search Dashboard
Complete these steps to create a Custom Search Dashboard.

Procedure
1. Ensure that the source log file that you want to use is loaded and processed by IBM Operations Analytics Log Analysis.
2. Using Python, shell scripting, or Java, create a script. The script and its corresponding application or template file must reside in the same directory: <HOME>/AppFramework/Apps. If no value is specified for the type parameter at the top of the application file, the application file runs the script from the same folder as the application file.
Insight Packs and applications: If you want to include the application in an Insight Pack project, the script and the associated application file must be in the same folder within the project: src-files/unity_apps/apps under the project folder.

In the script, specify these actions:
a. Connect to IBM Operations Analytics Log Analysis.
b. Use an HTTP POST request to run a search request against an IBM Operations Analytics URL for JSON data. The query uses the same method as the queries that you can run in the Search workspace. For the query, use the JSON format for search requests as described in the Search runtime API. If you need to narrow down the returned data further, you can parse the data within the script.
c. Format the returned JSON data into a structure that can be consumed by the charts.
3. Create a JSON application file and save it to the <HOME>/AppFramework/Apps/ directory or a subdirectory within this directory.

If you want to include the application in an Insight Pack project, create the JSON application file in the src-files/unity_apps/apps folder. The application file references your script and specifies the chart type and parameters for the dashboard display.

Note: If you are using the Insight Pack tooling, build the Insight Pack that contains the application, and use the pkg_mgmt utility to install the Insight Pack on your IBM Operations Analytics system.

4. From the Custom Apps pane in the Search workspace, run your application and determine whether it displays your data as you intended. Amend the script and application file as required.

Refresh Custom Apps: If you do not see your application listed, refresh the Custom Apps pane in the Search workspace to list the newly installed custom app.

Note: If you close a chart portlet, you must run the Custom Search Dashboard again to reopen the chart.

5. (Optional) If you want to create a template for your application, create the template in the <HOME>/AppFramework/Templates directory. If a template name is specified in the type field, the application file references that template and executes the script in the same directory as that template. That is, the application file and script must be located in the same directory as the template file.

Insight Packs and templates: If you want to include the application in an Insight Pack project, the script, the application file, and the template file must reside in the same folder within the project: src-files/unity_apps/templates under the project folder.

Related concepts:
Search REST API overview
The Search REST interface can be used to execute search queries. A search can be executed through an HTTP POST request on /Search.
Related reference:
Facet requests
Different types of facet requests are supported by USR, along with the JSON format used to specify each type of facet request.
JSON Format for Search Request
The input JSON for a search request is a single JSON object.
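The flow in steps 2a through 2c can be outlined in a Python script. The sketch below is illustrative rather than product-shipped code: the unity.py helpers (UnityConnection, get_response_content) are documented in the Scripts section that follows, while format_for_chart and run_dashboard_script are hypothetical names introduced here for the example.

```python
import json

def format_for_chart(content):
    """Reduce a /Search response to rows that a chart specification can consume.

    Accepts either the raw JSON string or an already-parsed dict; the
    searchResults/attributes keys follow the raw output sample shown later.
    """
    results = json.loads(content) if isinstance(content, str) else content
    rows = []
    for record in results.get("searchResults", []):
        attrs = record.get("attributes", {})
        rows.append({"timestamp": attrs.get("timestamp"),
                     "severity": attrs.get("severity")})
    return rows

def run_dashboard_script(url, username, password):
    """Sketch of the end-to-end flow; requires the unity.py module on PYTHONPATH."""
    from unity import UnityConnection, get_response_content  # shipped with the product
    connection = UnityConnection(url, username, password)
    connection.login()
    # Run a search with an HTTP POST to /Search, per the Search REST API.
    request = {"logsources": [{"type": "logsource", "name": "SystemOut"}],
               "query": "*"}
    response = connection.post("/Search", json.dumps(request),
                               content_type="application/json; charset=utf-8")
    rows = format_for_chart(get_response_content(response))
    connection.logout()
    return rows
```

Keeping the formatting step in a separate function makes it easy to test without a live server connection.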
Scripts

The script that is executed can be written as a Python script, a shell script, or a Java application that is packaged as an executable JAR file. When you execute the application, the parameters are written to a temporary file in JSON format, and the file name is passed to the script. The script reads the parameters from the file and generates its output in the standard JSON format that is required for the dashboard specification. Within an Insight Pack project, scripts must reside in the same folder as the application or template that references them, either in src-files/unity_apps/apps or src-files/unity_apps/templates.

In the script, use an HTTP POST request to query an IBM Operations Analytics URL for JSON data. The query uses the same method as the queries that you can run in the

Search workspace. For the query, use the JSON format for search requests as described in the Search REST API. If you need to narrow down the returned data further, you can parse the data within the script.

Setting environment and connection in Python scripts

If you are using Python, you can import convenience methods from unity.py for interacting with the IBM Operations Analytics server. unity.py is shipped with IBM Operations Analytics and can be found in PYTHONPATH. It contains methods for connecting to the Operations Analytics servers, as well as for making HTTP requests that include the required Cross Site Request Forgery (CSRF) tokens. To use unity.py, include it in an import statement. For example:

from unity import UnityConnection, get_response_content
try:
    import json
except ImportError:
    import simplejson as json
import sys

To connect to the Operations Analytics server, create an instance of UnityConnection() and call the UnityConnection.login() method. The UnityConnection() object takes these parameters:
v url - the base URL of the Operations Analytics server; if you changed the default port number during installation, change it here too
v username - the user name that is used for authentication
v password - the password that is used for authentication

For example:

# Create a new UnityConnection to the server and log in
unity_connection = UnityConnection('https://<server>:<port>', 'unityadmin', 'password')
unity_connection.login()

The parameters to the Python script are written to a file in JSON format, and the file name is passed to the script. For example, the parameters can be passed in a file that contains the following content:

{
  "parameters": [
    {
      "name": "search",
      "type": "SearchQuery",
      "value": {
        "filter": {
          "range": {
            "timestamp": {
              "from": "01/01/ :00: EST",
              "to": "01/01/ :00: EST",
              "dateFormat": "MM/dd/yyyy HH:mm:ss.SSS Z"
            }
          }
        },
        "logsources": [
          {
            "type": "logsource",
            "name": "SystemOut"
          }
        ]
      }
    }
  ]
}

Here is an example of how the parameters can be retrieved and parsed:

# Get the parameters
if len(sys.argv) > 1:
    filename = str(sys.argv[1])
    fk = open(filename, "r")
    data = json.load(fk)
    parameters = data['parameters']
    for i in parameters:
        if i['name'] == 'search':
            search = i['value']
            for key in search.keys():
                if key == 'filter':
                    filter = search['filter']
                elif key == 'logsources':
                    logsource = search['logsources']

To make an HTTP request that uses the Search runtime API, use the connection.post() and get_response_content() methods. For example:

request = {
    "logsources": logsource,  # this is the value from the parameters
    "query": "*",
    "filter": filter  # this is the value from the parameters
}
response = connection.post('/Search', json.dumps(request),
                           content_type='application/json; charset=utf-8')
content = get_response_content(response)
# parse the results found in content
...

Lastly, close the connection at the end of the Python script:

# close the connection
connection.logout()

Related reference:
JSON Format for Search Request
The input JSON for a search request is a single JSON object.
Facet requests
Different types of facet requests are supported by USR, along with the JSON format used to specify each type of facet request.

Example search request:
The search request retrieves JSON data and is the body of the HTTP POST request. This is an example of a JSON search string that is the body of the HTTP POST request.

{
  "start": 0,
  "results": 1,
  "filter": {
    "range": {
      "timestamp": {

50 "from":"5/31/2012 5:37: ", "to":"5/31/2013 5:37: ", "dateformat":"mm/dd/yyyy HH:mm:ss.SSS Z", "query":"severity:(w OR E)", "sortkey":[ "-timestamp" ], "getattributes":[ "timestamp", "severity" ], "facets": "datefacet": "date_histogram": "field":"timestamp", "interval":"hour", "outputdateformat":"mm-dd HH:mm", "nested_facet": "severityfacet": "terms": "field":"severity", "size":10, "logsources":[ "type":"logsource", "name":"/systemout" ] The request specifies: v The number of records returned, in this case it is "results":1. It is helpful to set the number records to a low number when you are building and testing your script. v The date/time range is from May to May 31, 2013: "timestamp":"from":"5/31/2012 5:37: ","to":"5/31/2013 5:37: ","dateFormat":"MM/dd/yyyy HH:mm:ss.SSS Z" v The query searches for Warnings or Errors and the results are sorted by timestamp. Only the timestamp and severity attributes are returned in the results. "query":"severity:(w OR E)", "sortkey":["-timestamp"],"getattributes":["timestamp","severity"], v The logsource is set as the WebSphere Application Server SystemOut log. "logsources":["type":"logsource","name":"/systemout"] v (Optional) Facet requests that you can use to create sums, time interval, or other statistical data for a given field/value pair. In this example there is a date_histogram facet with a nested terms facet. Within each time interval returned by the date_histogram facet, the term facet, called severityfacet, counts the number of each type of severity. "facets":"datefacet":"date_histogram":"field": "timestamp","interval":"hour","outputdateformat":"mm-dd HH:mm", "nested_facet":"severityfacet":"terms":"field":"severity","size":10, 44 IBM Operations Analytics Log Analysis: User's Guide
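As an illustrative sketch only (the concrete timestamps, server URL, and credentials below are hypothetical placeholders, not values from this guide), the example request can be assembled as a Python dictionary and posted with the unity.py helpers described earlier:

```python
import json

def build_search_request():
    # Mirrors the example /Search request body above; the timestamp
    # values are hypothetical placeholders.
    return {
        "start": 0,
        "results": 1,  # keep this low while building and testing the script
        "filter": {"range": {"timestamp": {
            "from": "5/31/2012 5:37:00.000 -0400",
            "to": "5/31/2013 5:37:00.000 -0400",
            "dateFormat": "MM/dd/yyyy HH:mm:ss.SSS Z"}}},
        "query": "severity:(W OR E)",
        "sortkey": ["-timestamp"],
        "getattributes": ["timestamp", "severity"],
        "logsources": [{"type": "logsource", "name": "/SystemOut"}],
    }

def run_search(url, username, password):
    # Requires the product-shipped unity.py on PYTHONPATH; the constructor
    # arguments follow the parameter list described earlier (url, username,
    # password). Not executed here.
    from unity import UnityConnection, get_response_content
    connection = UnityConnection(url, username, password)
    connection.login()
    response = connection.post('/Search', json.dumps(build_search_request()),
                               content_type='application/json; charset=utf-8')
    content = get_response_content(response)
    connection.logout()
    return json.loads(content)
```

run_search() is only a sketch of how the pieces fit together; build_search_request() can be tested offline because it simply returns the request dictionary.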

1,000 or more results and Custom Search Dashboards:

When a query in a Custom Search Dashboard matches more than 1,000 records, only the first 1,000 results are returned. The search result includes a totalResults field, which shows the total number of matching results, and a numResults field, which gives the number of records returned. You can check these values in the Custom Search Dashboard script and handle the results accordingly.

Example JSON output from search request:

Run your script and review the JSON output. Adjust the script to get the output you need.

Search request output

When you post the HTTP Search request, it returns a JSON string in raw format. It is helpful to open the raw output in a JSON editor. These samples show the output from the example search in raw format and in the tabbed format displayed in JSON editors.

This sample is a segment of the returned JSON in raw format.

    {"searchrequest":{"start":0,"results":1,"filter":{"and":
    [{"range":{"timestamp":{"from":"5\/31\/2012 5:37: ",
    "to":"5\/31\/2013 5:37: ",
    "dateFormat":"MM\/dd\/yyyy HH:mm:ss.SSS Z"}}},{"or":[{"phrase":
    {"logsource":"SystemOut"}}]},
    {"range":{"_writetime":{"dateFormat":"yyyy-MM-dd'T'HH:mm:ss.SSSZ","from":
    " T00:00: ","to":" T05:48: "}}}]},
    "query":"severity:(W OR E)",
    "sortkey":["-timestamp"],"getattributes":["timestamp","severity"],
    "facets":{"datefacet":{"date_histogram":{"field":"timestamp","interval":"hour",
    "outputdateformat":"MM-dd HH:mm","nested_facet":{"severityFacet":
    {"terms":{"field":"severity","size":10}}}}}},"_usr_fast_date_query":"UTC",
    "logsources":[{"type":"logsource","name":"\/SystemOut"}],"collections":
    ["WASSystemOut-Collection1"]},"totalResults":805,"numResults":1,"executioninfo":
    {"processingtime":33,"searchtime":54},"searchresults":[{"resultindex":0,"attributes":{
    "msgclassifier":"TCPC0003E","threadid":" ","message":
    "TCP Channel TCP_2 initialization failed. The socket bind failed for host
    SystemOut.log SystemOut.logOneLiners SystemOut.logOneLinersNoTS
    unity_populate_was_log.sh unity_search_pattern_insert.sh was_search_pattern.txt
    x and port The port may already be in use._log_2_",
    "_writetime":"05\/27\/13 12:39:03: ","logsourceHostname":
    "fit-vm12-230","logrecord":"[10\/21\/12 19:57:02:313 GMT+05:30] TCPPort E
    TCPC0003E: TCP Channel TCP_2 initialization failed. The socket bind failed for host
    SystemOut.log SystemOut.logOneLiners SystemOut.logOneLinersNoTS
    unity_populate_was_log.sh unity_search_pattern_insert.sh was_search_pattern.txt
    x and port The port may already be in use._log_2_",
    "timestamp":"10\/21\/12 14:27:02: ","severity":"E","logsource":"SystemOut"}}],
    "facetresults":{"datefacet":[{"label":" UTC","low":" :00",
    "high":" :00","count":425,"nested_facet":{"severityFacet":{"counts":
    [{"term":"W","count":221},{"term":"E","count":204}],"total":2}}},{"label":
    " UTC","low":" :00","high":" :00","count":380,
    "nested_facet":{"severityFacet":{"counts":[{"term":"E","count":197},
    {"term":"W","count":183}],"total":2}}}],
    "metadata":{"msgclassifier":{"filterable":true,"datatype":"TEXT","sortable":false},...
    "exceptionmethodname":{"filterable":false,"datatype":"TEXT","sortable":false},
    "severity":{"filterable":true,"datatype":"TEXT"
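Because a single request returns at most 1,000 records, a script that needs every matching record must page through them with the start and results parameters, comparing totalResults against the number fetched so far. A minimal sketch, assuming a post_search callable that stands in for the unity.py POST-and-parse step, and a searchResults key for the result rows (both assumptions for illustration):

```python
PAGE_SIZE = 1000  # server-side cap on records per request

def fetch_all(post_search, request):
    """Collect every matching record by paging with start/results.

    post_search: hypothetical callable that sends the request dict to
    /Search and returns the parsed JSON response.
    """
    rows = []
    start = 0
    while True:
        page = dict(request, start=start, results=PAGE_SIZE)
        response = post_search(page)
        rows.extend(response.get("searchResults", []))
        fetched = response.get("numResults", 0)
        start += fetched
        if fetched == 0 or start >= response.get("totalResults", 0):
            break
    return rows
```

The fetched == 0 guard prevents an infinite loop if the server returns fewer pages than totalResults suggests.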

This sample shows the same results formatted in a JSON editor.

    {
        "searchrequest": {
            "start": 0,
            "results": 1,
            "filter": {
                "and": [
                    {
                        "range": {
                            "timestamp": {
                                "from": "5/31/2012 5:37: ",
                                "to": "5/31/2013 5:37: ",
                                "dateFormat": "MM/dd/yyyy HH:mm:ss.SSS Z"
                            }
                        }
                    },
                    {
                        "or": [
                            {
                                "phrase": {
                                    "logsource": "SystemOut"
                                }
                            }
                        ]
                    },
                    {
                        "range": {
                            "_writetime": {
                                "dateFormat": "yyyy-MM-dd'T'HH:mm:ss.SSSZ",
                                "from": " T00:00: ",
                                "to": " T05:48: "
                            }
                        }
                    }
                ]
            },
            "query": "severity:(W OR E)",
            "sortkey": ["-timestamp"],
            "getattributes": ["timestamp", "severity"],
            "facets": {
                "datefacet": {
                    "date_histogram": {
                        "field": "timestamp",
                        "interval": "hour",
                        "outputdateformat": "MM-dd HH:mm",
                        "nested_facet": {
                            "severityFacet": {
                                "terms": {
                                    "field": "severity",
                                    "size": 10
                                }
                            }
                        }
                    }
                }
            },
            "_usr_fast_date_query": "UTC",
            "logsources": [
                {
                    "type": "logsource",
                    "name": "/SystemOut"
                }
            ],
            "collections": ["WASSystemOut-Collection1"]
        },
        "totalResults": 805,
        "numResults": 1,
        "executioninfo": {
            "processingtime": 33,
            "searchtime": 54
        },
        "searchresults": [
            {
                "resultindex": 0,
                "attributes": {
                    "msgclassifier": "TCPC0003E",
                    "threadid": " ",
                    "message": "TCP Channel TCP_2 initialization failed. The socket bind failed for host SystemOut.log SystemOut.logOneLiners SystemOut.logOneLinersNoTS unity_populate_was_log.sh unity_search_pattern_insert.sh was_search_pattern.txt x and port The port may already be in use._log_2_",
                    "_writetime": "05/27/13 12:39:03: ",
                    "logsourceHostname": "fit-vm12-230",
                    "logrecord": "[10/21/12 19:57:02:313 GMT+05:30] TCPPort E TCPC0003E: TCP Channel TCP_2 initialization failed. The socket bind failed for host SystemOut.log SystemOut.logOneLiners SystemOut.logOneLinersNoTS unity_populate_was_log.sh unity_search_pattern_insert.sh was_search_pattern.txt x and port The port may already be in use._log_2_",
                    "timestamp": "10/21/12 14:27:02: ",
                    "severity": "E",
                    "logsource": "SystemOut"
                }
            }
        ],
        "facetresults": {
            "datefacet": [
                {
                    "label": " UTC",
                    "low": " :00",
                    "high": " :00",
                    "count": 425,
                    "nested_facet": {
                        "severityFacet": {
                            "counts": [
                                {"term": "W", "count": 221},
                                {"term": "E", "count": 204}
                            ],
                            "total": 2
                        }
                    }
                },
                ...
            ]
        },
        "metadata": {
            ...,
            "logsource": {
                "filterable": true,
                "datatype": "TEXT",
                "sortable": true
            }
        }
    }

Note: The totalResults value shows the total number of matching results. The numResults value shows the number of returned results. The search request specified the return of one result ("results":1).

JSON output from a script:

You can specify a Custom Search Dashboard with multiple charts, with each chart pointing to a different data element. Data elements can be in the form of data or HTML. The script returns a JSON structure that contains the data that is used to populate the charts, or HTML to be displayed.

You can also configure Custom Search Dashboards to return error messages in the following format:

    "error_msg": "<error_message_string>"

where <error_message_string> is the message that you want to display.

The following example shows the output parameters for a Custom Search Dashboard. The first data set contains the correct data that is used to render the chart. The second data set contains an error message.

    {
        "data": [
            {
                "rows": [
                    {
                        "userid": "116",
                        "count": 9
                    }
                ],
                "fields": [
                    {
                        "label": "userid",
                        "type": "TEXT",
                        "id": "userid"
                    },
                    {
                        "label": "count",
                        "type": "LONG",
                        "id": "count"
                    }
                ],
                "id": "DynamicDashboardSearch_0_"
            },
            {
                "error_msg": "CTGLA2107E : Custom applications need to be configured for every use case with details of data sources. No valid data sources were specified in the search request. Verify that the chosen data sources exist and that any specified tags contain data source descendants. For further information on how to configure the custom applications, refer to the documentation.",
                "id": "DynamicDashboardSearch_1_"
            }
        ]
    }

Example Script:

The full script example shown here contains all the required elements. In this example script, the values for some elements are defined as variables inside the script. For example, the elements of the search request such as the logsource

and query are defined as variables. Custom scripts should return date field data only in the supported formats for charts to render correctly. Supported formats include yyyy-MM-ddTHH:mm:ssZ and MM/dd/yy HH:mm:ss:SSS Z.

    from unity import UnityConnection, get_response_content
    from urllib import urlencode
    import CommonAppMod
    try:
        import json
    except ImportError:
        import simplejson as json
    import sys
    import re
    from datetime import datetime, date, time, timedelta
    import UnityAppMod

    # Create a new Unity connection to the server and login
    unity_connection = UnityConnection('unityadmin', 'unityadmin')
    unity_connection.login()

    ##########################################################
    # Calculates the current time, relative time, and forms timestamps
    ##########################################################
    currentTimestamp = datetime.now()
    # Currently chosen relative timestamp
    newTimestamp = currentTimestamp - UnityAppMod.getLastYear()

    # Format the timestamps
    currentFormatted = UnityAppMod.formatTimestamp(currentTimestamp)
    newFormatted = UnityAppMod.formatTimestamp(newTimestamp)

    # Define the logsource for the search
    logsource = '"logsources":[{"type":"logsource","name":"/SystemOut"}]'

    # Define the output data
    chartdata = []

    timeunit = 'hour'
    timeunitformat = 'MM-dd HH:mm:ss.SSS Z'
    timestamp = '"timestamp":{"from":"' + newFormatted + '","to":"' + currentFormatted + '","dateFormat":"MM/dd/yy HH:mm:ss.SSS Z"}'

    # Parameter defaults
    start = 0
    results = 1

    ##########################################################
    def getsearchdata(query, facet, sortkey, attributes):
        # Search query to be used as content for POST
        body = '{"start":' + str(start) + ',"results":' + str(results) + \
            ',"filter":{"range":{' + timestamp + '}},' + \
            query + ',' + \
            sortkey + ',' + \
            attributes + ',' + \
            facet
        #print body
        if logsource:
            body = body + ',' + logsource

        body = body + '}'
        #print body

        # Post the search request
        response = unity_connection.post('/Search', body,
            content_type='application/json; charset=utf-8')
        content = get_response_content(response)

        # Convert the response data to JSON
        data = {}
        try:
            data = json.loads(content)
            #print json.dumps(data, sort_keys=False, indent=4, separators=(',', ': '))
        except:
            pass

        if 'result' in data:
            result = data['result']
            if 'status' in result and result['status'] == 'failure':
                msg = result['message']
                print >> sys.stderr, msg
                sys.exit(1)
        return data
    ##########################################################

    ##########################################################
    def geterrorswarningsvstime(chartdata):
        # Define the query for the search
        query = '"query":"severity:(W OR E)"'

        # Define the facet to be used for the search
        facet = '"facets":{"datefacet":{"date_histogram":{"field":"timestamp",' \
            '"interval":"' + timeunit + '","outputdateformat":"' + timeunitformat + \
            '","nested_facet":{"severityFacet":{"terms":{"field":"severity","size":10}}}}}}'

        # Define the sortkey to be used for the search results
        sortkey = '"sortkey":["-timestamp"]'

        # get the facetfields
        facetfields = [
            {"id": "date", "label": "timestamp", "type": "DATE"},
            {"id": "severity", "label": "severity", "type": "TEXT"},
            {"id": "count", "label": "count", "type": "LONG"}
        ]
        facetrows = []

        # Define the attributes
        attributes = '"getattributes":["timestamp","severity"]'

        # do the query
        data = getsearchdata(query, facet, sortkey, attributes)
        if 'facetresults' in data:
            # get the facet results
            facetresults = data['facetresults']
            #print json.dumps(facetresults, sort_keys=False, indent=4, separators=(',', ': '))
            if 'datefacet' in facetresults:
                # get the facetrows
                datefacet = facetresults['datefacet']
                CommonAppMod.dateSort(datefacet)
                for daterow in datefacet:

                    for severityrow in daterow['nested_facet']['severityFacet']['counts']:
                        facetrows.append({"date": daterow['low'],
                                          "severity": severityrow['term'],
                                          "count": severityrow['count']})
        #print facetrows
        chartdata.append({'id': 'ErrorsWarningsVsTime',
                          'fields': facetfields,
                          'rows': facetrows})
        return chartdata
    ##########################################################

    # Define the timestamp for the search
    #timestamp = '"timestamp":{"from":"' + newFormatted + '","to":"' + currentFormatted + '","dateFormat":"MM/dd/yy HH:mm:ss.SSS Z"}'
    # Define the logsource for the search
    #logsource = '"logsources":[{"type":"logsource","name":"/SystemOut"}]'

    geterrorswarningsvstime(chartdata)
    unity_connection.logout()

    #
    # Build the final output data JSON
    #
    # build the chart data
    dt = {'data': chartdata}

    #Print the JSON to system out
    #print(json.dumps(dt, sort_keys=False, indent=4, separators=(',', ': ')))
    print json.dumps(dt, sort_keys=False, indent=4, separators=(',', ': '))

Application files

The application file JSON can be created by implementing the structure provided in the sample JSON outlined in this topic. If you are using the Insight Pack Tooling, the application file JSON can be created in the Insight Pack project in the subfolder src-files/unity_apps/apps.

The parameters defined for application files are:

name
    Specify the name of the application file. This is the name of the .app file that displays in the Custom Search Dashboards pane in the Search workspace.
type
    (Optional) Where applicable, specify the name of the template used. The name you specify in this field is the name of the template file excluding the file extension.
description
    Specify the purpose of the application.
customlogic
    Specify the details for the script, including any parameters, and outline the details for the output that is to be displayed. These parameters are defined for customlogic:
    script
        Specify the name of the script that you want to execute. The script can be a Python or shell script, or a Java application packaged as an executable JAR file. If you are going to use a template, the script you execute must be in the same directory as the template. If you are not using a template, save the script to the same directory as the application file.
    description
        Provide a description for the script.
    timeout
        (Optional) Specify in seconds the time after which the search queries issued by the Custom Search Dashboard are stopped. This per-app parameter has no default value.
    parameters
        Complete a JSON array of parameters that are passed to the script. This array can be empty. The parameters you pass depend on the requirements of the script that you are executing. In the first set of parameters in the sample, the parameters for a search query are passed.
output
    This section outlines the nature of the output that you want to display for your Custom Search Dashboard.
    type
        Specify Data as the value for this parameter.
    visualization
        Add dashboard or searchfilters as a sub-parameter for this parameter. This determines what is displayed when you execute a Custom Search Dashboard.
        The dashboard parameter requires that you specify the layout for the dashboard that you are creating. Under the dashboard parameter, add a columns sub-parameter and specify a numeric value to indicate the number of columns that you require. Add charts as an additional sub-parameter for the dashboard parameter and add an array of JSON objects to indicate which charts you want to include in your dashboard. Include only valid chart types and include the required parameters for each chart. For more information, see the Charts section of this document.
        The searchfilters parameter is for the search filter app and requires a relationalquery as the input parameter. The relational query is a JSON object that consists of JSON keys. For more information, see Search filter app.
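Putting those parameters together, a minimal application file can be generated with a short Python sketch. The app name, script name, and chart choices below are hypothetical examples, not product defaults:

```python
import json

# Hypothetical minimal .app structure following the parameters above:
# name, description, customlogic (script, description, parameters), and
# output (type, visualization.dashboard with columns and charts).
app = {
    "name": "Example Errors Dashboard",        # hypothetical app name
    "description": "Chart errors over time",
    "customlogic": {
        "script": "example_script.py",         # hypothetical script name
        "description": "Builds chart data from a search",
        "parameters": []                       # may be an empty array
    },
    "output": {
        "type": "Data",
        "visualization": {
            "dashboard": {
                "columns": 1,
                "charts": [
                    {
                        "type": "Line Chart",
                        "title": "Errors over time",
                        "data": {"$ref": "ErrorsWarningsVsTime"},
                        "parameters": {"xaxis": "date", "yaxis": "count"}
                    }
                ]
            }
        }
    }
}

app_json = json.dumps(app, indent=4)
# Written to disk this would become an .app file, for example:
# open('ExampleErrorsDashboard.app', 'w').write(app_json)
```

Generating the file this way guarantees the JSON is well formed before it is placed alongside its script.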
Example Dashboard visualizations

This example references the ExecuteSearchWithParameters.py script and specifies a bubble chart based on data from the chartdata01 data set. The chartdata01 data set ID is defined in the ExecuteSearchWithParameters.py script and referenced in the chart specification.

    {
        "name": "Search Results from python script - Parameters",
        "type": "SearchApps",
        "description": "Captures long running queries in maximo logs",
        "customlogic": {
            "script": "ExecuteSearchWithParameters.py",
            "description": "View chart on search results",
            "timeout":
            "parameters": [
                {
                    "name": "search",
                    "type": "SearchQuery",
                    "value": {
                        "start": 0,
                        "results": 10,
                        "filter": {
                            "range": {
                                "timestamp": {
                                    "from": "12/03/ :21: India Standard Time",
                                    "to": "12/03/ :21: India Standard Time",
                                    "dateFormat": "dd/MM/yyyy HH:mm:ss.SSS Z"
                                }
                            }
                        },
                        "logsources": [{"type": "tag", "name": "*"}],
                        "query": "*"
                    }
                },
                {
                    "name": "url",
                    "type": "String",
                    "value": "http: // : 9988/Unity/"
                }
            ]
        },
        "output": {
            "type": "Data",
            "visualization": {
                "dashboard": {
                    "columns": 1,
                    "charts": [
                        {
                            "type": "Bubble Chart",
                            "title": "Severity against Time",
                            "data": {"$ref": "chartdata01"},
                            "parameters": {"xaxis": "timestamp", "yaxis": "severity"}
                        }
                    ]
                }
            }
        }
    }

Related tasks:
Creating the search filter app on page 72
    You can create the search filter app to create a customized search query in IBM Operations Analytics Log Analysis.
Modifying the search filter app core logic on page 77
    If you want to extend the default implementation, the core logic of the search filter app is described here.

Charts:

The chart types specified here are supported by IBM Operations Analytics Log Analysis. The chart specifications are contained in the <HOME>/AppFramework/chartspecs directory.

Displaying a chart:

To display a chart, you execute your Custom Search Dashboard from the Search workspace. If you close a chart portlet, you must run the Custom Search Dashboard again to reopen the chart.

These parameters are defined for all charts:

type
    Specify the type of chart required. The value must match the ID of a chart specification that is contained in the <HOME>/AppFramework/chartspecs directory. The supported chart types are outlined in this section.
title
    Specify the chart title.

data
    Specify the ID for the data element that is represented in the chart. The ID specified must match the ID provided in the dashboard specifications.
parameters
    The fields to be displayed in the chart.

Line chart

The line chart is defined with these limitations:
v Chart name: Line Chart
v Parameters: xaxis and yaxis
v Chart Specification:

    {
        "type": "Line Chart",
        "title": "Line Chart ( 2 parameters )",
        "data": {"$ref": "searchresults01"},
        "parameters": {
            "xaxis": "timestamp",
            "yaxis": "throughput"
        }
    }

v Aggregation

To aggregate the throughput parameter, define the following code:

    {
        "type": "Line Chart",
        "title": "Line Chart ( 2 parameters )",
        "data": {"$ref": "searchresults01"},
        "summarizedata": {
            "column": "throughput",
            "function": "sum"
        },
        "parameters": {
            "xaxis": "timestamp",
            "yaxis": "throughput"
        }
    }

The summarizedata key determines whether aggregation is performed. column is the name of a numeric column, of the LONG or DOUBLE data type, to perform aggregation on. sum is the aggregation function to be applied. Supported functions are sum, min, and max.

Bar chart

The bar chart is defined with these limitations:
v Chart name: Bar Chart
v Parameters: xaxis, yaxis, and categories
v Limitations: Only integer values are supported for the yaxis parameter.
v Chart Specification:

    {
        "type": "Bar Chart",
        "title": "Bar Chart ( 3 parameters )",
        "data": {"$ref": "searchresults01"},
        "parameters": {
            "xaxis": "timestamp",

            "yaxis": "CPU",
            "categories": "hostname"
        }
    }

Point chart

The point chart is defined with these limitations:
v Chart name: Point Chart
v Parameters: xaxis and yaxis
v Chart Specification:

    {
        "type": "Point Chart",
        "title": "Point Chart ( 2 parameters )",
        "data": {"$ref": "searchresults01"},
        "parameters": {
            "xaxis": "timestamp",
            "yaxis": "errorcode"
        }
    }

Pie chart

The pie chart is defined with these limitations:
v Chart name: Pie Chart
v Parameters: xaxis and yaxis
v Chart Specification:

    {
        "type": "Pie Chart",
        "title": "Pie Chart ( 2 parameters )",
        "data": {"$ref": "searchresults03"},
        "parameters": {
            "xaxis": "count",
            "yaxis": "severity"
        }
    }

Cluster bar chart

The cluster bar chart is defined with these limitations:
v Chart name: Cluster Bar
v Parameters: xaxis, yaxis, and sub-xaxis
v Chart Specification:

    {
        "type": "Cluster Bar",
        "title": "Cluster Bar ( 3 parameters )",
        "data": {"$ref": "searchresults02"},
        "parameters": {
            "xaxis": "hostname",
            "yaxis": "errorcount",
            "sub-xaxis": "msgclassifier"
        }
    }
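The summarizedata aggregation described earlier for the line chart can be illustrated with a short client-side sketch. This is not product code, only a demonstration of the documented semantics: rows are grouped by the x-axis field and the named numeric column is reduced with sum, min, or max:

```python
# Illustrative sketch of the summarizedata semantics (sum, min, max).
def summarize(rows, xaxis, column, function):
    """Group rows by the xaxis field and aggregate `column` with `function`."""
    funcs = {"sum": sum, "min": min, "max": max}  # supported functions
    agg = funcs[function]
    groups = {}
    for row in rows:
        groups.setdefault(row[xaxis], []).append(row[column])
    return [{xaxis: key, column: agg(values)} for key, values in groups.items()]
```

For example, two throughput samples at the same timestamp collapse into one summed point, which is what the chart renders when summarizedata is present.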

Bubble chart

The bubble chart is defined with these limitations:
v Chart name: Bubble Chart
v Parameters: xaxis, yaxis, and categories
v Chart Specification:

    {
        "type": "Bubble Chart",
        "title": "Bubble Chart ( 3 parameters )",
        "data": {"$ref": "searchresults01"},
        "parameters": {
            "xaxis": "timestamp",
            "yaxis": "CPU",
            "categories": "errorcode"
        }
    }

The size of the bubble on the graph depends on the number of items in the parameter that is being represented. In some cases, for example if you have a large bubble and a small bubble, the large bubble may cover the smaller one.

Tree map chart

The tree map chart is defined with these limitations:
v Chart name: Tree Map
v Parameters: level1, level2, level3, and value
v Chart Specification:

    {
        "type": "Tree Map",
        "title": "Tree Map ( 4 parameters )",
        "data": {"$ref": "searchresults01"},
        "parameters": {
            "level1": "hostname",
            "level2": "errorcode",
            "level3": "severity",
            "value": "CPU"
        }
    }

Two-series line chart

The two-series line chart is defined with these limitations:
v Chart name: Two Series Line Chart
v Parameters: xaxis, yaxis1, and yaxis2
v Chart Specification:

    {
        "type": "Two Series Line Chart",
        "title": "Two Series Line Chart ( 3 parameters )",
        "data": {"$ref": "searchresults01"},
        "parameters": {
            "xaxis": "timestamp",

            "yaxis1": "throughput",
            "yaxis2": "ResponseTime"
        }
    }

Stacked bar chart

The stacked bar chart is defined with these limitations:
v Chart name: Stacked Bar Chart
v Parameters: xaxis, yaxis, and categories
v Chart Specification:

    {
        "type": "Stacked Bar Chart",
        "title": "Stacked Bar Chart ( 3 parameters )",
        "data": {"$ref": "searchresults01"},
        "parameters": {
            "xaxis": "hostname",
            "yaxis": "CPU",
            "categories": "severity"
        }
    }

Stacked line chart

The stacked line chart is defined with these limitations:
v Chart name: Stacked Line Chart
v Parameters: xaxis, yaxis, and categories
v Chart Specification:

    {
        "type": "Stacked Line Chart",
        "title": "Stacked Line Chart ( 3 parameters )",
        "data": {"$ref": "searchresults01"},
        "parameters": {
            "xaxis": "threadid",
            "yaxis": "timestamp",
            "categories": "MBO"
        }
    }

Heat map

The heat map chart is defined with these limitations:
v Chart name: Heat Map
v Parameters: xaxis, yaxis, and count
v Chart Specification:

    {
        "type": "Heat Map",
        "title": "Heat Map ( 3 parameters )",
        "data": {"$ref": "searchresults01"},
        "parameters": {
            "xaxis": "messageclassifier",

            "yaxis": "hostname",
            "count": "throughput"
        }
    }

Example application file:

This example shows an application file that references the 3Params_was_systemout.py script example and specifies how multiple charts display the output JSON from the script.

Example application file and chart dashboard

This example shows the application file and the dashboard of charts that it creates when you run the application. The ErrorsWarningsVsTime and ExceptionByHost data sets are defined in the 3Params_was_systemout.py script and referenced in the chart specifications in this application file.

    {
        "name": "WAS Errors and Warnings Dashboard",
        "description": "Display a dashboard of charts that show WAS errors and warnings",
        "customlogic": {
            "script": "3Params_was_systemout.py",
            "description": "View charts based on search results",
            "parameters": [
                {
                    "name": "search",
                    "type": "SearchQuery",
                    "value": {
                        "logsources": [
                            {
                                "type": "logsource",
                                "name": "SystemOut"
                            }
                        ]
                    }
                },
                {
                    "name": "relativetimeinterval",
                    "type": "string",
                    "value": "lastday"
                },
                {
                    "name": "timeformat",
                    "type": "data",
                    "value": {
                        "timeunit": "hour",
                        "timeunitformat": "MM-dd HH:mm"
                    }
                },
                {
                    "name": "hostnamefield",
                    "type": "string",
                    "value": "logsource_hostname"
                }
            ]
        },
        "output": {
            "type": "Data",
            "visualization": {
                "dashboard": {
                    "columns": 2,
                    "charts": [
                        {
                            "title": "Message Counts - Top 5 over Last Day",
                            "parameters": {

                                "xaxis": "date",
                                "yaxis": "count",
                                "sub-xaxis": "severity"
                            },
                            "data": {"$ref": "ErrorsWarningsVsTime"},
                            "type": "Cluster Bar"
                        },
                        {
                            "type": "Heat Map",
                            "title": "Message Counts - Top 5 over Last Day",
                            "data": {"$ref": "ErrorsWarningsVsTime"},
                            "parameters": {
                                "xaxis": "date",
                                "yaxis": "count",
                                "category": "severity"
                            }
                        },
                        {
                            "type": "Bubble Chart",
                            "title": "Java Exception Counts - Top 5 over Last Day",
                            "data": {"$ref": "ErrorsWarningsVsTime"},
                            "parameters": {
                                "xaxis": "date",
                                "yaxis": "count",
                                "size": "severity"
                            }
                        },
                        {
                            "type": "Two Series Line Chart",
                            "title": "Error and Warning Total by Hostname - Top 5 over Last Day",
                            "data": {"$ref": "ErrorsWarningsVsTime"},
                            "parameters": {
                                "xaxis": "date",
                                "yaxis1": "count",
                                "yaxis2": "severity"
                            }
                        },
                        {
                            "type": "Tree Map",
                            "title": "Messages by Hostname - Top 5 over Last Day",
                            "data": {"$ref": "ErrorsWarningsVsTime"},
                            "parameters": {
                                "level1": "date",
                                "level2": "count",
                                "level3": "severity",
                                "value": "count"
                            }
                        },
                        {
                            "type": "Stacked Bar Chart",
                            "title": "Java Exception by Hostname - Top 5 over Last Day",
                            "data": {

                                "$ref": "ExceptionByHost"
                            },
                            "parameters": {
                                "xaxis": "date",
                                "yaxis": "count",
                                "categories": "hostname"
                            }
                        }
                    ]
                }
            }
        }
    }

This sample shows a chart created from the example script and application files.

Using a Custom Search Dashboard to display a search query or selected data:

You can create a Custom Search Dashboard that displays, in a new tab, the full syntax for the current search and also displays the contents of a selected column or individual cell.

Create an application and include the _query and _data parameters to ensure that your data is displayed. The purpose of these parameters is:

_query
    Displays a JSON string containing the currently active search. Specify _query in the name field for the parameter. The type field must contain the value additionalparameterfromui.
_data
    Displays the data from any column or cell that you select in Grid view. To display the data, select a column or cell in Grid view and then launch the


More information

Cisco TEO Adapter Guide for Microsoft Windows

Cisco TEO Adapter Guide for Microsoft Windows Cisco TEO Adapter Guide for Microsoft Windows Release 2.3 April 2012 Americas Headquarters Cisco Systems, Inc. 170 West Tasman Drive San Jose, CA 95134-1706 USA http://www.cisco.com Tel: 408 526-4000 800

More information

Extended Search Administration

Extended Search Administration IBM Lotus Extended Search Extended Search Administration Version 4 Release 0.1 SC27-1404-02 IBM Lotus Extended Search Extended Search Administration Version 4 Release 0.1 SC27-1404-02 Note! Before using

More information

Sitecore Experience Platform 8.0 Rev: September 13, Sitecore Experience Platform 8.0

Sitecore Experience Platform 8.0 Rev: September 13, Sitecore Experience Platform 8.0 Sitecore Experience Platform 8.0 Rev: September 13, 2018 Sitecore Experience Platform 8.0 All the official Sitecore documentation. Page 1 of 455 Experience Analytics glossary This topic contains a glossary

More information

Using the VMware vcenter Orchestrator Client. vrealize Orchestrator 5.5.1

Using the VMware vcenter Orchestrator Client. vrealize Orchestrator 5.5.1 Using the VMware vcenter Orchestrator Client vrealize Orchestrator 5.5.1 You can find the most up-to-date technical documentation on the VMware website at: https://docs.vmware.com/ If you have comments

More information

Netcool Configuration Manager Version 6 Release 4. Reference Guide R2E3

Netcool Configuration Manager Version 6 Release 4. Reference Guide R2E3 Netcool Configuration Manager Version 6 Release 4 Reference Guide R2E3 Netcool Configuration Manager Version 6 Release 4 Reference Guide R2E3 Note Before using this information and the product it supports,

More information

Pure Storage FlashArray Management Pack for VMware vrealize Operations Manager User Guide. (Version with Purity 4.9.

Pure Storage FlashArray Management Pack for VMware vrealize Operations Manager User Guide. (Version with Purity 4.9. Pure Storage FlashArray Management Pack for VMware vrealize Operations Manager User Guide (Version 1.0.139 with Purity 4.9.x or higher) Sunday, November 27, 2016 16:13 Pure Storage FlashArray Management

More information

SAS Web Report Studio 3.1

SAS Web Report Studio 3.1 SAS Web Report Studio 3.1 User s Guide SAS Documentation The correct bibliographic citation for this manual is as follows: SAS Institute Inc. 2006. SAS Web Report Studio 3.1: User s Guide. Cary, NC: SAS

More information

Griffin Training Manual Grif-WebI Introduction (For Analysts)

Griffin Training Manual Grif-WebI Introduction (For Analysts) Griffin Training Manual Grif-WebI Introduction (For Analysts) Alumni Relations and Development The University of Chicago Table of Contents Chapter 1: Defining WebIntelligence... 1 Chapter 2: Working with

More information

Tivoli Monitoring Agent for IBM Tivoli Monitoring 5.x Endpoint

Tivoli Monitoring Agent for IBM Tivoli Monitoring 5.x Endpoint Tivoli Monitoring Agent for IBM Tivoli Monitoring 5.x Endpoint Version 6.1.0 User s Guide SC32-9490-00 Tivoli Monitoring Agent for IBM Tivoli Monitoring 5.x Endpoint Version 6.1.0 User s Guide SC32-9490-00

More information

ER/Studio Enterprise Portal User Guide

ER/Studio Enterprise Portal User Guide ER/Studio Enterprise Portal 1.0.3 User Guide Copyright 1994-2009 Embarcadero Technologies, Inc. Embarcadero Technologies, Inc. 100 California Street, 12th Floor San Francisco, CA 94111 U.S.A. All rights

More information

Table of Contents. Table of Contents

Table of Contents. Table of Contents Powered by 1 Table of Contents Table of Contents Dashboard for Windows... 4 Dashboard Designer... 5 Creating Dashboards... 5 Printing and Exporting... 5 Dashboard Items... 5 UI Elements... 5 Providing

More information

Business Service Manager Version Scenarios Guide SC

Business Service Manager Version Scenarios Guide SC Business Service Manager Version 6.1.0.1 Scenarios Guide SC23-6043-08 Business Service Manager Version 6.1.0.1 Scenarios Guide SC23-6043-08 Note Before using this information and the product it supports,

More information

Netcool Configuration Manager Version Administration Guide R2E4

Netcool Configuration Manager Version Administration Guide R2E4 Netcool Configuration Manager Version 6.4.1 Administration Guide R2E4 Netcool Configuration Manager Version 6.4.1 Administration Guide R2E4 Note Before using this information and the product it supports,

More information

Manage and Generate Reports

Manage and Generate Reports Report Manager, page 1 Generate Reports, page 3 Trust Self-Signed Certificate for Live Data Reports, page 4 Report Viewer, page 4 Save an Existing Stock Report, page 7 Import Reports, page 7 Export Reports,

More information

Web Dashboard User Guide

Web Dashboard User Guide Web Dashboard User Guide Version 10.6 The software supplied with this document is the property of RadView Software and is furnished under a licensing agreement. Neither the software nor this document may

More information

Cisco TEO Adapter Guide for

Cisco TEO Adapter Guide for Release 2.3 April 2012 Americas Headquarters Cisco Systems, Inc. 170 West Tasman Drive San Jose, CA 95134-1706 USA http://www.cisco.com Tel: 408 526-4000 800 553-NETS (6387) Fax: 408 527-0883 Text Part

More information

IBM Tivoli Composite Application Manager for WebSphere Application Server Version 7.1. Installation Guide

IBM Tivoli Composite Application Manager for WebSphere Application Server Version 7.1. Installation Guide IBM Tivoli Composite Application Manager for WebSphere Application Server Version 7.1 Installation Guide IBM Tivoli Composite Application Manager for WebSphere Application Server Version 7.1 Installation

More information

Widgets for SAP BusinessObjects Business Intelligence Platform User Guide SAP BusinessObjects Business Intelligence platform 4.1 Support Package 2

Widgets for SAP BusinessObjects Business Intelligence Platform User Guide SAP BusinessObjects Business Intelligence platform 4.1 Support Package 2 Widgets for SAP BusinessObjects Business Intelligence Platform User Guide SAP BusinessObjects Business Intelligence platform 4.1 Support Package 2 Copyright 2013 SAP AG or an SAP affiliate company. All

More information

Service Configuration Guide

Service Configuration Guide Business Service Manager Version 6.1.1 Service Configuration Guide SC23-6041-09 Business Service Manager Version 6.1.1 Service Configuration Guide SC23-6041-09 Note Before using this information and the

More information

IBM Leads Version 9 Release 1 October 25, User Guide

IBM Leads Version 9 Release 1 October 25, User Guide IBM Leads Version 9 Release 1 October 25, 2013 User Guide Note Before using this information and the product it supports, read the information in Notices on page 35. This edition applies to version 9,

More information

Tivoli IBM Tivoli Monitoring for Network Performance

Tivoli IBM Tivoli Monitoring for Network Performance Tivoli IBM Tivoli Monitoring for Network Performance Version 2 Release 1 Operator Guide SC31-6365-00 Tivoli IBM Tivoli Monitoring for Network Performance Version 2 Release 1 Operator Guide SC31-6365-00

More information

ER/Studio Enterprise Portal 1.1 User Guide

ER/Studio Enterprise Portal 1.1 User Guide ER/Studio Enterprise Portal 1.1 User Guide Copyright 1994-2009 Embarcadero Technologies, Inc. Embarcadero Technologies, Inc. 100 California Street, 12th Floor San Francisco, CA 94111 U.S.A. All rights

More information

IDOL Site Admin. Software Version: User Guide

IDOL Site Admin. Software Version: User Guide IDOL Site Admin Software Version: 11.5 User Guide Document Release Date: October 2017 Software Release Date: October 2017 Legal notices Warranty The only warranties for Hewlett Packard Enterprise Development

More information

Monitor Qlik Sense sites. Qlik Sense Copyright QlikTech International AB. All rights reserved.

Monitor Qlik Sense sites. Qlik Sense Copyright QlikTech International AB. All rights reserved. Monitor Qlik Sense sites Qlik Sense 2.1.2 Copyright 1993-2015 QlikTech International AB. All rights reserved. Copyright 1993-2015 QlikTech International AB. All rights reserved. Qlik, QlikTech, Qlik Sense,

More information

Client Installation and User's Guide

Client Installation and User's Guide IBM Tivoli Storage Manager FastBack for Workstations Version 7.1 Client Installation and User's Guide SC27-2809-03 IBM Tivoli Storage Manager FastBack for Workstations Version 7.1 Client Installation

More information

Kendo UI. Builder by Progress : Using Kendo UI Designer

Kendo UI. Builder by Progress : Using Kendo UI Designer Kendo UI Builder by Progress : Using Kendo UI Designer Copyright 2017 Telerik AD. All rights reserved. December 2017 Last updated with new content: Version 2.1 Updated: 2017/12/22 3 Copyright 4 Contents

More information

Hands-on Lab Session 9909 Introduction to Application Performance Management: Monitoring. Timothy Burris, Cloud Adoption & Technical Enablement

Hands-on Lab Session 9909 Introduction to Application Performance Management: Monitoring. Timothy Burris, Cloud Adoption & Technical Enablement Hands-on Lab Session 9909 Introduction to Application Performance Management: Monitoring Timothy Burris, Cloud Adoption & Technical Enablement Copyright IBM Corporation 2017 IBM, the IBM logo and ibm.com

More information

IBM i2 Analyst s Notebook Quick Start Guide

IBM i2 Analyst s Notebook Quick Start Guide IBM i2 Analyst s Notebook Quick Start Guide Provided with IBM i2 Analyst s Notebook 8.9 May 202 - - Copyright 0. This edition applies to version 8, release 9 of IBM i2 Analyst s Notebook (product number

More information

Using the VMware vrealize Orchestrator Client

Using the VMware vrealize Orchestrator Client Using the VMware vrealize Orchestrator Client vrealize Orchestrator 7.0 This document supports the version of each product listed and supports all subsequent versions until the document is replaced by

More information

Product Documentation. ER/Studio Portal. User Guide 2nd Edition. Version 2.0 Published January 31, 2013

Product Documentation. ER/Studio Portal. User Guide 2nd Edition. Version 2.0 Published January 31, 2013 Product Documentation ER/Studio Portal User Guide 2nd Edition Version 2.0 Published January 31, 2013 2013 Embarcadero Technologies, Inc. Embarcadero, the Embarcadero Technologies logos, and all other Embarcadero

More information

User s Guide for Software Distribution

User s Guide for Software Distribution IBM Tivoli Configuration Manager User s Guide for Software Distribution Version 4.2.1 SC23-4711-01 IBM Tivoli Configuration Manager User s Guide for Software Distribution Version 4.2.1 SC23-4711-01 Note

More information

Workspace Administrator Help File

Workspace Administrator Help File Workspace Administrator Help File Table of Contents HotDocs Workspace Help File... 1 Getting Started with Workspace... 3 What is HotDocs Workspace?... 3 Getting Started with Workspace... 3 To access Workspace...

More information

Network Problem Resolution Guide

Network Problem Resolution Guide Tivoli Network Manager IP Edition Version 3 Release 8 Network Problem Resolution Guide GC23-9903-02 Tivoli Network Manager IP Edition Version 3 Release 8 Network Problem Resolution Guide GC23-9903-02

More information

Understanding the Relationship with Domain Managers

Understanding the Relationship with Domain Managers 4 CHAPTER Understanding the Relationship with Domain Managers Prime Central for HCS reports the events generated by underlying domain managers. Domain managers may also discover topology and relationships

More information

Embarcadero DB Optimizer 1.0 Evaluation Guide. Published: July 14, 2008

Embarcadero DB Optimizer 1.0 Evaluation Guide. Published: July 14, 2008 Published: July 14, 2008 Embarcadero Technologies, Inc. 100 California Street, 12th Floor San Francisco, CA 94111 U.S.A. This is a preliminary document and may be changed substantially prior to final commercial

More information

Working with Reports

Working with Reports The following topics describe how to work with reports in the Firepower System: Introduction to Reports, page 1 Risk Reports, page 1 Standard Reports, page 2 About Working with Generated Reports, page

More information

SAS Model Manager 2.2. Tutorials

SAS Model Manager 2.2. Tutorials SAS Model Manager 2.2 Tutorials The correct bibliographic citation for this manual is as follows: SAS Institute Inc. 2009. SAS Model Manager 2.2: Tutorials. Cary, NC: SAS Institute Inc. SAS Model Manager

More information

vrealize Operations Manager Customization and Administration Guide vrealize Operations Manager 6.4

vrealize Operations Manager Customization and Administration Guide vrealize Operations Manager 6.4 vrealize Operations Manager Customization and Administration Guide vrealize Operations Manager 6.4 vrealize Operations Manager Customization and Administration Guide You can find the most up-to-date technical

More information

VMware vrealize Operations for Horizon Administration

VMware vrealize Operations for Horizon Administration VMware vrealize Operations for Horizon Administration vrealize Operations for Horizon 6.2 This document supports the version of each product listed and supports all subsequent versions until the document

More information

The following topics describe how to work with reports in the Firepower System:

The following topics describe how to work with reports in the Firepower System: The following topics describe how to work with reports in the Firepower System: Introduction to Reports Introduction to Reports, on page 1 Risk Reports, on page 1 Standard Reports, on page 2 About Working

More information

Network Manager IP Edition Version 3 Release 9. Network Troubleshooting Guide IBM R2E2

Network Manager IP Edition Version 3 Release 9. Network Troubleshooting Guide IBM R2E2 Network Manager IP Edition Version 3 Release 9 Network Troubleshooting Guide IBM R2E2 Network Manager IP Edition Version 3 Release 9 Network Troubleshooting Guide IBM R2E2 Note Before using this information

More information

SAS Publishing SAS. Forecast Studio 1.4. User s Guide

SAS Publishing SAS. Forecast Studio 1.4. User s Guide SAS Publishing SAS User s Guide Forecast Studio 1.4 The correct bibliographic citation for this manual is as follows: SAS Institute Inc. 2006. SAS Forecast Studio 1.4: User s Guide. Cary, NC: SAS Institute

More information

Contents. Common Site Operations. Home actions. Using SharePoint

Contents. Common Site Operations. Home actions. Using SharePoint This is a companion document to About Share-Point. That document describes the features of a SharePoint website in as much detail as possible with an emphasis on the relationships between features. This

More information

A Guide to Quark Author Web Edition 2015

A Guide to Quark Author Web Edition 2015 A Guide to Quark Author Web Edition 2015 CONTENTS Contents Getting Started...4 About Quark Author - Web Edition...4 Smart documents...4 Introduction to the Quark Author - Web Edition User Guide...4 Quark

More information

Version 2 Release 2. IBM i2 Enterprise Insight Analysis Upgrade Guide IBM SC

Version 2 Release 2. IBM i2 Enterprise Insight Analysis Upgrade Guide IBM SC Version 2 Release 2 IBM i2 Enterprise Insight Analysis Upgrade Guide IBM SC27-5091-00 Note Before using this information and the product it supports, read the information in Notices on page 35. This edition

More information

End User s Guide Release 5.0

End User s Guide Release 5.0 [1]Oracle Application Express End User s Guide Release 5.0 E39146-04 August 2015 Oracle Application Express End User's Guide, Release 5.0 E39146-04 Copyright 2012, 2015, Oracle and/or its affiliates. All

More information

Table of Contents 1-4. User Guide 5. Getting Started 6. Report Portal 6. Creating Your First Report Previewing Reports 11-13

Table of Contents 1-4. User Guide 5. Getting Started 6. Report Portal 6. Creating Your First Report Previewing Reports 11-13 Table of Contents Table of Contents 1-4 User Guide 5 Getting Started 6 Report Portal 6 Creating Your First Report 6-11 Previewing Reports 11-13 Previewing Reports in HTML5 Viewer 13-18 Report Concepts

More information

Adlib PDF FileNet Connector Guide PRODUCT VERSION: 5.1

Adlib PDF FileNet Connector Guide PRODUCT VERSION: 5.1 Adlib PDF FileNet Connector Guide PRODUCT VERSION: 5.1 REVISION DATE: January 2014 Copyright 2014 Adlib This manual, and the Adlib products to which it refers, is furnished under license and may be used

More information

SAS Model Manager 2.3

SAS Model Manager 2.3 SAS Model Manager 2.3 Administrator's Guide SAS Documentation The correct bibliographic citation for this manual is as follows: SAS Institute Inc. 2010. SAS Model Manager 2.3: Administrator's Guide. Cary,

More information

Enterprise Data Catalog for Microsoft Azure Tutorial

Enterprise Data Catalog for Microsoft Azure Tutorial Enterprise Data Catalog for Microsoft Azure Tutorial VERSION 10.2 JANUARY 2018 Page 1 of 45 Contents Tutorial Objectives... 4 Enterprise Data Catalog Overview... 5 Overview... 5 Objectives... 5 Enterprise

More information

IBM. Tips and Troubleshooting Guide. IBM Emptoris Contract Management. Version SC

IBM. Tips and Troubleshooting Guide. IBM Emptoris Contract Management. Version SC IBM Emptoris Contract Management IBM Tips and Troubleshooting Guide Version 10.0.4 SC27-5345-03 IBM Emptoris Contract Management IBM Tips and Troubleshooting Guide Version 10.0.4 SC27-5345-03 ii IBM Emptoris

More information

DEPLOYING VMWARE TOOLS USING SCCM USER GUIDE TECHNICAL WHITE PAPER - DECEMBER 2017

DEPLOYING VMWARE TOOLS USING SCCM USER GUIDE TECHNICAL WHITE PAPER - DECEMBER 2017 DEPLOYING VMWARE TOOLS USING SCCM USER GUIDE TECHNICAL WHITE PAPER - DECEMBER 2017 Table of Contents Intended Audience 3 Document conventions 3 Support 3 Deployment Workflow 4 System Requirements 5 Software

More information

EMC SourceOne for Microsoft SharePoint Version 6.7

EMC SourceOne for Microsoft SharePoint Version 6.7 EMC SourceOne for Microsoft SharePoint Version 6.7 Administration Guide P/N 300-012-746 REV A01 EMC Corporation Corporate Headquarters: Hopkinton, MA 01748-9103 1-508-435-1000 www.emc.com Copyright 2011

More information

IBM InfoSphere Information Server Version 8 Release 7. Reporting Guide SC

IBM InfoSphere Information Server Version 8 Release 7. Reporting Guide SC IBM InfoSphere Server Version 8 Release 7 Reporting Guide SC19-3472-00 IBM InfoSphere Server Version 8 Release 7 Reporting Guide SC19-3472-00 Note Before using this information and the product that it

More information

Intelligent Video Analytics V1.5 Version 1 Release 5. Intelligent Video Analytics V1.5 User's Guide

Intelligent Video Analytics V1.5 Version 1 Release 5. Intelligent Video Analytics V1.5 User's Guide Intelligent Video Analytics V1.5 Version 1 Release 5 Intelligent Video Analytics V1.5 User's Guide Intelligent Video Analytics V1.5 Version 1 Release 5 Intelligent Video Analytics V1.5 User's Guide Note

More information

In this lab, you will build and execute a simple message flow. A message flow is like a program but is developed using a visual paradigm.

In this lab, you will build and execute a simple message flow. A message flow is like a program but is developed using a visual paradigm. Lab 1 Getting Started 1.1 Building and Executing a Simple Message Flow In this lab, you will build and execute a simple message flow. A message flow is like a program but is developed using a visual paradigm.

More information

IBM Optim. Edit User Manual. Version7Release3

IBM Optim. Edit User Manual. Version7Release3 IBM Optim Edit User Manual Version7Release3 IBM Optim Edit User Manual Version7Release3 Note Before using this information and the product it supports, read the information in Notices on page 79. Version

More information

Integration Service. Admin Console User Guide. On-Premises

Integration Service. Admin Console User Guide. On-Premises Kony MobileFabric TM Integration Service Admin Console User Guide On-Premises Release 7.3 Document Relevance and Accuracy This document is considered relevant to the Release stated on this title page and

More information

IBM emessage Version 9 Release 1 February 13, User's Guide

IBM emessage Version 9 Release 1 February 13, User's Guide IBM emessage Version 9 Release 1 February 13, 2015 User's Guide Note Before using this information and the product it supports, read the information in Notices on page 471. This edition applies to version

More information

IBM Network Performance Insight Document Revision R2E1. Using Network Performance Insight IBM

IBM Network Performance Insight Document Revision R2E1. Using Network Performance Insight IBM IBM Network Performance Insight 1.2.3 Document Revision R2E1 Using Network Performance Insight IBM Note Before using this information and the product it supports, read the information in Notices on page

More information

SAS IT Resource Management 3.8: Reporting Guide

SAS IT Resource Management 3.8: Reporting Guide SAS IT Resource Management 3.8: Reporting Guide SAS Documentation The correct bibliographic citation for this manual is as follows: SAS Institute Inc. 2017. SAS IT Resource Management 3.8: Reporting Guide.

More information

Getting Started Guide. ProClarity Analytics Platform 6. ProClarity Professional

Getting Started Guide. ProClarity Analytics Platform 6. ProClarity Professional ProClarity Analytics Platform 6 ProClarity Professional Note about printing this PDF manual: For best quality printing results, please print from the version 6.0 Adobe Reader. Getting Started Guide Acknowledgements

More information

Task Management User Guide

Task Management User Guide Task Management User Guide Version 18 April 2018 Contents About This Guide... 5 Tasks Overview... 5 Create a Project for Task Management... 5 Project Templates Overview... 5 Add a Project Template...

More information

SAS. Information Map Studio 3.1: Creating Your First Information Map

SAS. Information Map Studio 3.1: Creating Your First Information Map SAS Information Map Studio 3.1: Creating Your First Information Map The correct bibliographic citation for this manual is as follows: SAS Institute Inc. 2006. SAS Information Map Studio 3.1: Creating Your

More information

Intellicus Enterprise Reporting and BI Platform

Intellicus Enterprise Reporting and BI Platform Designing Adhoc Reports Intellicus Enterprise Reporting and BI Platform Intellicus Technologies info@intellicus.com www.intellicus.com Designing Adhoc Reports i Copyright 2012 Intellicus Technologies This

More information

Veritas NetBackup OpsCenter Reporting Guide. Release 8.0

Veritas NetBackup OpsCenter Reporting Guide. Release 8.0 Veritas NetBackup OpsCenter Reporting Guide Release 8.0 Veritas NetBackup OpsCenter Reporting Guide Legal Notice Copyright 2016 Veritas Technologies LLC. All rights reserved. Veritas and the Veritas Logo

More information

SAP BusinessObjects Integration Option for Microsoft SharePoint Getting Started Guide

SAP BusinessObjects Integration Option for Microsoft SharePoint Getting Started Guide SAP BusinessObjects Integration Option for Microsoft SharePoint Getting Started Guide SAP BusinessObjects XI3.1 Service Pack 4 Copyright 2011 SAP AG. All rights reserved.sap, R/3, SAP NetWeaver, Duet,

More information

Embarcadero PowerSQL 1.1 Evaluation Guide. Published: July 14, 2008

Embarcadero PowerSQL 1.1 Evaluation Guide. Published: July 14, 2008 Embarcadero PowerSQL 1.1 Evaluation Guide Published: July 14, 2008 Contents INTRODUCTION TO POWERSQL... 3 Product Benefits... 3 Product Benefits... 3 Product Benefits... 3 ABOUT THIS EVALUATION GUIDE...

More information

Oracle is a registered trademark of Oracle Corporation and/or its affiliates. Other names may be trademarks of their respective owners.

Oracle is a registered trademark of Oracle Corporation and/or its affiliates. Other names may be trademarks of their respective owners. Primavera Portfolio Management 9.0 What s New Copyright 1999-2011, Oracle and/or its affiliates. The Programs (which include both the software and documentation) contain proprietary information; they are

More information

Table of Contents HOL-SDC-1635

Table of Contents HOL-SDC-1635 Table of Contents Lab Overview - - vrealize Log Insight... 2 Lab Guidance... 3 Module 1 - Log Management with vrealize Log Insight - (45 Minutes)... 7 Overview of vrealize Log Insight... 8 Video Overview

More information

WebSphere Commerce Enterprise Commerce Professional

WebSphere Commerce Enterprise Commerce Professional WebSphere Commerce Enterprise Commerce Professional Version 6.0 Installation Guide for Linux GC10-4258-06 WebSphere Commerce Enterprise Commerce Professional Version 6.0 Installation Guide for Linux GC10-4258-06

More information

Managing Your Website with Convert Community. My MU Health and My MU Health Nursing

Managing Your Website with Convert Community. My MU Health and My MU Health Nursing Managing Your Website with Convert Community My MU Health and My MU Health Nursing Managing Your Website with Convert Community LOGGING IN... 4 LOG IN TO CONVERT COMMUNITY... 4 LOG OFF CORRECTLY... 4 GETTING

More information

Adlib PDF FileNet Connector Guide PRODUCT VERSION: 5.3

Adlib PDF FileNet Connector Guide PRODUCT VERSION: 5.3 Adlib PDF FileNet Connector Guide PRODUCT VERSION: 5.3 REVISION DATE: June 2015 Copyright 2015 Adlib This manual, and the Adlib products to which it refers, is furnished under license and may be used or

More information

Version Installation and User Guide

Version Installation and User Guide IBM Cognos 8 Business Intelligence Map Manager Version 8.4.1 Installation and User Guide Product Information This document applies to IBM Cognos 8 Version 8.4.1 and may also apply to subsequent releases.

More information

Client Installation and User's Guide

Client Installation and User's Guide IBM Tivoli Storage Manager FastBack for Workstations Version 7.1.1 Client Installation and User's Guide SC27-2809-04 IBM Tivoli Storage Manager FastBack for Workstations Version 7.1.1 Client Installation

More information

Vizit Essential for SharePoint 2013 Version 6.x User Manual

Vizit Essential for SharePoint 2013 Version 6.x User Manual Vizit Essential for SharePoint 2013 Version 6.x User Manual 1 Vizit Essential... 3 Deployment Options... 3 SharePoint 2013 Document Libraries... 3 SharePoint 2013 Search Results... 4 Vizit Essential Pop-Up

More information

IBM FileNet Business Process Framework Version 4.1. Explorer Handbook GC

IBM FileNet Business Process Framework Version 4.1. Explorer Handbook GC IBM FileNet Business Process Framework Version 4.1 Explorer Handbook GC31-5515-06 IBM FileNet Business Process Framework Version 4.1 Explorer Handbook GC31-5515-06 Note Before using this information and

More information

COGNOS (R) ENTERPRISE BI SERIES COGNOS REPORTNET (TM)

COGNOS (R) ENTERPRISE BI SERIES COGNOS REPORTNET (TM) COGNOS (R) ENTERPRISE BI SERIES COGNOS REPORTNET (TM) GETTING STARTED Cognos ReportNet Getting Started 07-05-2004 Cognos ReportNet 1.1MR1 Type the text for the HTML TOC entry Type the text for the HTML

More information

Using vrealize Log Insight

Using vrealize Log Insight vrealize Log Insight 4.3 This document supports the version of each product listed and supports all subsequent versions until the document is replaced by a new edition. To check for more recent editions

More information

User Guide. Web Intelligence Rich Client. Business Objects 4.1

User Guide. Web Intelligence Rich Client. Business Objects 4.1 User Guide Web Intelligence Rich Client Business Objects 4.1 2 P a g e Web Intelligence 4.1 User Guide Web Intelligence 4.1 User Guide Contents Getting Started in Web Intelligence 4.1... 5 Log into EDDIE...

More information