
SERENA SOFTWARE

Serena Business Manager 11.1 vs. 11.0.1: Comparative Performance Test Results

Author: RT Tangri
2016-08-11

Table of Contents

Who Should Read This Paper?
Test Methodology
Runtime Test Architecture
200 Virtual Users Load Test Scenario
80 Virtual Users Load Test Scenario
About the Data Set
Performance Test Results
Additional Information
Reference

Who Should Read This Paper?

This paper is intended for system administrators and others interested in comparing the performance test results for Serena Business Manager (SBM) 11.1 with SBM 11.0.1. It discusses the results of performance tests run with 200 virtual users (scaled to 80 virtual users for the SBM Orchestration Engine) against the on-premise version of Serena Work Center. The performance tests were executed on Windows 2008 R2 and the following databases:

SQL Server 2008 R2
SQL Server 2014 SP1
Oracle 11g R2
Oracle 12c

Test Methodology

Testing was conducted in a private enterprise performance testing lab using HP LoadRunner 12.00. The tests measured application response time, throughput, and system resource usage under a load of 200 virtual users against a large enterprise data set, with a think-time pause of 10 seconds between transactions. LoadRunner generated load by creating virtual users (one unique virtual user was added every 5 seconds until the desired load level was reached) to model user activity. Each virtual user performed scripted tasks and sent crafted HTTP requests. LoadRunner used a threading execution model to create multiple instances of unique virtual users, producing load and concurrency on the SBM application. During this test, the 200 virtual users iterated the set of use cases (transactions) in the performance suite. The scripts and the data sets used in this testing can be provided for reference upon request.

Runtime Test Architecture

The performance test was performed in a distributed SBM installation using 5 separate 64-bit servers. The environment included a dedicated Serena License Manager version 2.2.0 installation with 80,000 seat licenses.

Server 1: SBM Application Engine Web Server (Win2008R2-IIS 7 or Win2012R2-IIS 8.5), with 8GB memory
Server 2: SBM Tomcat7-1: Single Sign-On (SSO), SBM Mail Services, and SBM Orchestration Engine, with 8GB total and 4GB memory allocated to Tomcat7
Server 3: SBM Common Services Tomcat7-2 (Smart Search, Resource Mgmt, Relationship Service, SSF Services, SLS, SBMProxy, AuthInjector), with 8GB total and 4GB memory allocated to Tomcat7
Server 4: SQL Server 2008 R2, SQL Server 2014 SP1, or Oracle 11g R2 database, with 16GB of total memory
Server 5: SBM Logging Services (Mongod64), with 8GB of total memory

A SOAPUI 5.2.0 XML REST service (http://10.248.50.1:8091/resource) ran with 8GB memory.

200 Virtual Users Load Test Scenario

Load tests simulated a scaling number of common tasks performed by 200 virtual users. Using a standard defect tracking application that enables a team to track product defects, the tests replicated the life cycle of defects from submittal through approval and assignment. Notes and attachments were added to each item as it moved through the process. The Notification Server sent e-mails via SMTP to a test SMTP mail server repository. Shared views of activity, backlogs, calendar feeds, and shared dashboard views of multi-view reports were run against the system.

Tests were performed by a set number of 200 unique virtual users over a 20-minute baseline. The think-time between each use case (LoadRunner transaction) was 10 seconds, with exceptions for transactions from Serena Work Center: a 60-second think-time before indexer transactions and a 90-second think-time before notification transactions. The performance workflow consisted of the following actions:

LoadRunner virtual users ran in multiple threads of execution
200 unique virtual users logged in via SSO to Serena Work Center in vuser_init
Each virtual user repeatedly iterated the defect tracking application use cases
200 unique virtual users logged off via SSO from Serena Work Center in vuser_end

SERENA WORK CENTER (SBM APPLICATION ENGINE) USE CASES

Serena Work Center use cases run during virtual user initialization against a large enterprise data set:

Web SSO Login
VUser Init Notification Remove All
Settings Save Preferred Project
Settings Save Items Per Page 40 And Notification Poll Interval 90 Second
SWC Home Add Report Widget (Of Multi View Report #1)
SWC Home Add Activity Widget (My Activity)

Serena Work Center use cases run during virtual user exit against a large enterprise data set:

SWC Home Delete Report Widget (Of Multi View Report #1)
SWC Home Delete Activity Widget (My Activity)
VUser End Notification Remove All
Web SSO Logout

Serena Work Center use cases iterated by each virtual user against a large enterprise data set:

Pin Up Defect Tracking (Added in SBM 11.1)
Select Defect Tracking App (Added in SBM 11.1)
Submit-Form
Submit-Form OK (Submit Item)
Transition Owner In
Defect Tracking Activities Side Bar
Defect Tracking My Activity After Owner IN
Defect Tracking Shared Activity Feed View After Owner IN
Defect Tracking Calendars Side Bar
Defect Tracking My Calendar
Defect Tracking Shared Calendar Feed View After Owner IN
Transition Owner Out
Transition Owner Out Ok
Transition Secondary Owner In
My Activity After Secondary Owner In
Shared Activity View All Items I Own Primary and Secondary
Transition Secondary Owner Out
Transition Secondary Owner Out Ok
Transition REST GRID WIDGET (REST Grid Widget is inserted as a Regular Transition in AE Form)
Transition REST GRID WIDGET OK (calls XML REST Mock Service to exercise AuthInjector component via SBMProxy in CommonServices Tomcat7)
Transition-SyncTest (makes Synchronous AE-WebServices call)
AddNote
Add Attachment In Note (27KB JPEG File From Disk)
AddNote-Ok (Text with HTML5 Tags Includes 27KB JPG + 2KB Note)
AddAttachment
AddAttachment-OK (11KB TXT File From Disk)
Email_Link
Email_Link SearchUser
Email_Link-OK (Message with HTML5 Tags)
Transition Assign To CCB
Transition Assign To CCB OK
Transition Assign To Area Owner
Transition Assign To Area Owner OK
Social View Of BUG
BUGID Lucene Text Search
BUGStar Lucene Text Search
BUGIDStar Lucene Text Search
Attachment Lucene Text Search
TitleDescription Lucene Text Search
TitleDescriptionStar Lucene Text Search
Submitter Lucene Text Search
User Profile Card ContactCard AD
Client Logging
Add then Delete From Favorites
PinUp Defect Tracking (Application) (Removed in SBM 11.1)
Select Defect Tracking PinUp (Removed in SBM 11.1)
Defect Tracking Manage Views (Removed in SBM 11.1)
Defect Tracking Dashboards Side Bar (Added in SBM 11.1)
Defect Tracking My Dashboard
Defect Tracking Shared Dashboard View (Of Multi View Report #1)
Defect Tracking Backlogs Side Bar
Defect Tracking Shared Backlog Feed View
Report Center (Removed in SBM 11.1)
All Reports Tab (Added in SBM 11.1)
CTOSDS Elapsed Time Duration
CTOSDS Time in State Duration
CTOSDS Average Time To State Duration
CTOSDS Open And Completed Trend
CTOSDS Entering A State Trend
CTOSDS State Activity Trend
Multi-View Report 1
Static Listing For Performance EG
Static Listing For Multi-View Performance
All Issues By Project and State
All Issues by Issue Type Listing Report
All Open Issues
My Open Issues
All Issues by Project and Issue Type
All Issues by State
All Issues by Owner
Notification UI View All
Notification UI MarkRead and Remove N (For Each Notification Object)
Each Iteration Notification Remove All

80 Virtual Users Load Test Scenario

The SBM Orchestration Engine load test simulated a repeated, scaling number of synchronous and asynchronous orchestration internal events initiated by virtual users as they submitted items into a Complex Load OE application. During Submit Item, the orchestration was initiated as a transition that repeatedly updated items with orchestration results by making the following AE Web Service calls:

AEWebServices71.GetItemsByQuery
AEWebServices71.UpdateItem
AEWebServices71.UpdateItemWithName

Tests were performed by a set number of unique virtual users over a 20-minute baseline. The think-time between virtual user clicks was 10 seconds. To test scalability, a set of up to 80 unique virtual users logged in to the SBM User Workspace, submitted an item into the system to initiate the orchestrations, and then logged out of the system. The performance workflow consisted of the following actions:

LoadRunner virtual users ran in multiple threads of execution
80 unique virtual users logged in via SSO to Serena Work Center in vuser_init
Each virtual user repeatedly iterated the use cases (each is a LoadRunner "Transaction")
80 unique virtual users logged off via SSO from Serena Work Center in vuser_end

SERENA WORK CENTER (SBM APPLICATION ENGINE, ORCHESTRATION ENGINE) USE CASES

Serena Work Center use cases run during virtual user initialization against a large enterprise data set:

Web SSO Login
VUser Init Notification Remove All
Select LoadOE App
PinUp LoadOE App

Serena Work Center use cases run during virtual user exit against a large enterprise data set:

VUser End Notification Remove All
Web SSO Logout

Serena Work Center use cases iterated by each virtual user against a large enterprise data set:

LoadOE Submit Form
LoadOE Complex Submit OK (Initiates Synchronous and Asynchronous Orchestrations as Transition to Submit Form)

About the Data Set

The large enterprise data set that was used for testing is summarized below.

SBM Entity                                                      Count
Workflows (define the process that items follow)                31
Projects (store process items, such as issues and incidents)    5,302
Users                                                           21,100
SBM groups                                                      138
Folders (store personal and group favorites)                    147,570
Issues (process items that follow a workflow)                   1,130,000
Contacts (similar to address book entries)                      20,053
Resources                                                       24,249
Attachments                                                     36,960

Note: In addition, the database included the following configuration settings:

processes = 300 (newer versions of Application Engine require more sessions per user to power Work Center)
READ_COMMITTED_SNAPSHOT = OFF
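For reference, these two settings correspond to statements like the following; this is a minimal sketch, and the SQL Server database name is an illustrative placeholder, not taken from the test environment:

    -- SQL Server: disable read-committed snapshot isolation for the SBM
    -- database (the database name sbm_perf is a hypothetical placeholder)
    ALTER DATABASE sbm_perf SET READ_COMMITTED_SNAPSHOT OFF;

    -- Oracle: raise the process limit; written to the SPFILE, so it takes
    -- effect after the instance restarts
    ALTER SYSTEM SET processes = 300 SCOPE = SPFILE;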

Performance Test Results

The following graphs summarize the results from the performance tests.

AVERAGE TRANSACTION RESPONSE TIME

The response times for individual transactions are summarized below. Note the following:

SQL Server 2014 SP1 provides the best performance.
No significant performance impact was observed when the Rich Text Editor was enabled for Memo/Note fields in submit and transition forms.


ORCHESTRATION ENGINE TRANSACTION RESPONSE TIME

The response times for individual transactions are summarized below. Note that Oracle 11g R2 provides the best performance for Load OE Complex Submit OK.


TRANSACTION SUMMARY

The number of transactions iterated by 200 virtual users is summarized below. Note that with changes to Work Center in 11.1, more transactions were needed to accomplish the same UI task. Comparing transactions iterated between SBM 11.1 and SBM 11.0.1:

5% fewer transactions iterated with SQL Server 2008 R2 in SBM 11.1
5% fewer transactions iterated with SQL Server 2014 SP1 in SBM 11.1
5% fewer transactions iterated with Oracle 11g R2 in SBM 11.1


ORCHESTRATION ENGINE TRANSACTION SUMMARY

The number of transactions iterated by 80 virtual users during the Complex Load OE baseline is summarized below. Comparing transactions iterated between SBM 11.1 and SBM 11.0.1:

5% more transactions iterated with SQL Server 2008 R2 in SBM 11.1
Slightly fewer transactions iterated with SQL Server 2014 SP1 in SBM 11.1
5% fewer transactions iterated with Oracle 11g R2 in SBM 11.1


WEB SERVER CPU USAGE

Web server CPU usage is summarized below. Note that, even with 5% fewer transactions executed in SBM 11.1, the 10% improvement in web server CPU usage was the result of Application Engine improvements made in SBM 11.1.


ORCHESTRATION ENGINE WEB SERVER CPU USAGE

Orchestration Engine web server CPU usage is summarized below. The improvement of at least 10% in web server CPU usage in SBM 11.1 was the result of Application Engine and ODE performance improvements made in SBM 11.1.


WEB SERVER MEMORY USAGE

Web server memory usage is summarized below. Note that, even with 5% fewer transactions executed in SBM 11.1, the 25% improvement in web server memory usage was the result of Application Engine performance improvements made in SBM 11.1.


ORCHESTRATION ENGINE WEB SERVER MEMORY USAGE

Orchestration Engine web server memory usage is summarized below. The 25% improvement in web server memory usage was the result of Application Engine and ODE performance improvements made in SBM 11.1.


WEB SERVER HANDLE COUNT

The web server handle count is summarized below. Note that, even with 5% fewer transactions executed in SBM 11.1, the improvement in the web server handle count (especially with SQL Server 2014 and Oracle) was the result of Application Engine performance improvements made in SBM 11.1.


ORCHESTRATION ENGINE WEB SERVER HANDLE COUNT

The Orchestration Engine web server handle count is summarized below. The 10% improvement in the Orchestration Engine web server handle count in SBM 11.1 was the result of Application Engine and ODE performance improvements made in SBM 11.1.


WEB SERVER VIRTUAL BYTES

Web server virtual bytes are summarized below. Note that, even with 5% fewer transactions executed in SBM 11.1, the 10% improvement in web server virtual bytes was the result of Application Engine performance improvements made in SBM 11.1.


COMMON SERVICES CPU USAGE

Common Services Tomcat 7 CPU usage is summarized below. Note that Oracle 11g R2 provides the best performance. The 20% higher Common Services CPU usage in SBM 11.1 was the result of Tomcat 7 Ehcache and Lucene SBM Search cache changes. The spike in CPU usage at the beginning of the 200 virtual user baseline was the result of the Common Services caches being populated.


COMMON SERVICES MEMORY USAGE

Common Services Tomcat 7 memory usage is summarized below. Note that Oracle 11g R2 provides the best performance. The higher Common Services memory usage was the result of Tomcat 7 Ehcache and Lucene SBM Search cache changes.


NOTIFICATION SERVER CPU USAGE

Notification Server CPU usage is summarized below. Note that SQL Server 2014 SP1 provides the best performance. The 25% improvement in Notification Server CPU usage was the result of Notification Server performance improvements made in SBM 11.1. The spike in CPU usage at the beginning of the 200 virtual user baseline was the result of the Notification Server caches being populated.


ORCHESTRATION ENGINE ODE CPU USAGE

ODE Tomcat 7 CPU usage is summarized below. Note that SQL Server 2014 SP1 provides the best performance.


NOTIFICATION SERVER MEMORY USAGE

Notification Server Tomcat 7 memory usage is summarized below. Note that Oracle 11g R2 provides the best performance. The improvement of at least 10% in Notification Server memory usage was the result of Notification Server performance improvements made in SBM 11.1 and the Concurrent Mark Sweep garbage collection policy.


ORCHESTRATION ENGINE ODE MEMORY USAGE

ODE Tomcat 7 memory usage is summarized below. Note that SQL Server 2014 SP1 provides the best performance. The improvement of at least 25% in ODE memory usage was the result of ODE performance improvements made in SBM 11.1 and the Concurrent Mark Sweep garbage collection policy.


DATABASE SERVER CPU USAGE

Database server CPU usage is summarized below. Note that SQL Server 2008 R2 provides the best performance. The lower database CPU usage for the 200 virtual user baseline was the result of the performance improvements made in SBM 11.1.


ORCHESTRATION ENGINE DATABASE SERVER CPU USAGE

Orchestration Engine database server CPU usage is summarized below. Note that SQL Server 2014 SP1 provides the best performance. The improvement of at least 10% in Orchestration Engine SQL Server or Oracle CPU usage was the result of Application Engine and ODE performance improvements made in SBM 11.1.


DATABASE SERVER MEMORY USAGE

Database server memory usage is summarized below. Note that SQL Server 2014 SP1 provides the best performance. The lower database server memory usage for the 200 virtual user baseline was the result of the performance improvements made in SBM 11.1.

Note: The SQL Server database server memory usage appears low in the Windows counters because SQL Server breaks down its memory usage across Windows components.
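One way to see SQL Server's actual memory consumption despite this reporting quirk is to query the server's own process-memory view; this is a general SQL Server technique (available in SQL Server 2008 and later), not necessarily how memory was measured in these tests:

    -- Report SQL Server's own view of its process memory
    -- (general technique; not necessarily the measurement used in these tests)
    SELECT physical_memory_in_use_kb,
           memory_utilization_percentage
    FROM sys.dm_os_process_memory;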


ORCHESTRATION ENGINE DATABASE SERVER MEMORY USAGE

Database server memory usage is summarized below. Note that SQL Server 2008 R2 provides the best performance. The lower database server memory usage for the 80 virtual user baseline was the result of the performance improvements made in SBM 11.1.

Note: The SQL Server database server memory usage appears low in the Windows counters because SQL Server breaks down its memory usage across Windows components.


MONGOD64 ON SEPARATE SERVER

Screenshots for Mongod64 CPU usage and Mongod64 memory usage are not included. At the INFO logging level, Mongod64 CPU usage is around 1% and Mongod64 memory usage is up to 700 MB for both SBM 11.1 and SBM 11.0.1 in the 200 virtual user (SBM) and 80 virtual user (Orchestration Engine) performance baselines.

TOMCAT 7 ODE EVENT PROCESSING PERFORMANCE

The events processed by ODE with SUCCESS status are summarized below:

Version      Database              Events Processed   Elapsed Time
SBM 11.1     SQL Server 2008 R2    11,553             30 min 44 sec
SBM 11.0.1   SQL Server 2008 R2    11,494             30 min 40 sec
SBM 11.1     SQL Server 2014 SP1   11,498             30 min 52 sec
SBM 11.0.1   SQL Server 2014 SP1   11,524             30 min 55 sec
SBM 11.1     Oracle                11,541             28 min 33 sec
SBM 11.0.1   Oracle                11,522             30 min 38 sec

RELATIONSHIP SERVICE PERFORMANCE

Results for one Neo4J instance (not clustered), with a Tomcat 7 3GB heap, a Neo4J 1GB heap, and the default 10 worker threads, are summarized below. Neo4J Java CPU usage peaked at 25% during the initial load (the most CPU-intensive period). The initial load totaled 1.2 million records (ExportToCSV + ImportToGraphDB) and took 108 minutes to finish. Replication/synchronization of new records in the TS_CHANGEACTIONS table was processed at a rate of 2,400 records per minute. For 1,137,446 records (and 14,118,424 relationships), Neo4J database disk usage was 3.9GB.

SMART SEARCH INDEXER PERFORMANCE

Results for the Smart Search indexer (Lucene) with a 4 GB Common Services Java heap are summarized below. Initial index or re-index operations are now multi-threaded: 6 worker threads of execution quickly processed all primary tables sequentially. After the initial index or re-index finished, the indexer polled the database every 30 seconds (by default) to update the index with new records from the TS_CHANGEACTIONS table, which then became searchable via Serena Work Center. The initial index or re-index of 1,102,276 items (with an associated 24,249 resources and 36,960 attachments) in the TTT_ISSUES table took no more than 39 minutes to complete (the indexdir folder size was 230MB). The indexer finished the update of 46,357 change items in 17 minutes and 45 seconds.

Additional Information

Note the following performance testing information.

CONFIGURATION NOTES

For details on preventing file locks by anti-virus software on the Tomcat 7 server file system, see D22079 in the Serena knowledge base.

You can improve Oracle 12c back-end performance by (a brief sketch follows these notes):

Defining the correct cache size for the Automatic Memory Manager
Enabling Automatic Storage Manager by defining a file system and volume manager optimized for Oracle database files
Enabling Oracle Data Partitioning

If TRACE level logging in Active Diagnostics is enabled for debugging SBM components on a 64-bit system with 8GB RAM, the mongod64 process will compete for memory with SBM Common Services and prevent Tomcat from using the 4GB heap that is allocated to it.
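As an illustration of the Automatic Memory Manager item in the Oracle 12c list above, Automatic Memory Management is typically enabled by setting a total memory target; this is a minimal sketch, and the 12G value is an assumption for a 16GB database server, not the tested configuration:

    -- Illustrative only: let Oracle Automatic Memory Management size its
    -- caches within a total target (the value is an assumption, not the
    -- lab setting); written to the SPFILE, so it requires a restart
    ALTER SYSTEM SET memory_target = 12G SCOPE = SPFILE;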

The default performance settings in SBM Configurator are used for load tests:

Notification Server ForceSend = OFF
Throttling of Orchestrations = OFF

On the server that hosts SBM Common Services, the indexer settings are changed for load tests in SBM Configurator:

Expected Search Usage = High
Database Polling Interval = 30 seconds

Serena recommends that you optimize table statistics for large SBM tables with an Oracle database (this is not required for SQL Server):

exec DBMS_STATS.GATHER_TABLE_STATS (ownname => 'perf', tabname => 'TS_PROJECTANCESTRYDENORM', cascade => true, estimate_percent => 100, granularity => 'ALL', degree => 8);
exec DBMS_STATS.GATHER_TABLE_STATS (ownname => 'perf', tabname => 'TS_TIMEINSTATE', cascade => true, estimate_percent => 100, granularity => 'ALL', degree => 8);
exec DBMS_STATS.GATHER_TABLE_STATS (ownname => 'perf', tabname => 'TS_TIMEZONEGMTOFFSETDENORM', cascade => true, estimate_percent => 100, granularity => 'ALL', degree => 8);
exec DBMS_STATS.GATHER_TABLE_STATS (ownname => 'perf', tabname => 'TS_CALENDARS', cascade => true, estimate_percent => 100, granularity => 'ALL', degree => 8);
exec DBMS_STATS.GATHER_TABLE_STATS (ownname => 'perf', tabname => 'TS_STATES', cascade => true, estimate_percent => 100, granularity => 'ALL', degree => 8);
exec DBMS_STATS.GATHER_TABLE_STATS (ownname => 'perf', tabname => 'TS_CHANGEACTIONS', cascade => true, estimate_percent => 10, granularity => 'ALL', degree => 8);
exec DBMS_STATS.GATHER_TABLE_STATS (ownname => 'perf', tabname => 'TTT_ISSUES', cascade => true, estimate_percent => 10, granularity => 'ALL', degree => 8);

Oracle ODBC client connection to Oracle server configuration tips: The number of sessions used also depends on the type of ODBC connection that Application Engine makes as a client to the Oracle server. An ODBC connection as Client10 (SSL) or via Oracle Advanced Security requires an increase in processes/sessions on the Oracle server. To avoid "server refused the connection" errors in the Application Engine Event Viewer and JDBC connection exceptions in the Tomcat 7 server.log in the performance lab at 200vu load against Work Center, increasing the number of processes to 300 and sessions to 400 in Oracle is sufficient:

sqlplus system/testauto@hpq5
> alter system set processes=300 scope=spfile;

Verify the change:

sqlplus system/testaut0@hpq5
> show parameters processes;
> show parameters sessions;

SBM 11.1 PERFORMANCE DEFECTS

The following defect has been submitted against the GA build of SBM 11.1:

DEF286744 SWC PERF 200vu: Submitter Lucene Keyword Search Performance Has Regressed In SBM 11.1

PERFORMANCE TEST EXCLUSIONS

The following items were excluded from performance testing:

System report transactions were not included in the 200vu load test.
New SBM 11.1 feature transactions were not included in this comparison.

Reference

ABOUT SERENA SOFTWARE

Serena Software provides orchestrated application development and release management solutions to the Global 2000. Our 2,500 active enterprise customers, including a majority of the Fortune 100, have made Serena the largest independent Application Lifecycle Management (ALM) vendor and the only one that orchestrates DevOps, the processes that bring together application development and operations. Headquartered in Silicon Valley, Serena serves enterprise customers across the globe. Serena is a portfolio company of HGGC, a leading middle market private equity firm.

CONTACT

Web site: http://www.serena.com/company/contact-us.html

Copyright 2016 Serena Software, Inc. All rights reserved. Serena, TeamTrack, StarTool, PVCS, Comparex, Dimensions, Prototype Composer, Mariner, and ChangeMan are registered trademarks of Serena Software, Inc. The Serena logo and Version Manager are trademarks of Serena Software, Inc. All other products or company names are used for identification purposes only, and may be trademarks of their respective owners.