Key Performance Indicators


USER DESCRIPTION 37/1553-HSC 105 50/1 Uen G

Copyright © Ericsson AB 2009-2011. All rights reserved. No part of this document may be reproduced in any form without the written permission of the copyright owner.

Disclaimer: The contents of this document are subject to revision without notice due to continued progress in methodology, design, and manufacturing. Ericsson shall have no liability for any error or damage of any kind resulting from the use of this document.

Trademark List: All trademarks mentioned herein are the property of their respective owners. These are shown in the document Trademark Information.

Contents

1 Introduction
1.1 Concepts
1.2 Dependencies and Associated Features
2 Key Performance Indicators
2.1 Accessibility
2.2 Retainability
2.3 Integrity
2.4 Mobility
2.5 Availability

1 Introduction

This document describes the Key Performance Indicators (KPIs) for the LTE Radio Access Network (RAN) used to measure the contribution to subscriber-perceived quality and system performance. Monitoring RAN performance is an important task for operation and maintenance personnel, network engineers, and management. KPIs can be used for the following tasks:

- Monitoring and optimizing radio network performance to provide better subscriber-perceived quality or better use of installed resources
- Rapidly detecting unacceptable performance in the network, enabling the operator to take immediate action to preserve the quality of the network
- Providing radio network planners with the detailed information required for dimensioning and configuring the network for optimal use
- Troubleshooting on cell clusters of interest

The information in this document reflects the KPIs at the time of its release.

Note: In the Managed Object Model (MOM) RBS, a few MO classes, parameters, and counters are listed because system design considers future aspects. These MOs, parameters, and counters are neither relevant nor supported in the current release. For information about unsupported MO classes, parameters, and counters, see Parameter and Counter Limitations.

1.1 Concepts

Concepts related to observability are described in the following sections.

1.1.1 LTE RAN Contribution to End-User Performance

It is important to understand that in the LTE RAN, KPI definitions are limited to the end-user performance of which the network is aware. An LTE RAN KPI related to end-user impact is defined to measure the contribution known to the LTE RAN. The performance of end-user applications covers a broader area than the LTE RAN part, as illustrated in the following figure:

Figure 1 LTE RAN as Part of Chain to Deliver Packets between Entities on the Application Layer (application layer path: UE - RBS - TN - SGW - TN - PGW - Server)

Abbreviations used in the figure:

PGW  Packet Data Network Gateway
SGW  Serving Gateway
TN   Transport Network
UE   User Equipment

1.1.2 Observability in Ericsson LTE

Observability covers all functions in LTE RAN that monitor and analyze the performance and characteristics of the network. This can be done on various levels with different target groups and requirements. The figure below illustrates a model for observability in LTE RAN as a top-down approach: the Key Performance Indicator (KPI) level reflects end-user perception, the Performance Indicator (PI) level reflects system characteristics, and the Procedure level sits below both.

Figure 2 Top-Down Approach in LTE RAN Observability

The model shows different levels of observability targeting different purposes:

Table 1 Levels of Observability

KPI: The KPI represents the end-user perception of a network on a macro level. KPIs are of interest for an operator's top-level management. KPI statistics are typically used to benchmark networks against each other and to detect problem areas. KPIs are calculated from Performance Management (PM) counters. The reliability, granularity, and accuracy of the data are critical, and the data is collected continuously.

PI: The Performance Indicator (PI) normally represents information at the system level that explains the KPI results. Many PIs can be based on PM counters; for the available PM counters for Root Cause Analysis, see Managed Object Model RBS. The PI can also be in the form of metrics that show how specific parts of a system perform. PIs do not necessarily have an impact on KPIs. PI data can be used for planning and dimensioning. This data, typically PM counters, is normally collected continuously.

Procedure: The Procedure level represents in-depth troubleshooting and measurement of system characteristics. It involves measurements on signalling, procedure, and function levels to investigate problems detected at higher levels. The amount of data at the Procedure level is enormous, so the measurements are generally user-initiated for a specific purpose and area of the network to limit the scope of the measurements. The typical source for this data is the User Equipment (UE) trace and Cell trace recording functions. For further information about UE trace and Cell trace, see the document Performance Management.

1.1.3 ITU-T QoS Model

The International Telecommunication Union - Telecommunication (ITU-T) has described a general model for Quality of Service (QoS) from an end-user perspective to use in mobile networks.

The QoS categories for Serveability are:

Accessibility: The ability of a service to be obtained, within specified tolerances and other given conditions, when requested by the end user.

Retainability: The probability that a service, once obtained, continues to be provided under given conditions for a given time duration.

Integrity: The degree to which a service is provided without excessive impairments, once obtained. Service Integrity represents the quality experienced by the end user during the call or session.

1.1.4 LTE RAN Performance Observability Model

The LTE Radio Access Network (RAN) Performance Observability model used by Ericsson combines the ITU-T QoS categories of Accessibility, Retainability, and Integrity with Mobility, Availability, and Utilization, described as follows:

Mobility: The ability of the system to allow movement within the LTE RAN.

Availability: The ability of an item to be in a state to perform a required function at a given instant of time within a given time interval, assuming that the external resources, if required, are provided.

Utilization: Describes the network use by measuring traffic level and capacity resource management, including congestion, admission and load control, and license use. Utilization information is required as input to network planning. For formulas to calculate utilization, see License and Resource Use Indicators.

The ITU-T specification focuses on Circuit Switched (CS) calls, that is, one call per end user and one service only. Telecommunications have evolved past pure CS connections, and in LTE all sessions are Packet Switched (PS). The LTE network supports multiple simultaneous services per user, so the ITU-T definitions should be used as guidelines rather than rules for performance measurement. KPIs are developed for observing the network performance impact on the end user, and for observing the performance of the network itself. Each KPI is defined for observing either end-user impact or system performance.
1.1.5 Aggregation of KPIs

For KPIs, the majority of the equations are focused on the cell level. The operator may choose to aggregate the counters over a group of cells and use the same equations to calculate the KPI over a cluster of cells, one RBS, or multiple RBSs. A counter pmXyz(EUtranCellFDD) in an equation can be replaced as follows to obtain a metric over a larger area:

pmXyz(Group) = Σ_(EUtranCellFDD ∈ Group) pmXyz(EUtranCellFDD)

Equation 1 Formula for Aggregation of Metrics to Clusters of Cells

For most formulas the Σ is left out; it is used only in cases where it serves to clarify. The same aggregation is performed over time. Each counter is reported every 15 minutes. The operator may aggregate the counters over one hour, 24 hours, or another defined time period. In that case, the operator can replace the counter pmXyz(EUtranCellFDD) with its time-aggregated sum.

When a formula involves a sum of fractions, the formula yields an invalid result only when the results of all fractions involved are invalid. When fewer than all fractions are invalid, the invalid fractions are ignored. For example, the result of the formula in Equation 2 is invalid when the results of pmA/pmB, pmC/pmD, and pmE/pmF are all invalid, and the result is pmC/pmD + pmE/pmF when only the result of pmA/pmB is invalid.

Formula = pmA/pmB + pmC/pmD + pmE/pmF

Equation 2 Formula Example

1.2 Dependencies and Associated Features

There are no specific dependencies or associated features for KPIs.
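The aggregation rule in Equation 1 and the invalid-fraction handling described for Equation 2 can be sketched as follows. This is an illustrative sketch only, not Ericsson tooling; the dictionary layout and the convention that a zero denominator marks a fraction as invalid are assumptions for the example:

```python
# Illustrative sketch (assumption: counters arrive as per-cell dicts and a
# zero denominator marks a fraction as invalid).

def aggregate_counter(cells, counter):
    """pmXyz(Group): sum one PM counter over all cells in a group."""
    return sum(cell[counter] for cell in cells)

def sum_of_fractions(pairs):
    """Sum numerator/denominator fractions, ignoring invalid ones.

    Returns None when every fraction is invalid, mirroring the rule that
    the whole formula is then invalid.
    """
    valid = [(num, den) for num, den in pairs if den != 0]
    if not valid:
        return None
    return sum(num / den for num, den in valid)

# Two cells reported in the same 15-minute period:
cells = [
    {"pmErabEstabSuccInit": 90, "pmErabEstabAttInit": 100},
    {"pmErabEstabSuccInit": 45, "pmErabEstabAttInit": 50},
]
group_succ = aggregate_counter(cells, "pmErabEstabSuccInit")
group_att = aggregate_counter(cells, "pmErabEstabAttInit")
```

The same helper applies unchanged to time aggregation: replace the list of cells with a list of 15-minute reports for one cell.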


2 Key Performance Indicators

The following sections describe the available KPIs, organized by category.

2.1 Accessibility

In providing wireless end-user services, the first step is to get access to the wireless service. Once access to the service has been obtained, the service can be used. The service provided by LTE RAN for accessibility is the Evolved Radio Access Bearer (E-RAB). The E-RAB establishment success rate is calculated separately depending on whether the E-RAB is established with the E-RAB Setup or the Initial Context Setup procedure; the latter depends on the successful establishment of the RRC connection and S1 signalling between the RBS and the Mobility Management Entity (MME). For additional information about the Initial Context Setup and E-RAB Setup procedures, see the document Radio Bearer Service.

2.1.1 Initial E-RAB Establishment Success Rate

This KPI measures the impact on the end user. The following equation gives the accessibility success rate for end-user services that are carried by E-RABs included in the Initial UE Context Setup procedure. The counters are on cell level.

Initial E-RAB Establishment SR [%] =
100 × pmRrcConnEstabSucc / (pmRrcConnEstabAtt − pmRrcConnEstabAttReatt)
× pmS1SigConnEstabSucc / pmS1SigConnEstabAtt
× pmErabEstabSuccInit / pmErabEstabAttInit

Equation 3 Initial E-RAB Establishment Success Rate

Note: The impact of multiple E-RABs in the Initial Context Setup procedure is ignored in the RRC and S1 signalling contributions.
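As a minimal numeric sketch of Equation 3 (counter values are invented and the helper function is illustrative, not part of the product), the KPI is the product of the RRC, S1 signalling, and initial E-RAB setup success ratios:

```python
# Sketch of Equation 3 with invented counter values.

def initial_erab_establishment_sr(c):
    """Initial E-RAB Establishment Success Rate [%] per Equation 3."""
    rrc = c["pmRrcConnEstabSucc"] / (
        c["pmRrcConnEstabAtt"] - c["pmRrcConnEstabAttReatt"])
    s1 = c["pmS1SigConnEstabSucc"] / c["pmS1SigConnEstabAtt"]
    erab = c["pmErabEstabSuccInit"] / c["pmErabEstabAttInit"]
    return 100.0 * rrc * s1 * erab

counters = {
    "pmRrcConnEstabSucc": 980,
    "pmRrcConnEstabAtt": 1020,
    "pmRrcConnEstabAttReatt": 20,
    "pmS1SigConnEstabSucc": 990,
    "pmS1SigConnEstabAtt": 1000,
    "pmErabEstabSuccInit": 970,
    "pmErabEstabAttInit": 1000,
}
# 0.98 * 0.99 * 0.97 * 100, roughly 94.11 %
```

Note how re-attempts are subtracted from the RRC attempts before the ratio is formed, so retried connection requests do not count as extra failures.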

2.1.2 Initial E-RAB Establishment Success Rate per QCI

This KPI measures the impact on the end user. The following equation gives the accessibility success rate for end-user services that are carried by E-RABs included in the Initial UE Context Setup procedure. The counters are on cell level and per QCI.

Initial E-RAB Establishment SR, QCI [%] =
100 × pmRrcConnEstabSucc / (pmRrcConnEstabAtt − pmRrcConnEstabAttReatt)
× pmS1SigConnEstabSucc / pmS1SigConnEstabAtt
× pmErabEstabSuccInitQci / pmErabEstabAttInitQci

Equation 4 Initial E-RAB Establishment Success Rate per QCI

Note: The impact of multiple E-RABs in the Initial Context Setup procedure is ignored in the RRC and S1 signalling contributions.

2.1.3 Added E-RAB Establishment Success Rate

This KPI measures the impact on the end user. The following equation gives the accessibility success rate for end-user services that are carried by E-RABs included in the E-RAB Setup procedure. The counters are on cell level.

Added E-RAB Establishment SR [%] = 100 × pmErabEstabSuccAdded / pmErabEstabAttAdded

Equation 5 Added E-RAB Establishment Success Rate

2.1.4 Added E-RAB Establishment Success Rate per QCI

This KPI measures the impact on the end user. The following equation gives the accessibility success rate for end-user services that are carried by E-RABs included in the E-RAB Setup procedure. The counters are on cell level and per QCI.

Added E-RAB Establishment SR, QCI [%] = 100 × pmErabEstabSuccAddedQci / pmErabEstabAttAddedQci

Equation 6 Added E-RAB Establishment Success Rate per QCI
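Equations 5 and 6 are plain success ratios; the per-QCI variant simply evaluates the same ratio for each QCI. A sketch follows, with per-QCI counters modeled as dicts keyed by QCI, which is an assumption about data layout made for the example:

```python
# Sketch of Equation 6: Added E-RAB Establishment SR per QCI.
# Input dicts map QCI -> counter value (illustrative layout).

def added_erab_sr_per_qci(succ_by_qci, att_by_qci):
    """Return {QCI: success rate in %} for each QCI with attempts."""
    return {qci: 100.0 * succ_by_qci.get(qci, 0) / att
            for qci, att in att_by_qci.items() if att > 0}

succ = {1: 95, 9: 900}   # pmErabEstabSuccAddedQci per QCI
att = {1: 100, 9: 1000}  # pmErabEstabAttAddedQci per QCI
rates = added_erab_sr_per_qci(succ, att)
```

QCIs with zero attempts are skipped, following the invalid-fraction rule from Section 1.1.5.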

2.2 Retainability

In providing wireless end-user services, it is important that the services are not interrupted or ended prematurely. Retainability performance can be divided into two parts: the definition of an abnormal release, and the normalization factor.

Table 2 Retainability Aspects

Definition of an Abnormal Release: In a packet switched system such as LTE, it is natural to establish and release E-RABs. E-RABs do not have to be released just because they are inactive; they can be kept to provide fast access once new data arrives. The definition of an abnormal release is that the release of the E-RAB had a negative impact on the end user, hence that there was buffered data to be transmitted at the time of release. The following causes for release are excluded from the abnormal release definition:

- The release cause was considered normal, for example CS Fallback initiated.
- The E-RAB was released due to successful handover.

Normalization Factor for the Abnormal Releases: The number of abnormal releases has to be normalized to be meaningful. The traditional approach from the speech world has been to compare abnormal releases with the total number of releases. This method is suitable for a system where the session length is stable, that is, a call is a call and the only information wanted is whether the call could be completed without an abnormal release. In a packet switched system, where the service mix is unknown and hence the session length varies greatly, comparing abnormal releases with the total number of releases does not work well and does not reflect end-user satisfaction as well as it does for speech. Consider, for example, one user downloading movies over p2p for 24 hours. One abnormal release during those 24 hours is hardly considered bad from the end-user perspective, even if 100% of the released E-RABs in this example were abnormal. If instead the number of abnormal releases is compared with the session length, the result reflects end-user satisfaction well.

Considering this, it is recommended to keep track of two Retainability measurements: one where the normalization is session time, and one where the normalization is the total number of releases.
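As a numeric illustration of the two normalizations just described (the counters and formulas are defined in Sections 2.2.1 and 2.2.2; the values here are invented):

```python
# Sketch of Equations 7 and 8 with invented counter values.

def retainability_per_second(c):
    """Equation 7: abnormal E-RAB releases per second of session time."""
    drops = c["pmErabRelAbnormalEnbAct"] + c["pmErabRelMmeAct"]
    return drops / c["pmSessionTimeUe"]

def minutes_per_drop(c):
    """Multiplicative inverse of Equation 7, converted to minutes."""
    return 1.0 / retainability_per_second(c) / 60.0

def retainability_percent(c):
    """Equation 8: share of released E-RABs that were abnormal."""
    drops = c["pmErabRelAbnormalEnbAct"] + c["pmErabRelMmeAct"]
    releases = (c["pmErabRelAbnormalEnb"] + c["pmErabRelNormalEnb"]
                + c["pmErabRelMme"])
    return 100.0 * drops / releases

counters = {
    "pmErabRelAbnormalEnbAct": 4,
    "pmErabRelMmeAct": 1,
    "pmSessionTimeUe": 36000,   # seconds of active session time
    "pmErabRelAbnormalEnb": 6,
    "pmErabRelNormalEnb": 480,
    "pmErabRelMme": 14,
}
```

With these values the session-time KPI is 5/36000 drops per second, that is, one drop per 120 minutes of transfer, while the percentage KPI is 1.0% (5 abnormal releases out of 500); the p2p example in Table 2 shows why the two views can diverge.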

2.2.1 E-RAB Retainability - Session Time Normalized

This KPI measures the impact on the end user, with the purpose of reflecting how long a transfer on average can be kept without an abnormal release. The Retainability rate for E-RABs is given by the following equation. The counters are on cell level.

E-RAB Retainability [1/s] = (pmErabRelAbnormalEnbAct + pmErabRelMmeAct) / pmSessionTimeUe

Equation 7 E-RAB Retainability - Session Time Normalized

Note: Since the KPI measures the impact of the network on the end user, it also includes releases initiated by the MME. To observe the impact of the RBS only, remove pmErabRelMmeAct from the formula. To obtain the number of minutes per drop, take the multiplicative inverse of the E-RAB Retainability and convert from seconds to minutes.

2.2.2 E-RAB Retainability - Percentage

This KPI measures the impact on the end user, with the purpose of reflecting the probability that an established E-RAB can be kept without an abnormal release. The Retainability rate for E-RABs is given by the following equation. The counters are on cell level.

E-RAB Retainability [%] = 100 × (pmErabRelAbnormalEnbAct + pmErabRelMmeAct) / (pmErabRelAbnormalEnb + pmErabRelNormalEnb + pmErabRelMme)

Equation 8 E-RAB Retainability - Percentage

Note: Since the KPI measures the impact of the network on the end user, it also includes releases initiated by the MME. To observe the impact of the RBS only, remove pmErabRelMmeAct from the formula.

2.3 Integrity

In providing wireless end-user services, it is important that the end-user performance quality meets expectations. The LTE RAN service is delivery of IP packets. Integrity performance can be divided into three parts:

Table 3 End-User Performance Quality Aspects

Latency: The time it takes to schedule the first packet on the air interface, measured from the time the packet was received in the RBS. The uplink (UL) latency cannot be measured in LTE RAN, since only the UE knows when data arrives in an empty UL buffer.

Throughput: The speed at which packets can be transferred once the first packet has been scheduled on the air interface.

Packet Loss: The Packet Loss Rate can be broken down into:

- the rate of congestion-related packet losses (for example, packets dropped by active queue management functionality)
- the rate of non-congestion-related packet losses (packets lost in transmission, for example, discarded by a link layer receiver due to CRC failure)

Latency is defined to exclude the impact of the traffic model. The size of the first packet shall not make a difference, so the measurement considers when the packet is scheduled on the air interface, not when the whole packet is acknowledged. Throughput is also defined to exclude the impact of the traffic model. The file size of the transfer and the number of bursts shall not make a difference, so the idle time (T_Idle) between transfer bursts is removed, as shown in the following figure:

Figure 3 Session Example of Burst in Transfer

The contribution from the last TTI in downlink is removed to exclude the impact of the traffic model: the coding for the last TTI can be selected based on size rather than radio conditions, and is hence not end-user impacting. Figure 4 illustrates the downlink throughput and latency calculation for one burst: the latency sample runs from when data arrives in an empty DL buffer until the first data is transmitted to the UE, and the time used for the throughput calculation runs from the first transmission until the send buffer is empty again, always excluding the last TTI with data. The burst includes TTIs with successful transmission, failed transmission (block error), and TTIs with no transmission while the buffer is not empty (for example, due to contention).

Figure 4 Example of Downlink Throughput and Downlink Latency for Calculating Contribution of One Burst

In uplink, a different method is used to exclude the impact of the traffic model; reusing the downlink last-TTI method does not work well in uplink due to the time difference between the scheduling decision by the RBS and the transmission by the UE. As shown in Figure 5, after data arrives in an empty UL buffer, a scheduling request is sent to the RBS and grants are sent to the UE; the contributions from the last TTI and from the first four receptions are removed from the time and volume used for the throughput calculation.

Figure 5 Example of Uplink Throughput for Calculating Contribution of One Burst

In the downlink (DL), the KPI is defined to consider packet losses that have a negative impact on end-user performance, that is, the Packet Error Loss Rate of non-congestion-related packet losses. In the UL, lost packets are derived from PDCP sequence numbers from the UE, so the RBS cannot distinguish whether a packet was lost for congestion or non-congestion related reasons. Hence the UL KPI is defined as Packet Loss (and not Packet Error Loss as in DL).

2.3.1 Downlink Latency

This KPI measures the impact on the end user. The DL latency for the UE is given by the following equation. The counters are on cell level.

DL Latency [ms] = pmPdcpLatTimeDl / pmPdcpLatPktTransDl

Equation 9 Average UE DL Latency

2.3.2 Downlink Latency per QCI

This KPI measures the impact on the end user. The DL latency for the UE is given by the following equation. The counters are on cell level and per QCI.

DL Latency, QCI [ms] = pmPdcpLatTimeDlQci / pmPdcpLatPktTransDlQci

Equation 10 Average UE DL Latency per QCI

2.3.3 Downlink Throughput

This KPI measures the impact on the end user. The DL throughput for the UE is given by the following equation. The counters are on cell level.

DL Throughput [kbps] = (pmPdcpVolDlDrb − pmPdcpVolDlDrbLastTTI + pmPdcpVolDlDrbTransUm) / (pmUeThpTimeDl / 1000)

Equation 11 Average UE DL Throughput
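The Integrity KPIs reduce to simple counter ratios, and a consolidated sketch of Equations 9 and 11 above, together with Equations 12-14 from the following subsections, may help. This is illustrative only: the units (volume counters in kbit, time counters in ms) and the dict-based counter layout are assumptions for the example, and all values are invented.

```python
# Sketch of the Integrity KPIs (Equations 9, 11, 12, 13, 14).
# Assumptions: volumes in kbit, times in ms, dict-based counters.

def dl_latency_ms(c):
    """Equation 9: average DL latency."""
    return c["pmPdcpLatTimeDl"] / c["pmPdcpLatPktTransDl"]

def dl_throughput_kbps(c):
    """Equation 11: DL volume (minus last TTI) over active DL time."""
    vol = (c["pmPdcpVolDlDrb"] - c["pmPdcpVolDlDrbLastTTI"]
           + c["pmPdcpVolDlDrbTransUm"])
    return vol / (c["pmUeThpTimeDl"] / 1000.0)

def ul_throughput_kbps(c):
    """Equation 12: UL volume over active UL time."""
    return c["pmUeThpVolUl"] / (c["pmUeThpTimeUl"] / 1000.0)

def dl_packet_error_loss_rate(cell, rbs_eth_disc, rbs_received_dl):
    """Equation 13: the RBS-level counter pmPdcpPktDiscDlEth is shared
    out to the cell in proportion to received DL packets (term A)."""
    a = rbs_eth_disc * cell["pmPdcpPktReceivedDl"] / rbs_received_dl
    disc = (cell["pmPdcpPktDiscDlPelr"] + cell["pmPdcpPktDiscDlPelrUu"]
            + cell["pmPdcpPktDiscDlHo"] + a)
    denom = cell["pmPdcpPktReceivedDl"] - cell["pmPdcpPktFwdDl"] + a
    return 100.0 * disc / denom

def ul_packet_loss_rate(c):
    """Equation 14: lost packets over lost plus received packets."""
    lost = c["pmPdcpPktLostUl"]
    return 100.0 * lost / (lost + c["pmPdcpPktReceivedUl"])

cell = {
    "pmPdcpLatTimeDl": 30000, "pmPdcpLatPktTransDl": 1500,
    "pmPdcpVolDlDrb": 52000, "pmPdcpVolDlDrbLastTTI": 2000,
    "pmPdcpVolDlDrbTransUm": 0, "pmUeThpTimeDl": 10000,
    "pmUeThpVolUl": 20000, "pmUeThpTimeUl": 8000,
    "pmPdcpPktDiscDlPelr": 5, "pmPdcpPktDiscDlPelrUu": 3,
    "pmPdcpPktDiscDlHo": 2, "pmPdcpPktReceivedDl": 5000,
    "pmPdcpPktFwdDl": 20, "pmPdcpPktLostUl": 10,
    "pmPdcpPktReceivedUl": 990,
}
```

With these values: 20 ms DL latency, 5000 kbps DL and 2500 kbps UL throughput, 0.6% DL packet error loss (with the cell receiving half of the RBS traffic), and 1.0% UL packet loss.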

2.3.4 Uplink Throughput

This KPI measures the impact on the end user. The UL throughput for the UE is given by the following equation. The counters are on cell level.

UL Throughput [kbps] = pmUeThpVolUl / (pmUeThpTimeUl / 1000)

Equation 12 Average UE UL Throughput

2.3.5 Downlink Packet Error Loss Rate

This KPI measures the impact on the end user. The DL packet error loss rate for the UE is given by the following equation. The counters are all on cell level, except pmPdcpPktDiscDlEth, which is on RBS level.

DL Packet Error Loss Rate, cell [%] = 100 × (pmPdcpPktDiscDlPelr + pmPdcpPktDiscDlPelrUu + pmPdcpPktDiscDlHo + A) / (pmPdcpPktReceivedDl − pmPdcpPktFwdDl + A)

where

A = pmPdcpPktDiscDlEth × pmPdcpPktReceivedDl / Σ_RBS pmPdcpPktReceivedDl

Equation 13 Average UE DL Packet Error Loss Rate, from a cell perspective

2.3.6 Uplink Packet Loss Rate

This KPI measures the impact on the end user. The UL packet loss rate for the UE is given by the following equation. The counters are on cell level.

UL Packet Loss Rate [%] = 100 × pmPdcpPktLostUl / (pmPdcpPktLostUl + pmPdcpPktReceivedUl)

Equation 14 Average UE UL Packet Loss Rate

2.4 Mobility

This section describes the KPI for mobility success rate.

2.4.1 Mobility Success Rate

This KPI measures system performance. The Mobility Success Rate includes both the preparation of target cell resources and the move from the source cell to the target cell, as given by the following equation. The counters are on the MO EUtranCellRelation or UtranCellRelation level:

Mobility Success Rate [%] =
100 × (pmHoPrepSuccLteIntraF + pmHoPrepSuccLteInterF + pmHoPrepSuccWcdma) / (pmHoPrepAttLteIntraF + pmHoPrepAttLteInterF + pmHoPrepAttWcdma)
× (pmHoExeSuccLteIntraF + pmHoExeSuccLteInterF + pmHoExeSuccWcdma) / (pmHoExeAttLteIntraF + pmHoExeAttLteInterF + pmHoExeAttWcdma)

Equation 15 Mobility Success Rate

2.5 Availability

This section describes the KPI for availability.

2.5.1 Partial Cell Availability (Node Restarts Excluded)

This KPI measures system performance. Since the KPI is measured by the RBS, it does not include time when the RBS is down, that is, node restart time is excluded. Cell availability is defined as the length of time, in seconds, that a cell is available for service. The cell availability for a cluster of L cells during N reporting periods can be calculated using the following formula. The counters are on cell level.

Cell Availability [%] = 100 × (N × L × 900 − Σ (pmCellDowntimeAuto + pmCellDowntimeMan)) / (N × L × 900)

Equation 16 Cell Availability

Note: The manual blocking time of a cell is included in this KPI to show the overall availability of the cell. To remove the impact of manual intervention on cell availability, remove the PM counter pmCellDowntimeMan from the numerator and subtract the value of pmCellDowntimeMan from the denominator. If files with the PM counters are missing, the time that those files represent shall be excluded from the N × L × 900 terms in the Cell Availability result.
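The Mobility and Availability formulas above can be sketched the same way. These are illustrative helpers with invented values; N and L follow Equation 16, with 900-second (15-minute) reporting periods:

```python
# Sketch of Equations 15 and 16 with invented counter values.

def mobility_success_rate(c):
    """Equation 15: handover preparation ratio times execution ratio."""
    prep = (c["pmHoPrepSuccLteIntraF"] + c["pmHoPrepSuccLteInterF"]
            + c["pmHoPrepSuccWcdma"]) / (
        c["pmHoPrepAttLteIntraF"] + c["pmHoPrepAttLteInterF"]
        + c["pmHoPrepAttWcdma"])
    exe = (c["pmHoExeSuccLteIntraF"] + c["pmHoExeSuccLteInterF"]
           + c["pmHoExeSuccWcdma"]) / (
        c["pmHoExeAttLteIntraF"] + c["pmHoExeAttLteInterF"]
        + c["pmHoExeAttWcdma"])
    return 100.0 * prep * exe

def cell_availability_percent(downtime_seconds, n_periods, n_cells):
    """Equation 16: downtime_seconds holds, per cell, the sum of
    pmCellDowntimeAuto + pmCellDowntimeMan over the N periods."""
    total = n_periods * n_cells * 900
    return 100.0 * (total - sum(downtime_seconds)) / total

ho = {
    "pmHoPrepSuccLteIntraF": 80, "pmHoPrepSuccLteInterF": 15,
    "pmHoPrepSuccWcdma": 5, "pmHoPrepAttLteIntraF": 100,
    "pmHoPrepAttLteInterF": 20, "pmHoPrepAttWcdma": 5,
    "pmHoExeSuccLteIntraF": 70, "pmHoExeSuccLteInterF": 15,
    "pmHoExeSuccWcdma": 5, "pmHoExeAttLteIntraF": 75,
    "pmHoExeAttLteInterF": 20, "pmHoExeAttWcdma": 5,
}
```

With these values the preparation ratio is 0.8 and the execution ratio 0.9, giving a Mobility Success Rate of 72%; two cells each down 360 s across four 15-minute periods give 90% availability.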