Enabler Test Specification for Device Management


Enabler Test Specification for Device Management
Candidate Version 1.3 - 08 Dec 2015
Open Mobile Alliance
OMA-ETS-DM-V1_3-20151208-C

OMA-ETS-DM-V1_3-20151208-C Page 2 (175) Use of this document is subject to all of the terms and conditions of the Use Agreement located at http://www.openmobilealliance.org/useagreement.html. Unless this document is clearly designated as an approved specification, this document is a work in process, is not an approved Open Mobile Alliance specification, and is subject to revision or removal without notice. You may use this document or any part of the document for internal or educational purposes only, provided you do not modify, edit or take out of context the information in this document in any manner. Information contained in this document may be used, at your sole risk, for any purposes. You may not use this document in any other manner without the prior written permission of the Open Mobile Alliance. The Open Mobile Alliance authorizes you to copy this document, provided that you retain all copyright and other proprietary notices contained in the original materials on any copies of the materials and that you comply strictly with these terms. This copyright permission does not constitute an endorsement of the products or services. The Open Mobile Alliance assumes no responsibility for errors or omissions in this document. Each Open Mobile Alliance member has agreed to use reasonable endeavors to inform the Open Mobile Alliance in a timely manner of Essential IPR as it becomes aware that the Essential IPR is related to the prepared or published specification. However, the members do not have an obligation to conduct IPR searches. The declared Essential IPR is publicly available to members and non-members of the Open Mobile Alliance and may be found on the OMA IPR Declarations list at http://www.openmobilealliance.org/ipr.html. 
The Open Mobile Alliance has not conducted an independent IPR review of this document and the information contained herein, and makes no representations or warranties regarding third party IPR, including without limitation patents, copyrights or trade secret rights. This document may contain inventions for which you must obtain licenses from third parties before making, using or selling the inventions. Defined terms above are set forth in the schedule to the Open Mobile Alliance Application Form. NO REPRESENTATIONS OR WARRANTIES (WHETHER EXPRESS OR IMPLIED) ARE MADE BY THE OPEN MOBILE ALLIANCE OR ANY OPEN MOBILE ALLIANCE MEMBER OR ITS AFFILIATES REGARDING ANY OF THE IPRs REPRESENTED ON THE OMA IPR DECLARATIONS LIST, INCLUDING, BUT NOT LIMITED TO THE ACCURACY, COMPLETENESS, VALIDITY OR RELEVANCE OF THE INFORMATION OR WHETHER OR NOT SUCH RIGHTS ARE ESSENTIAL OR NON-ESSENTIAL. THE OPEN MOBILE ALLIANCE IS NOT LIABLE FOR AND HEREBY DISCLAIMS ANY DIRECT, INDIRECT, PUNITIVE, SPECIAL, INCIDENTAL, CONSEQUENTIAL, OR EXEMPLARY DAMAGES ARISING OUT OF OR IN CONNECTION WITH THE USE OF DOCUMENTS AND THE INFORMATION CONTAINED IN THE DOCUMENTS. © 2015 Open Mobile Alliance Ltd. All Rights Reserved. Used with the permission of the Open Mobile Alliance Ltd. under the terms set forth above.

Contents

1. SCOPE ... 7
2. REFERENCES ... 8
  2.1 NORMATIVE REFERENCES ... 8
  2.2 INFORMATIVE REFERENCES ... 8
3. TERMINOLOGY AND CONVENTIONS ... 9
  3.1 CONVENTIONS ... 9
  3.2 DEFINITIONS ... 9
  3.3 ABBREVIATIONS ... 9
4. INTRODUCTION ... 10
5. DEVICE MANAGEMENT CLIENT CONFORMANCE TEST CASES ... 11
  5.1 DEVICE MANAGEMENT CLIENT CONFORMANCE TEST GROUP #1 ... 11
    5.1.1 DeviceManagement-v1.3-client-con-0102 ... 11
    5.1.2 DeviceManagement-v1.3-client-con-0103 ... 12
    5.1.3 DeviceManagement-v1.3-client-con-0104 ... 13
  5.2 DEVICE MANAGEMENT CLIENT CONFORMANCE TEST GROUP #2 ... 14
    5.2.1 DeviceManagement-v1.3-client-con-0201 ... 14
  5.3 DEVICE MANAGEMENT CLIENT CONFORMANCE TEST GROUP #3 ... 15
    5.3.1 DeviceManagement-v1.3-client-con-0301 ... 15
    5.3.2 DeviceManagement-v1.3-client-con-0302 ... 16
    5.3.3 DeviceManagement-v1.3-client-con-0303 ... 18
    5.3.4 DeviceManagement-v1.3-client-con-0304 ... 19
  5.4 DEVICE MANAGEMENT CLIENT CONFORMANCE TEST GROUP #4 ... 20
    5.4.1 DeviceManagement-v1.3-client-con-0401 ... 20
  5.5 DEVICE MANAGEMENT CLIENT CONFORMANCE TEST GROUP #5 ... 22
    5.5.1 DeviceManagement-v1.3-client-con-0501 ... 22
    5.5.2 DeviceManagement-v1.3-client-con-0502 ... 23
    5.5.3 DeviceManagement-v1.3-client-con-0503 ... 25
  5.6 DEVICE MANAGEMENT CLIENT CONFORMANCE TEST GROUP #6 ... 26
    5.6.1 DeviceManagement-v1.3-client-con-0601 ... 26
    5.6.2 DeviceManagement-v1.3-client-con-0602 ... 30
  5.7 DEVICE MANAGEMENT CLIENT CONFORMANCE TEST GROUP #7 ... 31
    5.7.1 DeviceManagement-v1.3-client-con-0701 ... 31
  5.8 DEVICE MANAGEMENT CLIENT CONFORMANCE TEST GROUP #8 ... 32
    5.8.1 DeviceManagement-v1.3-client-con-0801 ... 32
    5.8.2 DeviceManagement-v1.3-client-con-0802 ... 34
  5.9 DEVICE MANAGEMENT CLIENT CONFORMANCE TEST GROUP #9 ... 35
    5.9.1 DeviceManagement-v1.3-client-con-0901 ... 35
    5.9.2 DeviceManagement-v1.3-client-con-0902 ... 36
    5.9.3 DeviceManagement-v1.3-client-con-0903 ... 37
  5.10 DEVICE MANAGEMENT CLIENT CONFORMANCE TEST GROUP #10 ... 38
    5.10.1 DeviceManagement-v1.3-client-con-1001 ... 38
  5.11 DEVICE MANAGEMENT CLIENT CONFORMANCE TEST GROUP #11 ... 40
    5.11.1 DeviceManagement-v1.3-client-con-1101 ... 40
  5.12 DEVICE MANAGEMENT CLIENT CONFORMANCE TEST GROUP #12 ... 44
    5.12.1 DeviceManagement-v1.3-client-con-1201 ... 44
    5.12.2 DeviceManagement-v1.3-client-con-1202 ... 45
    5.12.3 DeviceManagement-v1.3-client-con-1203 ... 47
  5.13 DEVICE MANAGEMENT CLIENT CONFORMANCE TEST GROUP #13 ... 56
    5.13.1 DeviceManagement-v1.3-client-con-1301 ... 56
    5.13.2 DeviceManagement-v1.3-client-con-1302 ... 58
    5.13.3 DeviceManagement-v1.3-client-con-1303 ... 60
    5.13.4 DeviceManagement-v1.3-client-con-1304 ... 61
    5.13.5 DeviceManagement-v1.3-client-con-1305 ... 62

    5.13.6 DeviceManagement-v1.3-client-con-1306 ... 63
    5.13.7 DeviceManagement-v1.3-client-con-1307 ... 64
    5.13.8 DeviceManagement-v1.3-client-con-1308 ... 66
  5.14 DEVICE MANAGEMENT CLIENT CONFORMANCE TEST GROUP #14 ... 67
    5.14.1 DeviceManagement-v1.3-client-con-1401 ... 67
  5.15 DEVICE MANAGEMENT CLIENT CONFORMANCE TEST GROUP #15 ... 69
    5.15.1 DeviceManagement-v1.3-client-con-1501 ... 69
  5.16 DEVICE MANAGEMENT CLIENT CONFORMANCE TEST GROUP #16 ... 71
    5.16.1 DeviceManagement-v1.3-client-con-1601 ... 71
  5.17 DEVICE MANAGEMENT CLIENT CONFORMANCE TEST GROUP #17 ... 73
    5.17.1 DeviceManagement-v1.3-client-con-1701 ... 73
    5.17.2 DeviceManagement-v1.3-client-con-1702 ... 74
    5.17.3 DeviceManagement-v1.3-client-con-1703 ... 76
    5.17.4 DeviceManagement-v1.3-client-con-1704 ... 77
  5.18 DEVICE MANAGEMENT CLIENT CONFORMANCE TEST GROUP #18 ... 78
    5.18.1 DeviceManagement-v1.3-client-con-1801 ... 78
  5.19 DEVICE MANAGEMENT CLIENT CONFORMANCE TEST GROUP #19 ... 79
    5.19.1 DeviceManagement-v1.3-client-con-1901 ... 79
  5.20 DEVICE MANAGEMENT CLIENT CONFORMANCE TEST GROUP #20 ... 79
    5.20.1 DeviceManagement-v1.3-client-con-2001 ... 79
  5.21 DEVICE MANAGEMENT CLIENT CONFORMANCE TEST GROUP #21 ... 80
    5.21.1 DeviceManagement-v1.3-client-con-2101 ... 80
    5.21.2 DeviceManagement-v1.3-client-con-2102 ... 81
  5.22 DEVICE MANAGEMENT CLIENT CONFORMANCE TEST GROUP #22 ... 81
    5.22.1 DeviceManagement-v1.3-client-con-2201 ... 81
  5.23 DEVICE MANAGEMENT CLIENT CONFORMANCE TEST GROUP #23 ... 83
    5.23.1 DeviceManagement-v1.3-client-con-2301 ... 83
  5.24 DEVICE MANAGEMENT CLIENT CONFORMANCE TEST GROUP #24 ... 84
    5.24.1 DeviceManagement-v1.3-client-con-2401 ... 84
  5.25 DEVICE MANAGEMENT CLIENT CONFORMANCE TEST GROUP #25 ... 85
    5.25.1 DeviceManagement-v1.3-client-con-2501 ... 85
  5.26 DEVICE MANAGEMENT CLIENT CONFORMANCE TEST GROUP #26 ... 85
    5.26.1 DeviceManagement-v1.3-client-con-2601 ... 85
    5.26.2 DeviceManagement-v1.3-client-con-2602 ... 87
6. DEVICE MANAGEMENT SERVER CONFORMANCE TEST CASES ... 88
  6.1 DEVICE MANAGEMENT SERVER CONFORMANCE TEST GROUP #1 ... 88
    6.1.1 DeviceManagement-v1.3-server-con-0101 ... 88
  6.2 DEVICE MANAGEMENT SERVER CONFORMANCE TEST GROUP #2 ... 88
    6.2.1 DeviceManagement-v1.3-server-con-0201 ... 88
    6.2.2 DeviceManagement-v1.3-server-con-0202 ... 89
    6.2.3 DeviceManagement-v1.3-server-con-0203 ... 89
    6.2.4 DeviceManagement-v1.3-server-con-0204 ... 89
  6.3 DEVICE MANAGEMENT SERVER CONFORMANCE TEST GROUP #3 ... 90
    6.3.1 DeviceManagement-v1.3-server-con-0301 ... 90
  6.4 DEVICE MANAGEMENT SERVER CONFORMANCE TEST GROUP #4 ... 90
    6.4.1 DeviceManagement-v1.3-server-con-0401 ... 90
  6.5 DEVICE MANAGEMENT SERVER CONFORMANCE TEST GROUP #5 ... 91
    6.5.1 DeviceManagement-v1.3-server-con-0501 ... 91
  6.6 DEVICE MANAGEMENT SERVER CONFORMANCE TEST GROUP #6 ... 91
    6.6.1 DeviceManagement-v1.3-server-con-0601 ... 91
  6.7 DEVICE MANAGEMENT SERVER CONFORMANCE TEST GROUP #7 ... 92
    6.7.1 DeviceManagement-v1.3-server-con-0701 ... 92
  6.8 DEVICE MANAGEMENT SERVER CONFORMANCE TEST GROUP #8 ... 92
    6.8.1 DeviceManagement-v1.3-server-con-0801 ... 92
  6.9 DEVICE MANAGEMENT SERVER CONFORMANCE TEST GROUP #9 ... 93
    6.9.1 DeviceManagement-v1.3-server-con-0901 ... 93
  6.10 DEVICE MANAGEMENT SERVER CONFORMANCE TEST GROUP #10 ... 93

    6.10.1 DeviceManagement-v1.3-server-con-1001 ... 93
    6.10.2 DeviceManagement-v1.3-server-con-1002 ... 94
  6.11 DEVICE MANAGEMENT SERVER CONFORMANCE TEST GROUP #11 ... 94
    6.11.1 DeviceManagement-v1.3-server-con-1101 ... 94
    6.11.2 DeviceManagement-v1.3-server-con-1102 ... 95
  6.12 DEVICE MANAGEMENT SERVER CONFORMANCE TEST GROUP #12 ... 95
    6.12.1 DeviceManagement-v1.3-server-con-1201 ... 95
  6.13 DEVICE MANAGEMENT SERVER CONFORMANCE TEST GROUP #13 ... 96
    6.13.1 DeviceManagement-v1.3-server-con-1301 ... 96
  6.14 DEVICE MANAGEMENT SERVER CONFORMANCE TEST GROUP #14 ... 96
    6.14.1 DeviceManagement-v1.3-server-con-1401 ... 96
  6.15 DEVICE MANAGEMENT SERVER CONFORMANCE TEST GROUP #15 ... 97
    6.15.1 DeviceManagement-v1.3-server-con-1501 ... 97
  6.16 DEVICE MANAGEMENT SERVER CONFORMANCE TEST GROUP #16 ... 97
    6.16.1 DeviceManagement-v1.3-server-con-1601 ... 97
  6.17 DEVICE MANAGEMENT SERVER CONFORMANCE TEST GROUP #17 ... 97
    6.17.1 DeviceManagement-v1.3-server-con-1701 ... 97
  6.18 DEVICE MANAGEMENT SERVER CONFORMANCE TEST GROUP #18 ... 98
    6.18.1 DeviceManagement-v1.3-server-con-1801 ... 98
  6.19 DEVICE MANAGEMENT SERVER CONFORMANCE TEST GROUP #19 ... 98
    6.19.1 DeviceManagement-v1.3-server-con-1901 ... 98
  6.20 DEVICE MANAGEMENT SERVER CONFORMANCE TEST GROUP #20 ... 98
    6.20.1 DeviceManagement-v1.3-server-con-2001 ... 98
  6.21 DEVICE MANAGEMENT SERVER CONFORMANCE TEST GROUP #21 ... 99
    6.21.1 DeviceManagement-v1.3-server-con-2101 ... 99
    6.21.2 DeviceManagement-v1.3-server-con-2102 ... 100
7. DEVICE MANAGEMENT INTEROPERABILITY TEST CASES ... 102
  7.1 DEVICEMANAGEMENT-V1.3-INT-001 ... 102
  7.2 DEVICEMANAGEMENT-V1.3-INT-002 ... 103
  7.3 DEVICEMANAGEMENT-V1.3-INT-003 ... 104
  7.4 DEVICEMANAGEMENT-V1.3-INT-004 ... 105
  7.5 DEVICEMANAGEMENT-V1.3-INT-005 ... 106
  7.6 DEVICEMANAGEMENT-V1.3-INT-006 ... 107
  7.7 DEVICEMANAGEMENT-V1.3-INT-007 ... 108
  7.8 DEVICEMANAGEMENT-V1.3-INT-008 ... 109
  7.9 DEVICEMANAGEMENT-V1.3-INT-009 ... 110
  7.10 DEVICEMANAGEMENT-V1.3-INT-010 ... 111
  7.11 DEVICEMANAGEMENT-V1.3-INT-011 ... 112
  7.12 DEVICEMANAGEMENT-V1.3-INT-012 ... 113
  7.13 DEVICEMANAGEMENT-V1.3-INT-013 ... 114
  7.14 DEVICEMANAGEMENT-V1.3-INT-014 ... 115
  7.15 DEVICEMANAGEMENT-V1.3-INT-015 ... 116
  7.16 DEVICEMANAGEMENT-V1.3-INT-015A ... 117
  7.17 DEVICEMANAGEMENT-V1.3-INT-015B ... 118
  7.18 DEVICEMANAGEMENT-V1.3-INT-015C ... 119
  7.19 DEVICEMANAGEMENT-V1.3-INT-016 ... 120
  7.20 DEVICEMANAGEMENT-V1.3-INT-016B ... 120
  7.21 DEVICEMANAGEMENT-V1.3-INT-016C ... 122
  7.22 DEVICEMANAGEMENT-V1.3-INT-016D ... 123
  7.23 DEVICEMANAGEMENT-V1.3-INT-017 ... 124
  7.24 DEVICEMANAGEMENT-V1.3-INT-017A ... 125
  7.25 DEVICEMANAGEMENT-V1.3-INT-018 ... 126
  7.26 DEVICEMANAGEMENT-V1.3-INT-019 ... 127
  7.27 DEVICEMANAGEMENT-V1.3-INT-020 ... 128
  7.28 DEVICEMANAGEMENT-V1.3-INT-021 ... 129
  7.29 DEVICEMANAGEMENT-V1.3-INT-022 ... 129
  7.30 DEVICEMANAGEMENT-V1.3-INT-023 ... 130

  7.31 DEVICEMANAGEMENT-V1.3-INT-024 ... 131
  7.32 DEVICEMANAGEMENT-V1.3-INT-025 ... 131
  7.33 DEVICEMANAGEMENT-V1.3-INT-026 ... 132
  7.34 DEVICEMANAGEMENT-V1.3-INT-027 ... 133
  7.35 DEVICEMANAGEMENT-V1.3-INT-028 ... 134
  7.36 DEVICEMANAGEMENT-V1.3-INT-029 ... 134
  7.37 DEVICEMANAGEMENT-V1.3-INT-030 ... 135
  7.38 DEVICEMANAGEMENT-V1.3-INT-031 ... 135
  7.39 DEVICEMANAGEMENT-V1.3-INT-032 ... 136
  7.40 DEVICEMANAGEMENT-V1.3-INT-033 ... 136
  7.41 DEVICEMANAGEMENT-V1.3-INT-034 ... 137
  7.42 DEVICEMANAGEMENT-V1.3-INT-035 ... 138
  7.43 DEVICEMANAGEMENT-V1.3-INT-036 ... 138
  7.44 DEVICEMANAGEMENT-V1.3-INT-037 ... 139
  7.45 DEVICEMANAGEMENT-V1.3-INT-038 ... 140
  7.46 DEVICEMANAGEMENT-V1.3-INT-039 ... 140
  7.47 DEVICEMANAGEMENT-V1.3-INT-040 ... 141
  7.48 DEVICEMANAGEMENT-V1.3-INT-041 ... 142
  7.49 DEVICEMANAGEMENT-V1.3-INT-042 ... 142
APPENDIX A. CHANGE HISTORY (INFORMATIVE) ... 144
  A.1 APPROVED VERSION HISTORY ... 144
  A.2 DRAFT/CANDIDATE VERSION 1.3 HISTORY ... 144
APPENDIX B. REFERENCE CONFIGURATION MESSAGES (NORMATIVE) ... 145
  B.1 TNDS.XML ... 145
  B.2 CP_PROV_DOC_1.XML ... 149
APPENDIX C. OMA DM PROTOCOL PACKAGES ... 151
  C.1 PACKAGE 0: MANAGEMENT INITIATION ALERT FROM SERVER TO CLIENT ... 151
  C.2 PACKAGE 1: INITIALIZATION FROM CLIENT TO SERVER ... 151
  C.3 PACKAGE 2: INITIALIZATION FROM SERVER TO CLIENT ... 151
  C.4 PACKAGE 3: CLIENT RESPONSE SENT TO SERVER ... 152
  C.5 PACKAGE 4: FURTHER SERVER MANAGEMENT OPERATIONS ... 152
APPENDIX D. TEST CASES APPLICABILITY ... 153
  D.1 INTRODUCTION ... 153
  D.2 CLIENT TEST CASES TESTING ONLY MANDATORY FEATURES ... 153
  D.3 CLIENT ICS ... 153
  D.4 CLIENT IXIT ... 154
  D.5 CLIENT ICS/IXIT TO TEST CASE MAPPING ... 155
APPENDIX E. OPTIONAL MESSAGE HANDLING MACROS ... 156
  E.1 DM SESSION INITIALISATION MACRO ... 156
  E.2 DM AUTHENTICATION MACRO ... 156
  E.3 DM NODE CREATION MACRO ... 156
APPENDIX F. SCR MAPPING TO TEST CASE (INFORMATIVE) ... 157
  F.1 SCR FOR DM CLIENT ... 157
  F.2 SCR FOR DM SERVER ... 168

1. Scope

This document describes in detail the available test cases for the Device Management 1.3 Enabler Release, http://www.openmobilealliance.org/. The test cases are split into two categories: conformance and interoperability test cases. The conformance test cases verify adherence to the normative requirements described in the technical specifications. The interoperability test cases verify that implementations of the specifications work satisfactorily. If either conformance or interoperability tests do not exist at the creation of the test specification, that part should be marked as not available.

2. References

2.1 Normative References

[DMBOOT] OMA Device Management Bootstrap, Version 1.3. Open Mobile Alliance. OMA-TS-DM-Bootstrap-V1_2. URL: http://www.openmobilealliance.org
[DMNOTI] OMA Device Management Notification Initiated Session, Version 1.3. Open Mobile Alliance. OMA-TS-DM-Notification-V1_2. URL: http://www.openmobilealliance.org
[DMPRO] OMA Device Management Protocol, Version 1.3. Open Mobile Alliance. OMA-TS-DM-Protocol-V1_2. URL: http://www.openmobilealliance.org
[DMREPU] OMA Device Management Representation Protocol, Version 1.3. Open Mobile Alliance. OMA-TS-DM-RepPro-V1_2. URL: http://www.openmobilealliance.org
[DMSEC] OMA Device Management Security, Version 1.3. Open Mobile Alliance. OMA-TS-DM-Security-V1_2. URL: http://www.openmobilealliance.org
[DMSESS] OMA Device Management Sessionless Message, Version 1.3. Open Mobile Alliance. OMA-TS-DM_Sessionless-V1_3. URL: http://www.openmobilealliance.org/
[DMSTDOBJ] OMA Device Management Standardized Objects, Version 1.3. Open Mobile Alliance. OMA-TS-DM-StdObj-V1_2. URL: http://www.openmobilealliance.org
[DMTND] OMA Device Management Tree and Description, Version 1.3. Open Mobile Alliance. OMA-TS-DM-TND-V1_2. URL: http://www.openmobilealliance.org
[DMTNDS] OMA Device Management Tree and Description Serialization, Version 1.3. Open Mobile Alliance. OMA-TS-DM-TNDS-V1_2. URL: http://www.openmobilealliance.org
[ERELD] Enabler Release Definition for Device Management, Version 1.3. Open Mobile Alliance. ERELD-DM-V1_2. URL: http://www.openmobilealliance.org
[HTTPBinding] OMA Device Management HTTP Binding Specification. Open Mobile Alliance. OMA-TS-DM_HTTPBinding-V1_3. URL: http://www.openmobilealliance.org/
[IOPPROC] OMA Interoperability Policy and Process, Version 1.6. Open Mobile Alliance. OMA-IOP-Process-V1_6. URL: http://www.openmobilealliance.org
[Meta] OMA Device Management Meta Information. Open Mobile Alliance. OMA-TS-DM_DM_MetaInfo-V1_3. URL: http://www.openmobilealliance.org/
[OBEXBinding] OMA Device Management OBEX Binding Specification. Open Mobile Alliance. OMA-TS-DM_OBEXBinding-V1_3. URL: http://www.openmobilealliance.org/
[PROVSC] Provisioning Smartcard, Version 1.1. Open Mobile Alliance. OMA-WAP-TS-ProvSC-V1_1. URL: http://www.openmobilealliance.org
[RFC2119] "Key words for use in RFCs to Indicate Requirement Levels", S. Bradner, March 1997. URL: http://www.ietf.org/rfc/rfc2119.txt
[WSPBinding] OMA Device Management WSP Binding Specification. Open Mobile Alliance. OMA-TS-DM_WSPBinding-V1_3. URL: http://www.openmobilealliance.org/

2.2 Informative References

[OMADICT] Dictionary for OMA Specifications, Version 2.6. Open Mobile Alliance. OMA-ORG-Dictionary-V2_6. URL: http://www.openmobilealliance.org/

3. Terminology and Conventions

3.1 Conventions

The key words MUST, MUST NOT, REQUIRED, SHALL, SHALL NOT, SHOULD, SHOULD NOT, RECOMMENDED, MAY, and OPTIONAL in this document are to be interpreted as described in [RFC2119]. All sections and appendixes, except Scope, are normative, unless they are explicitly indicated to be informative.

The following numbering scheme is used:

xxx-y.z-con-number

where:
  xxx     Name of enabler, e.g. MMS or Browsing
  y.z     Version of enabler release, e.g. 1.2 or 1.2.1
  con     Indicates this test is a conformance test case
  number  Leap number for the test case

or:

xxx-y.z-int-number

where:
  xxx     Name of enabler, e.g. MMS or Browsing
  y.z     Version of enabler release, e.g. 1.2 or 1.2.1
  int     Indicates this test is an interoperability test case
  number  Leap number for the test case

3.2 Definitions

<Leaf> or <Leaf#n>  Leaf node(s) that are configured to the SCTS before the testing is done (e.g. SwV and/or Name). The test case is driven against this configured node. The <Leaf> can be different between different Test Cases.
<Node>  Path from the root to the interior node that is configured to the SCTS before the testing is done (e.g. ./syncml/dmacc or ./devdetail). The test case is driven against this configured interior node. The <Node> can be different between different Test Cases.
SCTS  SyncML Conformance Test Suite.
Test Case  An individual test used to verify the conformance of the implementation under test to a particular mandatory feature of the protocol. A 4-digit number identifies Test Cases, where the first two digits denote the Test Group ID.
Test Group  A collection of Test Cases which are executed in a single SyncML session in the SCTS conformance test tool.

The implementation under test is referred to in this document as the Client.

3.3 Abbreviations

DM    Device Management
OMA   Open Mobile Alliance
SCTS  SyncML Conformance Test Suite

4. Introduction

This document describes in detail the available test cases for the Device Management 1.3 Enabler Release, http://www.openmobilealliance.org/. The test cases are split into two categories: conformance and interoperability test cases. The conformance test cases verify adherence to the normative requirements described in the technical specifications. The interoperability test cases verify that implementations of the specifications work satisfactorily. If either conformance or interoperability tests do not exist at the creation of the test specification, that part should be marked as not available. If an implementation states in its ICS that an optional feature is supported, then the tests for that optional feature are mandatory for that implementation.

5. Device Management Client Conformance Test Cases

5.1 Device Management Client Conformance Test Group #1

5.1.1 DeviceManagement-v1.3-client-con-0102

Test Case Id: DeviceManagement-v1.3-client-con-0102
Test Object: Client device
Test Case Description: To check if the client sent a valid Alert command.
Specification Reference: [DMREPU] Chapter 6.6.2; [DMREPU] Chapter 7 (Alert Codes); [DMPRO] Chapter 8.3
SCR Reference: DMREPPRO-PCE-C-001 (Support for sending Alert)
Tool: DM 1.3 conformance test tool
Preconditions: The client is not involved in a session with the test tool.
Test Procedure:
1. The client is triggered to initiate a request to the server.
2. The client sends a Setup-Request.
3. The Test Tool sends an OK response to the client to close the session.
Pass-Criteria: The client MUST send a valid Client Initiated Alert.
Step 2: The client MUST send a request with a valid client-initiated alert. Valid implies:
1. The Alert tag must have a CmdID tag and a Data tag as sub-elements.
2. The value of the Data tag must be 1201, showing that this is a client-initiated session.

MESSAGE SEQUENCE
Step 1: The client is triggered to initiate communication with the server.
Step 2 (Setup-Request): The client sends a Setup-Request.
Step 3 (Setup-Response): The Test Tool sends an OK response to the client to close the session.
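As a concrete illustration of the pass-criteria above, a package #1 body could carry an Alert such as the following sketch (the CmdID value is illustrative; only the element structure and the 1201 alert code are mandated):

```xml
<!-- Client-initiated session alert (package #1, client to server) -->
<Alert>
  <CmdID>2</CmdID>   <!-- any valid command identifier -->
  <Data>1201</Data>  <!-- 1201 = client-initiated management session -->
</Alert>
```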

5.1.2 DeviceManagement-v1.3-client-con-0103

Test Case Id: DeviceManagement-v1.3-client-con-0103
Test Object: Client device
Test Case Description: To check if the client sends Device Information.
Specification Reference: [DMREPU] Chapter 6.6.11; [DMPRO] Chapter 8.3
SCR Reference: DMREPPRO-PCE-C-002 (Support for Replace)
Tool: DM 1.3 conformance test tool
Preconditions: The client is not involved in a session with the test tool.
Test Procedure:
1. The client is triggered to initiate communication with the server.
2. The client sends a Setup-Request.
3. The Test Tool sends an OK response to the client to close the session.
Pass-Criteria: The client MUST send its Device Information in a Replace command.
Step 2: The client MUST send its Device Information in a Replace command. This implies that the Setup-Request shall contain a Replace tag which contains a CmdID tag and elements from the ./devinfo node; the latter represent the Device Information.

MESSAGE SEQUENCE
Step 1: The client is triggered to initiate communication with the server.
Step 2 (Setup-Request): The client sends a Setup-Request.
Step 3 (Setup-Response): The Test Tool sends an OK response to the client to close the session.
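A sketch of such a Replace command follows; the CmdID and the leaf values are illustrative placeholders, and the DevInfo leaf names used are the standardized ones from [DMSTDOBJ]:

```xml
<!-- Device Information carried in a Replace command (package #1) -->
<Replace>
  <CmdID>3</CmdID>
  <Item>
    <Source><LocURI>./DevInfo/DevId</LocURI></Source>
    <Data>IMEI:493005100592800</Data>   <!-- placeholder device identifier -->
  </Item>
  <Item>
    <Source><LocURI>./DevInfo/Man</LocURI></Source>
    <Data>ExampleManufacturer</Data>    <!-- placeholder manufacturer -->
  </Item>
  <Item>
    <Source><LocURI>./DevInfo/DmV</LocURI></Source>
    <Data>1.3</Data>                    <!-- DM client version -->
  </Item>
</Replace>
```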

5.1.3 DeviceManagement-v1.3-client-con-0104

Test Case Id: DeviceManagement-v1.3-client-con-0104
Test Object: Client device
Test Case Description: To check if the client's Source LocURI is the same as the value in ./devinfo/devid.
Specification Reference: [DMREPU] Chapter 6.1.10; [DMREPU] Chapter 6.6.11; [DMPRO] Chapter 8.3
SCR Reference: DMREPPRO-CUE-C-008 (Support for LocURI)
Tool: DM 1.3 conformance test tool
Preconditions: The client is not involved in a session with the test tool.
Test Procedure:
1. The client is triggered to initiate communication with the server.
2. The client sends a Setup-Request.
3. The Test Tool sends an OK response to the client to close the session.
Pass-Criteria: The value of Source LocURI in the SyncHdr sent by the client MUST be equal to the value sent in ./devinfo/devid.
Step 2: The client MUST send a Setup-Request as follows:
1. The Setup-Request shall contain a Replace tag, and this tag contains a CmdID tag and elements from the ./devinfo node. These represent the Device Information.
2. The value in the SyncHdr/Source/LocURI tag should be equal to the value in the DevInfo/DevID tag, the latter being a sub-element of the Replace tag.

MESSAGE SEQUENCE
Step 1: The client is triggered to initiate communication with the server.
Step 2 (Setup-Request): The client sends a Setup-Request.
Step 3 (Setup-Response): The Test Tool sends an OK response to the client to close the session.
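A minimal SyncHdr sketch showing the required match follows; the server address, session identifiers, and DevId value are hypothetical:

```xml
<!-- SyncHdr whose Source LocURI repeats the ./DevInfo/DevId value -->
<SyncHdr>
  <VerDTD>1.2</VerDTD>
  <VerProto>DM/1.3</VerProto>
  <SessionID>1</SessionID>
  <MsgID>1</MsgID>
  <Target><LocURI>http://dmserver.example.com/manage</LocURI></Target>
  <Source><LocURI>IMEI:493005100592800</LocURI></Source>
  <!-- the Source LocURI above must equal the value the client
       reports for ./DevInfo/DevId in the Replace command -->
</SyncHdr>
```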

5.2 Device Management Client Conformance Test Group #2

5.2.1 DeviceManagement-v1.3-client-con-0201

Test Case Id: DeviceManagement-v1.3-client-con-0201
Test Object: Client device
Test Case Description: To check if the client can switch the authentication scheme based on the challenge (MD5).
Specification Reference: [DMSEC] Chapter 5.3; [DMREPU] Chapter 6.1.6; [DMPRO] Chapter 9
SCR Reference:
  DM-SEC-C-001  Client must authenticate itself to a server
  DM-SEC-C-005  Send credentials to server; the client must support MD5 authentication
  DM-SEC-C-008  Support for OMA DM syncml:auth-md5 type authentication
Tool: DM 1.3 conformance test tool
Preconditions: The client is not involved in a session with the test tool.
Test Procedure:
1. The client is triggered to initiate communication with the server.
2. The client sends a Setup-Request.
3. The Test Tool receives the request and responds by sending a Challenge (Chal). In the challenge the server specifies that it is expecting MD5 authentication in the next request, and it also specifies the Nonce to be used.
4. The client resends the login, this time sending its credentials.
5. The Test Tool sends an OK response with status 212 to the client, telling the client that it is authenticated, or another status if the client credentials are not correct.
Pass-Criteria: The client MUST update its authentication scheme and send credentials using MD5 in the next session.
Step 4: The client MUST update its authentication scheme and send credentials using MD5 in the next session. This implies:
1. The test object sends its credentials as part of the <Cred> tag, using MD5 as the digest scheme.
2. The credentials sent by the test object must be the same as those saved on the server, thus confirming that the test object has indeed carried out the MD5 authentication correctly.

MESSAGE SEQUENCE
Step 1: The client is triggered to initiate communication with the server.
Step 2 (Setup-Request): The client sends a Setup-Request.

Step 3 (Setup-Response + Challenge): The test tool receives the request and responds by sending a Challenge (Chal). In the challenge the server specifies that it expects MD5 authentication in the next request, and also specifies the Nonce to be used.
Step 4 (Authentication using MD5): The client resends the login response, this time sending its credentials.
Step 5 (Login response): The test tool sends an OK response to the client with status 212, telling the client that it is authenticated, or an error status if the client credentials are not correct.

5.3 Device Management Client Conformance Test Group #3

5.3.1 DeviceManagement-v1.3-client-con-0301

Test Case Id: DeviceManagement-v1.3-client-con-0301
Test Object: Client device
Test Case Description: To check if the client supports the MD5 Digest authentication scheme.
Specification Reference: [DMSEC] Chapter 5.3, [DMREPU] Chapter 6.1.6, [DMPRO] Chapter 9
SCR Reference:
  DM-SEC-C-001  Client must authenticate itself to a server
  DM-SEC-C-005  Send credentials to server
  DM-SEC-C-008  Support for OMA DM syncml:auth-md5 type authentication
Test Tool: DM 1.3 conformance test tool

Test Procedure:
1. The client is triggered to initiate communication with the server.
2. The client sends a Setup-Request.
3. The credentials are correct; the server sends back the response confirming that the client has been successfully authenticated.

Pass-Criteria: The client MUST send valid credentials encoded using the MD5 Digest authentication scheme.

Step 2:
1. The test object sends its credentials as part of the <Cred> tag, using MD5 as the digest scheme.
2. The credentials sent by the test object must be the same as those saved on the server, thus confirming that the test object has indeed carried out the MD5 authentication correctly.
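The digest the client places in the <Cred> tag under the syncml:auth-md5 scheme is an MD5 hash chained over the base64'd user:password hash and the nonce, per [DMSEC]. A minimal sketch of that computation follows; the function names are ours, not from the specification:

```python
import base64
import hashlib

def b64(data: bytes) -> bytes:
    return base64.b64encode(data)

def md5(data: bytes) -> bytes:
    return hashlib.md5(data).digest()

def auth_md5_digest(username: str, password: str, nonce: bytes) -> bytes:
    """Credential for syncml:auth-md5: B64(MD5(B64(MD5(user:pass)):nonce))."""
    cred = b64(md5(f"{username}:{password}".encode()))
    return b64(md5(cred + b":" + nonce))
```

The server performs the same computation with the stored credentials and the nonce it issued in the Chal, and compares the two values byte for byte.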

MESSAGE SEQUENCE

Step 1: The client is triggered to initiate communication with the server.
Step 2 (Setup-Request): The client sends a Setup-Request.
Step 3 (Login response): The credentials are correct; the server sends back the response confirming that the client has been successfully authenticated.

5.3.2 DeviceManagement-v1.3-client-con-0302

Test Case Id: DeviceManagement-v1.3-client-con-0302
Test Object: Client device
Test Case Description: To check if the client responds with a Results for a Get on the Root node.
Specification Reference: [DMREPU] Chapter 6.6.7, [DMREPU] Chapter 6.6.12
SCR Reference:
  DMREPPRO-PCE-C-008  Support for receiving Get
  DMREPPRO-PCE-C-010  Support for sending Results
Test Tool: DM 1.3 conformance test tool
Preconditions: The test tool should have ACL access rights for Get on the Root node.

Test Procedure:
1. The test procedure starts with a DM Session Initialisation Macro (see E.1) with a Get command on the Root node ( . ) in the Setup-Response (step 3 of the macro).
2. If required by the client: DM Authentication Macro.
3. The client responds to the Get with a valid package #3 message (see C.4).
4. The test tool checks the response and sends a valid package #4 message (see C.5) to close the session.

Pass-Criteria: The client MUST respond with a Results containing at least the following elements: DevInfo, DevDetail.

Step 3: The client MUST send a valid package #3 (see C.4) as follows:
1. The message's SyncBody SHALL contain at least a Status tag with:
   a. a CmdRef tag set to the value of the CmdID which the Get used
   b. a Data tag set to 200
2. The message's SyncBody SHALL contain a Results tag with a Data tag containing at least the following node names, separated by a "/":
   a. DevInfo
   b. DevDetail

MESSAGE SEQUENCE

Step 1: The test procedure starts with a DM Session Initialisation Macro (see E.1) with a Get command on the Root node ( . ) in the Setup-Response (step 3 of the macro).
Step 2: If required by the client: DM Authentication Macro.
Step 3 (Client Response): The client responds to the Get with a valid package #3 message (see C.4).
Step 4 (Server Management Operations Message): The test tool checks the response and sends a valid package #4 message (see C.5) to close the session.
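The Step 3 checks above can be sketched as a small validator over a package #3 SyncBody. The XML fragment and CmdID values below are hypothetical, not taken from the test tool; tag names follow the SyncML representation in [DMREPU]:

```python
import xml.etree.ElementTree as ET

# Hypothetical package #3 SyncBody fragment (illustrative values).
PKG3 = """
<SyncBody>
  <Status>
    <CmdID>1</CmdID><MsgRef>1</MsgRef><CmdRef>5</CmdRef>
    <Cmd>Get</Cmd><Data>200</Data>
  </Status>
  <Results>
    <CmdID>2</CmdID><MsgRef>1</MsgRef><CmdRef>5</CmdRef>
    <Item>
      <Source><LocURI>.</LocURI></Source>
      <Data>DevInfo/DevDetail/Vendor</Data>
    </Item>
  </Results>
</SyncBody>
"""

def check_root_get(body_xml: str, get_cmdid: str) -> bool:
    """Apply the 0302 pass-criteria: Status 200 referencing the Get's CmdID,
    and a Results Data listing at least DevInfo and DevDetail."""
    body = ET.fromstring(body_xml)
    status = body.find("Status")
    ok = (status is not None
          and status.findtext("CmdRef") == get_cmdid
          and status.findtext("Data") == "200")
    results = body.find("Results")
    names = (results.findtext("Item/Data") or "").split("/") if results is not None else []
    return ok and {"DevInfo", "DevDetail"} <= set(names)
```

A real package #3 also carries the SyncHdr and a Status for the SyncHdr itself; the sketch checks only the two pass-criteria items.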

5.3.3 DeviceManagement-v1.3-client-con-0303

Test Case Id: DeviceManagement-v1.3-client-con-0303
Test Object: Client device
Test Case Description: To check if the client responds with a Results for a Get on a leaf node.
Specification Reference: [DMREPU] Chapter 6.6.7, [DMREPU] Chapter 6.6.12
SCR Reference:
  DMREPPRO-PCE-C-008  Support for receiving Get
  DMREPPRO-PCE-C-010  Support for sending Results
Test Tool: DM 1.3 conformance test tool
Preconditions: The test tool should have ACL access rights for Get on the leaf node. The client is not involved in a session with the server.

Test Procedure:
1. The test procedure starts with a DM Session Initialisation Macro (see E.1) with a Get command on a leaf node (e.g. ./DevInfo/Lang) in the Setup-Response (step 3 of the macro).
2. If required by the client: DM Authentication Macro.
3. The client responds to the Get with a valid package #3 message (see C.4).
4. The test tool checks the response and sends a valid package #4 message (see C.5) to close the session.

Pass-Criteria: The client MUST respond with a Results.

Step 3: The client MUST send a valid package #3 (see C.4) as follows:
1. The message's SyncBody SHALL contain at least a Status tag with:
   a. a CmdRef tag set to the value of the CmdID which the Get used
   b. a Data tag set to 200
2. The message's SyncBody SHALL contain a Results tag with a Data tag containing the content of the node.

MESSAGE SEQUENCE

Step 1: The test procedure starts with a DM Session Initialisation Macro (see E.1) with a Get command on a leaf node in the Setup-Response (step 3 of the macro).
Step 2: If required by the client: DM Authentication Macro.
Step 3 (Client Response): The client responds to the Get with a valid package #3 message (see C.4).
Step 4 (Server Management Operations Message): The test tool checks the response and sends a valid package #4 message (see C.5) to close the session.
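The Get command the test tool places in the Setup-Response for a leaf node can be rendered as a SyncML fragment along these lines (a sketch; the CmdID value and helper name are illustrative):

```python
def get_command(cmd_id: int, target_uri: str) -> str:
    """Render a SyncML Get command addressing one node."""
    return (
        f"<Get><CmdID>{cmd_id}</CmdID>"
        f"<Item><Target><LocURI>{target_uri}</LocURI></Target></Item>"
        f"</Get>"
    )
```

For example, get_command(3, "./DevInfo/Lang") yields the Get on the DevInfo language leaf node; the client's Status then carries CmdRef 3 and, on success, the Results carries the node's content.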

5.3.4 DeviceManagement-v1.3-client-con-0304

Test Case Id: DeviceManagement-v1.3-client-con-0304
Test Object: Client device
Test Case Description: To check if the client responds correctly to a Get on a non-existent node.
Specification Reference: [DMREPU] Chapter 6.6.7
SCR Reference:
  DMREPPRO-PCE-C-008  Support for receiving Get
Test Tool: DM 1.3 conformance test tool
Preconditions: The client is not involved in a session with the server.

Test Procedure:
1. The test procedure starts with a DM Session Initialisation Macro (see E.1) with a Get command on a non-existent node (e.g. ./nonexistantnode) in the Setup-Response (step 3 of the macro).
2. If required by the client: DM Authentication Macro.
3. The client responds to the Get with a valid package #3 message (see C.4).
4. The test tool checks the response and sends a valid package #4 message (see C.5) to close the session.

Pass-Criteria: The client MUST return a 404 status code for the Get.

Step 3: The client MUST send a valid package #3 (see C.4) as follows:
1. The message's SyncBody SHALL contain at least a Status tag with:
   a. a CmdRef tag set to the value of the CmdID which the Get used
   b. a Data tag set to 404

MESSAGE SEQUENCE

Step 1: The test procedure starts with a DM Session Initialisation Macro (see E.1) with a Get command on a non-existent node in the Setup-Response (step 3 of the macro).
Step 2: If required by the client: DM Authentication Macro.
Step 3 (Client Response): The client responds to the Get with a valid package #3 message (see C.4).
Step 4 (Server Management Operations Message): The test tool checks the response and sends a valid package #4 message (see C.5) to close the session.

5.4 Device Management Client Conformance Test Group #4

5.4.1 DeviceManagement-v1.3-client-con-0401

Test Case Id: DeviceManagement-v1.3-client-con-0401
Test Object: Client device
Test Case Description: To check if the client uses the HMAC scheme.
Specification Reference: [DMSEC] Chapter 5.4
SCR Reference:
  DM-SEC-C-010  Integrity checking using HMAC-MD5
  DM-SEC-C-011  Inserting HMAC in transport
  DM-SEC-C-012  Using HMAC for all subsequent messages
Test Tool: DM 1.3 conformance test tool
Preconditions: The client should support HMAC and use an insecure transport. The client is not involved in a session with the server.

Test Procedure:
1. The test procedure starts with a DM Session Initialisation Macro (see E.1) with a 407 status code and a Chal of type auth-MAC to the client in the SyncBody of the Setup-Response (step 3 of the macro), as well as a NextNonce tag set to the b64 of ixit_nextnonce.
2. The client responds to the challenge by sending the requested credentials in the transport header.
3. The test tool checks the response, compares the credentials to those saved on the server, and sends a 212 (Authenticated) or 401 (Unauthorized) response to the client, together with a Get command on a test node (e.g. ./DevInfo).
4. The client responds to the Get with a valid package #3 message (see C.4), with the credentials in the transport header as before.
5. The test tool checks the response and sends a valid package #4 message (see C.5) to close the session.

Pass-Criteria: The client MUST send a valid HMAC.

Step 2: The client MUST send a valid package #3 (see C.4) as follows:
a. The transport header must contain the following:
   x-syncml-hmac: algorithm=MD5, username="", mac=
   where:
   1. algorithm is set to MD5
   2. username is the client's username (ixit_username)
   3. mac is the digest computed as defined in the spec; the credentials sent by the client must be valid according to ixit_username, ixit_userpass and ixit_nextnonce.

Step 4: The client MUST send a valid package #3 (see C.4) as follows:
a. The transport header must contain the following:
   x-syncml-hmac: algorithm=MD5, username="", mac=
   where:
   1. algorithm is set to MD5
   2. username is the client's username (ixit_username)
   3. mac is the digest computed as defined in the spec; the credentials sent by the client must be valid according to ixit_username, ixit_userpass and ixit_nextnonce.
b. The message's SyncBody SHALL contain at least a Status tag with:
   a. a CmdRef tag set to the value of the CmdID which the Get used
   b. a Data tag set to 200
c. The message's SyncBody SHALL contain a Results tag with a Data tag containing the data stored in the node.

MESSAGE SEQUENCE

Step 1: The test procedure starts with a DM Session Initialisation Macro (see E.1) with a 407 status code and a Chal of type auth-MAC in the SyncBody of the Setup-Response (step 3 of the macro), as well as a NextNonce tag set to the b64 of ixit_nextnonce.
Step 2 (Client Response with HMAC Auth Header): The client responds to the challenge by sending the requested credentials in the transport header.
Step 4 (Server Management Operations Message): The test tool checks the response, compares the credentials to those saved on the server, and sends a 212 (Authenticated) or 401 (Unauthorized) response to the client.
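The mac value in the x-syncml-hmac transport header is, per [DMSEC] Chapter 5.4, an MD5 digest chained over the b64'd credential hash, the nonce, and the b64'd hash of the message body. A minimal sketch under that reading; function names and the header formatting are illustrative:

```python
import base64
import hashlib

def _b64md5(data: bytes) -> bytes:
    return base64.b64encode(hashlib.md5(data).digest())

def syncml_hmac_header(username: str, password: str,
                       nonce: bytes, message: bytes) -> str:
    """x-syncml-hmac header with
    mac = B64(MD5(B64(MD5(user:pass)):nonce:B64(MD5(message))))."""
    cred = _b64md5(f"{username}:{password}".encode())
    mac = _b64md5(cred + b":" + nonce + b":" + _b64md5(message)).decode()
    return f'x-syncml-hmac: algorithm=MD5, username="{username}", mac={mac}'
```

Because the mac covers the whole message body, the test tool must recompute it over the exact bytes received before comparing against the stored ixit_username, ixit_userpass and ixit_nextnonce values.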

5.5 Device Management Client Conformance Test Group #5

5.5.1 DeviceManagement-v1.3-client-con-0501

Test Case Id: DeviceManagement-v1.3-client-con-0501
Test Object: Client device
Test Case Description: To check if an interior node can be Added to a client.
Specification Reference: [DMREPU] Chapter 6.6.1
SCR Reference:
  DMREPPRO-PCE-C-003  Support for receiving Add
Test Tool: DM 1.3 conformance test tool
Preconditions: The node MUST not exist on the device.

Test Procedure:
1. The test procedure starts with a DM Session Initialisation Macro (see E.1) with an Add command for an interior test node in the Setup-Response (step 3 of the macro).
2. If required by the client: DM Authentication Macro.
3. The client responds to the Add with a valid package #3 message (see C.4).
4. The test tool checks the response; if it is a 200 (OK) response it sends a Get command on the interior node just created. If the response is 405 (Command Not Allowed) it sends a valid package #4 message (see C.5) to close the session.
5. The client responds to the Get command with a valid package #3 message (see C.4).
6. The test tool checks the response and sends a valid package #4 message (see C.5) to close the session.

Pass-Criteria: The client MUST return either a 200 or a 405 status code. If the status code is 200, the new interior node MUST exist.

Step 3: The client MUST send a valid package #3 (see C.4) as follows:
1. The message's SyncBody SHALL contain at least a Status tag with:
   a. a CmdRef tag set to the value of the CmdID which the Add used
   b. a Data tag set to 200 or 405

Step 5: The client MUST send a valid package #3 (see C.4) as follows:
1. The message's SyncBody SHALL contain at least a Status tag with:
   a. a CmdRef tag set to the value of the CmdID which the Get used
   b. a Data tag set to 200

MESSAGE SEQUENCE

Step 1: The test procedure starts with a DM Session Initialisation Macro (see E.1) with an Add command for an interior test node in the Setup-Response (step 3 of the macro).
Step 2: If required by the client: DM Authentication Macro.
Step 3 (Client Response): The client responds to the Add with a valid package #3 message (see C.4).
Step 4 (Server Management Operations Message): The test tool checks the response; if it is a 200 (OK) response it sends a Get command on the interior node just created. If the response is 405 (Command Not Allowed) it sends a valid package #4 message (see C.5) to close the session.
Step 5 (Client Response): The client responds to the Get command with a valid package #3 message (see C.4).
Step 6 (Server Response): The test tool checks the response and sends a valid package #4 message (see C.5) to close the session.

5.5.2 DeviceManagement-v1.3-client-con-0502

Test Case Id: DeviceManagement-v1.3-client-con-0502
Test Object: Client device
Test Case Description: To check if a leaf node can be Added to a client.
Specification Reference: [DMREPU] Chapter 6.6.1
SCR Reference:
  DMREPPRO-PCE-C-003  Support for receiving Add
Test Tool: DM 1.3 conformance test tool
Preconditions: The client is not involved in a session with the server. The test tool has the required ACL rights and AccessType privileges to add interior and leaf nodes on the client.

Test Procedure:
1. The test procedure starts with a DM Session Initialisation Macro (see E.1) with an Add command on a test interior node (e.g. the one created in 501) in the Setup-Response (step 3 of the macro).
2. If required by the client: DM Authentication Macro.
3. The client responds to the Add with a valid package #3 message (see C.4).
4. The test tool checks the response and responds with an Add command on a leaf node under the interior node on which the previous Add was carried out.
5. The client responds to the Add with a valid package #3 message (see C.4).
6. The test tool checks the response and responds with a Get command on the newly added leaf node.
7. The client responds to the Get with a valid package #3 message (see C.4).
8. The test tool checks the response and sends a valid package #4 message (see C.5) to close the session.

Pass-Criteria: The client MUST return a 200 status code, and the new leaf node MUST exist.

Step 3: The client MUST send a valid package #3 (see C.4) as follows:
1. The message's SyncBody SHALL contain at least a Status tag with:
   a. a CmdRef tag set to the value of the CmdID which the Add used
   b. a Data tag set to 200 or 418

Step 5: The client MUST send a valid package #3 (see C.4) as follows:
1. The message's SyncBody SHALL contain at least a Status tag with:
   a. a CmdRef tag set to the value of the CmdID which the Add used
   b. a Data tag set to 200

Step 7: The client MUST send a valid package #3 (see C.4) as follows:
1. The message's SyncBody SHALL contain at least a Status tag with:
   a. a CmdRef tag set to the value of the CmdID which the Get used
   b. a Data tag set to 200
2. The message's SyncBody SHALL contain a Results tag with a Data tag containing the content of the node. This Data should equal the Data that was added using the Add command on the leaf node.

MESSAGE SEQUENCE

Step 1: The test procedure starts with a DM Session Initialisation Macro (see E.1) with an Add command on a test interior node (e.g. the one created in 501) in the Setup-Response (step 3 of the macro).
Step 2: If required by the client: DM Authentication Macro.
Step 3 (Client Response): The client responds to the Add with a valid package #3 message (see C.4).
Step 4 (Server Management Operations Message): The test tool checks the response and responds with an Add command on a leaf node under the interior node on which the previous Add was carried out.

Step 5 (Client Response): The client responds to the Add with a valid package #3 message (see C.4).
Step 6 (Server Management Operations Message): The test tool checks the response and responds with a Get command on the newly added leaf node.
Step 7 (Client Response): The client responds to the Get with a valid package #3 message (see C.4).
Step 8 (Server Response): The test tool checks the response and sends a valid package #4 message (see C.5) to close the session.

5.5.3 DeviceManagement-v1.3-client-con-0503

Test Case Id: DeviceManagement-v1.3-client-con-0503
Test Object: Client device
Test Case Description: To check if the client returns a status code of 418 (Already Exists) for an Add on an existing leaf node.
Specification Reference: [DMREPU] Chapter 6.6.1
SCR Reference:
  DMREPPRO-PCE-C-003  Support for receiving Add
Test Tool: DM 1.3 conformance test tool
Preconditions: Test case 502 should have passed with a 200 status code. The client is not involved in a session with the server.

Test Procedure:
1. The test procedure starts with a DM Session Initialisation Macro (see E.1) with an Add command on the leaf node created by test case 502 in the Setup-Response (step 3 of the macro).
2. If required by the client: DM Authentication Macro.
3. The client responds to the Add with a valid package #3 message (see C.4).
4. The test tool checks the response and sends a valid package #4 message (see C.5) to close the session.

Pass-Criteria: The client MUST return a 418 status code.

Step 3: The client MUST send a valid package #3 (see C.4) as follows:
1. The message's SyncBody SHALL contain at least a Status tag with:
   a. a CmdRef tag set to the value of the CmdID which the Add used
   b. a Data tag set to 418

MESSAGE SEQUENCE

Step 1: The test procedure starts with a DM Session Initialisation Macro (see E.1) with an Add command on the leaf node created by test case 502 in the Setup-Response (step 3 of the macro).
Step 2: If required by the client: DM Authentication Macro.
Step 3 (Client Response): The client responds to the Add with a valid package #3 message (see C.4).
Step 4 (Server Response): The test tool checks the response and sends a valid package #4 message (see C.5) to close the session.

5.6 Device Management Client Conformance Test Group #6

5.6.1 DeviceManagement-v1.3-client-con-0601

Test Case Id: DeviceManagement-v1.3-client-con-0601
Test Object: Client device
Test Case Description: To check if the client handles a Replace.
Specification Reference: [DMREPU] Chapter 6.6.11
SCR Reference:
  DMREPPRO-PCE-C-002  Support for Replace
Test Tool: DM 1.3 conformance test tool
Preconditions: The client must not be involved in a session with the server. The client must allow adding of nodes.

Test Procedure:
1. The test procedure starts with a DM Session Initialisation Macro (see E.1) with a Replace command for a test leaf node in the Setup-Response (step 3 of the macro) to change the data to some value x.
2. If required by the client: DM Authentication Macro.
3. The client responds to the Replace with a valid package #3 message (see C.4).
4. The test tool checks the response: if it is a 200 (OK) response, go to step 6. If the response is 425 (Permission Denied), go to step 4a. Otherwise, if the response code is 404, go to step 5.
4a. The test tool sends a Replace command on the ACL rights of the test leaf node so that Replace is allowed (e.g. Replace=*).
4b. The client responds to the Replace with a valid package #3 message (see C.4).
4c. The test tool checks the response and sends a Replace command on the Data of the leaf node to change the data to some value x.

4d. The client responds to the Replace with a valid package #3 message (see C.4). Go to step 6.
5. The test tool sends an Add command to the client with the path of an interior test node.
5a. The client responds to the Add with a valid package #3 message (see C.4).
5b. The test tool checks the response and sends an Add command on a leaf node under the interior node created in step 5.
5c. The client responds to the Add with a valid package #3 message (see C.4).
5d. The test tool checks the response and sends a Replace command on the ACL rights of the newly created leaf node such that Replace is enabled (e.g. Replace=*).
5e. The client responds to the Replace with a valid package #3 message (see C.4).
5f. The test tool checks the response and sends a Replace command on the Data of the leaf node to change the data to some value x.
5g. The client responds to the Replace with a valid package #3 message (see C.4).
6. The test tool checks the response and sends a Get command on the leaf node whose data has been replaced.
7. The client responds to the Get with a valid package #3 message (see C.4).
8. The test tool checks the response and compares the value of the node to the value x. It sends a valid package #4 message (see C.5) to close the session.

Pass-Criteria: The client MUST return a 200 status code.

Step 3: The client MUST send a valid package #3 (see C.4) as follows:
1. The message's SyncBody SHALL contain at least a Status tag with:
   a. a CmdRef tag set to the value of the CmdID which the Replace used
   b. a Data tag set to 200, 404 or 425

Step 4b/5e: The client MUST send a valid package #3 (see C.4) as follows:
1. The message's SyncBody SHALL contain at least a Status tag with:
   a. a CmdRef tag set to the value of the CmdID which the Replace (with ?prop=ACL at the end) used
   b. a Data tag set to 200

Step 5a/5c: The client MUST send a valid package #3 (see C.4) as follows:
1. The message's SyncBody SHALL contain at least a Status tag with:
   a. a CmdRef tag set to the value of the CmdID which the Add used
   b. a Data tag set to 200

Step 4d/5g: The client MUST send a valid package #3 (see C.4) as follows:
1. The message's SyncBody SHALL contain at least a Status tag with:
   a. a CmdRef tag set to the value of the CmdID which the Replace used
   b. a Data tag set to 200

Step 7: The client MUST send a valid package #3 (see C.4) as follows:
1. The message's SyncBody SHALL contain at least a Status tag with:
   a. a CmdRef tag set to the value of the CmdID which the Get used
   b. a Data tag set to 200
2. The message's SyncBody SHALL contain a Results tag with a Data tag containing the Data stored inside the node, which must be equal to x.

MESSAGE SEQUENCE

Step 1: The test procedure starts with a DM Session Initialisation Macro (see E.1) with a Replace command for a test leaf node in the Setup-Response (step 3 of the macro) to change the data to some value x.
Step 2: If required by the client: DM Authentication Macro.
Step 3 (Client Response): The client responds to the Replace with a valid package #3 message (see C.4).
Step 4 (Server Checks Data received): The test tool checks the response: if it is a 200 (OK) response, go to step 6. If the response is 425 (Permission Denied), go to step 4a. Otherwise, if the response code is 404, go to step 5.
Step 4a (Server Management Operations Message): The test tool sends a Replace command on the ACL rights of the test leaf node so that Replace is allowed (e.g. Replace=*).
Step 4b (Client Response): The client responds to the Replace with a valid package #3 message (see C.4).
Step 4c (Server Management Operations Message): The test tool checks the response and sends a Replace command on the Data of the leaf node to change the data to some value x.
Step 4d (Client Response): The client responds to the Replace with a valid package #3 message (see C.4). Go to step 6.
Step 5 (Server Management Operations Message): The test tool sends an Add command to the client with the path of an interior test node.
Step 5a (Client Response): The client responds to the Add with a valid package #3 message (see C.4).
Step 5b (Server Management Operations Message): The test tool checks the response and sends an Add command on a leaf node under the interior node created in step 5.
Step 5c (Client Response): The client responds to the Add with a valid package #3 message (see C.4).
Step 5d (Server Management Operations Message): The test tool checks the response and sends a Replace command on the ACL rights of the newly created leaf node such that Replace is enabled (e.g. Replace=*).
Step 5e (Client Response): The client responds to the Replace with a valid package #3 message (see C.4).
Step 5f (Server Management Operations Message): The test tool checks the response and sends a Replace command on the Data of the leaf node to change the data to some value x.
Step 5g (Client Response): The client responds to the Replace with a valid package #3 message (see C.4).
Step 6 (Server Management Operations Message): The test tool checks the response and sends a Get command on the leaf node whose data has been replaced.
Step 7 (Client Response): The client responds to the Get with a valid package #3 message (see C.4).
Step 8 (Server Response): The test tool checks the response and compares the value of the node to the value x. It sends a valid package #4 message (see C.5) to close the session.
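The branching in step 4 of test 0601 can be summarised as a dispatch on the Status code the client returns for the initial Replace (an illustrative sketch only, not part of the test tool):

```python
def next_step_0601(status: str) -> str:
    """Map the client's Status code for the initial Replace (step 4 of test
    0601) to the next step of the test procedure."""
    if status == "200":
        return "step 6"   # data replaced; verify with a Get
    if status == "425":
        return "step 4a"  # Permission Denied: fix the ACL, then Replace again
    if status == "404":
        return "step 5"   # node missing: Add interior + leaf, fix ACL, Replace
    return "fail"         # any other code fails the test case
```

Both the 425 and 404 branches rejoin at step 6, so the final Get comparison against the value x is performed on every passing path.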