Comparison of Multi Criteria Decision Making Algorithms for Ranking Cloud Renderfarm Services

Indian Journal of Science and Technology, Vol 9(31), DOI: 10.17485/ijst/2016/v9i31/93467, August 2016. ISSN (Print): 0974-6846; ISSN (Online): 0974-5645

J. Ruby Annette 1*, Aisha Banu 2 and P. Subash Chandran 3
1 IT Department, B.S. Abdur Rahman University, Chennai - 600048, Tamil Nadu, India; Rubysubash2010@gmail.com
2 CSE Department, B.S. Abdur Rahman University, Chennai - 600048, Tamil Nadu, India; aisha@bsauniv.ac.in
3 Niteo Technologies, Chennai - 600017, Tamil Nadu, India; subash24@gmail.com
* Author for correspondence

Abstract

Cloud services that provide a complete environment for animators to render their files using resources in the cloud are called cloud renderfarm services. The objective of this work is to rank and compare the performance of these services using two popular Multi Criteria Decision Making (MCDM) algorithms, namely the Analytic Hierarchy Process (AHP) and Simple Additive Weighting (SAW) methods. The performance of three real-time cloud renderfarm services is ranked and compared based on five Quality of Service (QoS) attributes that are important to these services, namely Render Node Cost, File Upload Time, Availability, Elasticity and Service Response Time. The services are ranked in four different simulations by varying the weights assigned to each QoS attribute, and the rankings obtained are compared. The results show that AHP and SAW assigned similar ranks to all three cloud renderfarm services in all simulations.

Keywords: AHP, Cloud Renderfarm Services, Comparison of MCDM Algorithms, Multi Criteria Decision Making, Ranking Cloud Services, SAW

1. Introduction

Rendering is an unavoidable, time-consuming process in the animation industry. It is usually carried out on a cluster of computers in the animation studio called an offline renderfarm. An offline renderfarm is a collection of render nodes.
A render node is an individual computer in the cluster; the animation scene file is split into individual frames, which are rendered across the nodes in a distributed manner at the same time. To reduce rendering time, newer technologies like grid computing 1-5 and cloud computing 6-10 have also been explored and found to be fruitful. Among cloud services, the PaaS type of cloud renderfarm service model is gaining popularity. Its advantage is that animators need no prior technical knowledge of the cloud environment and pay only the render node cost for every hour of rendering. To render files in cloud-based renderfarms, animators upload the animation files to be rendered to the service provider's server. The service provider uses software called the Rendering Job Manager (RJM), which performs the task of a queue manager and assigns virtual machines in the cloud, based on the scheduling policy, to complete the rendering tasks. The RJM scales the number of virtual machines up or down instantly in order to meet the deadline. In this work, we compare the performance of three cloud renderfarm services using two Multi Criteria Decision Making (MCDM) algorithms, namely the Analytic Hierarchy Process (AHP) 11,12 and Simple Additive Weighting (SAW) 13, and draw useful insights. There are many works focusing on cloud services 14-23 and the ranking of Infrastructure-as-a-Service (IaaS)

cloud services 24, but according to our literature survey very few works address PaaS cloud renderfarm services 25. This work bridges this gap and contributes in the following ways: a) identifies QoS attributes specific to cloud renderfarm services and ranks the services using AHP and SAW (Section 2); b) compares the performance of the two MCDM algorithms based on four simulation results (Section 3); c) analyses the results obtained and provides useful insights (Section 4); and finally concludes the work with scope for further work (Section 5).

2. Ranking Services Using AHP and SAW

The AHP method of ranking helps the user specify his overall goal in the form of a hierarchical diagram so that the decision alternatives can be compared efficiently. The method first decomposes the renderfarm selection problem into levels: the overall goal to be achieved, the important criteria and sub-criteria to be considered, and the decision alternatives. The overall goal in this work is to rank and select the best cloud renderfarm service that satisfies the user's Quality of Service (QoS) requirements. The five QoS attributes were selected based on the Service Measurement Index (SMI) metrics suggested by the Cloud Service Measurement Index Consortium (CSMIC). The five criteria selected for ranking the cloud renderfarm services are Render Node Cost, File Upload Time, Availability, Elasticity and Service Response Time (SRT). A hierarchy diagram of QoS attributes for cloud renderfarm services is designed as given in Figure 1. Next, a pairwise comparison and prioritization of the attributes is performed from the lowest level to the top level by estimating the relative importance of the attributes considered. The relative importance of an attribute is expressed by assigning a relative weight to each attribute within each level, such that the sum of the relative weights of all attributes in each level equals one.
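The pairwise comparison and weighting steps described above can be sketched in a few lines. This is a minimal illustration, not the authors' implementation; the 3x3 comparison matrix (comparing three criteria on Saaty's 1-9 scale) is a made-up example, and power iteration is one standard way to approximate the principal eigenvector that AHP uses as the weight vector.

```python
# Derive priority weights from a pairwise comparison matrix by power
# iteration, which converges to the principal eigenvector (the matrix
# values are hypothetical, not taken from the paper).
def ahp_weights(matrix, iterations=100):
    n = len(matrix)
    w = [1.0 / n] * n  # initial uniform guess
    for _ in range(iterations):
        # multiply the matrix by the current weight vector
        w_new = [sum(matrix[i][j] * w[j] for j in range(n)) for i in range(n)]
        total = sum(w_new)
        w = [x / total for x in w_new]  # normalize so weights sum to one
    return w

# Hypothetical judgments: criterion 1 is 3x as important as criterion 2
# and 5x as important as criterion 3 (Saaty's 1-9 scale, reciprocal matrix).
comparisons = [
    [1.0,     3.0,     5.0],
    [1 / 3.0, 1.0,     3.0],
    [1 / 5.0, 1 / 3.0, 1.0],
]
weights = ahp_weights(comparisons)
```

For this classic example matrix the weights converge to roughly (0.64, 0.26, 0.10), and by construction they sum to one, matching the normalization rule stated above.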
The next step is to compute the relative ranking of all the sub-level attributes, for which a Relative Ranking Matrix of size NR x NR is formed, where NR is the total number of services to be compared. From this matrix, the relative ranking of all the services for one sub-level attribute, given by the principal eigenvector (Eigen value), is estimated using the procedure given in 11 and 12. Finally, the relative ranking vectors are aggregated for each sub-attribute at each level and the final relative ranking vector is computed. The final relative ranking vector is sorted, yielding the list of cloud renderfarm services ordered by rank. However, the five QoS attributes considered in this work do not have any sub-levels.

2.1 SAW Method of Ranking

The Simple Additive Weighting (SAW) method, also known as the Weighted Sum Model (WSM), the weighted linear combination method or the scoring method (SM), calculates the overall score of a cloud renderfarm service as the weighted sum of all its attribute values. SAW computes an evaluation score for every alternative by multiplying the relative importance weight assigned by the user with the normalized value of each criterion for that alternative, and summing the products. The alternative service with the highest score is selected as the best cloud renderfarm service. The formula to calculate the overall score S_i of the i-th of N alternative services over M QoS attributes is given in Equation 1:

S_i = sum over j of (w_j * r_ij), for i = 1, 2, ..., N (Equation 1)

where r_ij is the normalized rating of the i-th alternative on the j-th criterion and w_j is the weight of the j-th criterion. For benefit criteria, r_ij is calculated as in Equation 2:

r_ij = x_ij / max_i(x_ij) (Equation 2)

Similarly, for cost criteria, r_ij is given by Equation 3:
r_ij = (1/x_ij) / max_i(1/x_ij) (Equation 3)

where x_ij is the original value of the j-th criterion of the i-th alternative.

Figure 1. Hierarchy of QoS Attributes for Cloud Renderfarm Services.
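Equations 1-3 can be exercised with a small sketch. The QoS values below are invented for illustration, not the paper's (undisclosed) measurements; cost-type criteria are normalized with Equation 3 and benefit-type criteria with Equation 2 before the weighted sum of Equation 1 is taken.

```python
# SAW ranking per Equations 1-3: normalize each criterion column, then
# compute the weighted sum S_i for each alternative (all QoS values below
# are hypothetical, not the paper's measured data).
def saw_scores(x, weights, is_benefit):
    n, m = len(x), len(weights)
    scores = []
    for i in range(n):
        s = 0.0
        for j in range(m):
            col = [x[k][j] for k in range(n)]
            if is_benefit[j]:
                r = x[i][j] / max(col)                            # Equation 2
            else:
                r = (1.0 / x[i][j]) / max(1.0 / v for v in col)   # Equation 3
            s += weights[j] * r                                   # Equation 1
        scores.append(s)
    return scores

# Columns: render node cost ($/h, cost type), upload time (min, cost type),
# availability (%, benefit type); rows: three hypothetical services.
x = [
    [0.8, 30.0, 99.9],
    [0.5, 45.0, 99.0],
    [1.2, 20.0, 99.5],
]
scores = saw_scores(x, weights=[0.5, 0.3, 0.2], is_benefit=[False, False, True])
best = max(range(len(scores)), key=scores.__getitem__)
```

With these illustrative numbers the second service wins: its low cost dominates under a cost weight of 0.5, exactly the behavior the weight variation in Section 3 is designed to probe.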

3. Performance Comparison of Ranking Algorithms

To compare the performance of the two algorithms, four simulations with four different sets of weights for the QoS attributes were set up, as given in Table 1. The values of the five selected QoS attributes were collected for three real-time cloud renderfarm services. The details of the service providers are not disclosed, to avoid any hindrance that might be caused to them. In Simulation 1, two QoS criteria, namely Render Node Cost and File Upload Time, are given more preference than the other criteria, with the resulting ranking in Table 2. In Simulation 2, the Service Response Time (SRT) attribute is given more emphasis than any other QoS attribute, with the resulting ranking in Table 3. For Table 4, it is worth noting that the criteria emphasized in Simulations 1 and 3 are similar, as the same two criteria, Render Node Cost and File Upload Time, are given more preference than the others. In Simulation 4, the File Upload Time is given more importance than the other QoS attributes, with the resulting ranking in Table 5.

Table 1. Weights Assigned to QoS Attributes for each Simulation

Sim / QoS   Render Node Cost   File Upload Time   Avail     Elast     SRT       CR
Sim1        0.47821            0.35242            0.04562   0.05432   0.06943   0.0000
Sim2        0.24562            0.16293            0.03241   0.02452   0.53452   0.049
Sim3        0.40251            0.30321            0.02254   0.02548   0.24626   0.049
Sim4        0.03214            0.86782            0.01235   0.01253   0.07516   0.048

Abbreviations: Sim - Simulation, Avail - Availability, Elast - Elasticity, SRT - Service Response Time, CR - Consistency Ratio

4. Results and Discussion

The performance of the three real-time cloud renderfarm services is ranked and compared based on the five Quality of Service (QoS) attributes important to these services, namely Render Node Cost, File Upload Time, Availability, Elasticity and Service Response Time.
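The CR column of Table 1 is Saaty's consistency ratio, which checks that the pairwise judgments behind each weight set are not self-contradictory: CR = CI / RI with CI = (lambda_max - n) / (n - 1), and CR <= 0.1 is conventionally accepted. The sketch below illustrates the computation; the comparison matrix and weights are a hypothetical example, not the judgments behind Table 1, while the random-index (RI) values are Saaty's standard ones.

```python
# Consistency Ratio (CR) check for an AHP pairwise comparison matrix:
# CR = CI / RI, where CI = (lambda_max - n) / (n - 1). Matrices with
# CR <= 0.1 (like every row of Table 1) are conventionally accepted.
RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}  # Saaty's random indices

def consistency_ratio(matrix, weights):
    n = len(matrix)
    # lambda_max estimated as the mean of (A w)_i / w_i over all rows
    aw = [sum(matrix[i][j] * weights[j] for j in range(n)) for i in range(n)]
    lambda_max = sum(aw[i] / weights[i] for i in range(n)) / n
    ci = (lambda_max - n) / (n - 1)
    return ci / RI[n] if RI[n] else 0.0

# Hypothetical 3x3 comparison matrix with its approximate priority weights.
matrix = [
    [1.0,     3.0,     5.0],
    [1 / 3.0, 1.0,     3.0],
    [1 / 5.0, 1 / 3.0, 1.0],
]
cr = consistency_ratio(matrix, [0.637, 0.258, 0.105])
```

For this example the ratio comes out near 0.03, comfortably under the 0.1 acceptance threshold, just like the CR values reported in Table 1.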
Table 2. Ranking based on Simulation 1

        Service 1   Service 2   Service 3
AHP     0.4532      0.2567      0.8213
SAW     0.4752      0.2412      0.7958

Table 3. Ranking based on Simulation 2

        Service 1   Service 2   Service 3
AHP     0.8562      0.2354      0.3959
SAW     0.8469      0.2310      0.4065

Table 4. Ranking based on Simulation 3

        Service 1   Service 2   Service 3
AHP     0.4852      0.8356      0.2210
SAW     0.4789      0.8267      0.2198

Table 5. Ranking based on Simulation 4

        Service 1   Service 2   Service 3
AHP     0.3590      0.2132      0.7956
SAW     0.3502      0.2310      0.7902

The performance of these cloud renderfarm services is ranked in four simulations by varying the weights

assigned to each QoS attribute, and the rankings obtained are compared. The AHP-based ranks of the renderfarms are given in Figure 2 and the SAW-based ranks in Figure 3. It can be clearly seen that in Simulations 1 and 3, both algorithms select the same service as the best alternative for the criteria of low Render Node Cost and low File Upload Time. In Simulation 2, the lowest Service Response Time criterion dominates, and both AHP and SAW rank the same service best. Since in Simulation 4 the File Upload Time is taken as the prime criterion, the renderfarm service with the lowest file upload time is selected as the best alternative by both AHP and SAW. From the above discussion, it is evident that AHP and SAW produce similar rankings for each simulation criterion discussed, and the ranking values are also very close to each other.

5. Conclusion and Future Work

This work compared and analyzed the ranking of cloud renderfarm services using two Multi Criteria Decision Making (MCDM) methods. It was observed that the ranking values differ based on the weights assigned to each QoS attribute, and the best cloud renderfarm service differs based on the QoS criteria. However, both AHP and SAW produced similar rankings for each simulation criterion discussed, and the ranking values are very close to each other. Thus the SAW method is a good alternative to AHP and could be preferred when no hierarchy of attributes exists, as in the case discussed in this work. However, if there are many hierarchy levels with sub-attributes, AHP qualifies as the better method for computing the aggregated rank value. In future, the QoS attributes most relevant to cloud-based rendering will be identified, the ranking will be simulated with different criteria using various other MCDM algorithms, and the results will be compared to assess the efficiency and significance of the MCDM methods.
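The notion of "similar ranking" used above can be made concrete: two MCDM methods assign similar ranks when their score vectors induce the same ordering of the services. A small sketch, with made-up score vectors rather than the paper's table values:

```python
# Two MCDM methods agree on ranks when their score vectors induce the
# same ordering of services (scores below are illustrative only).
def ranking(scores):
    # indices of services sorted from best (highest score) to worst
    return sorted(range(len(scores)), key=lambda i: -scores[i])

ahp_like = [0.45, 0.26, 0.82]  # hypothetical AHP scores for 3 services
saw_like = [0.48, 0.24, 0.80]  # hypothetical SAW scores for 3 services
same_order = ranking(ahp_like) == ranking(saw_like)  # True for these vectors
```

Here both vectors order the services 3, 1, 2 even though the raw scores differ slightly, which is the pattern the four simulations exhibit.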
Figure 2. AHP Method Based Ranks for Renderfarms.
Figure 3. SAW Method Based Ranks for Renderfarms.

6. References

1. Glez-Morcillo C et al. A new approach to grid computing for distributed rendering. 2011 International Conference on P2P, Parallel, Grid, Cloud and Internet Computing (3PGCIC), IEEE. 2011.
2. Patoli MZ et al. An open source grid based render farm for Blender 3D. Power Systems Conference and Exposition (PSCE '09), IEEE/PES. 2009.
3. Patoli Z et al. How to build an open source render farm based on desktop grid computing. Wireless Networks, Information Processing and Systems. Springer Berlin Heidelberg. 2009; 268-78.
4. Gkion M et al. Collaborative 3D digital content creation exploiting a Grid network. International Conference on Information and Communication Technologies (ICICT '09), IEEE. 2009.
5. Chong AS, Levinski K. Grid-based computer animation rendering. GRAPHITE '06: Proceedings of the 4th International Conference on Computer Graphics and Interactive Techniques in Australasia and Southeast Asia, New York, NY, USA, ACM. 2006.
6. Baharon MR et al. Secure rendering process in cloud computing. 2013 Eleventh Annual International Conference on Privacy, Security and Trust (PST), IEEE. 2013.
7. Kennedy J, Healy P. A method of provisioning a cloud-based renderfarm. Patent EP2538328 A1.

8. Cho K et al. RenderVerse: hybrid render farm for cluster and cloud environments. 2014 7th Conference on Control and Automation (CA), IEEE. 2014.
9. Carroll MD, Hadzic I, Katsak WA. 3D rendering in the cloud. Bell Labs Technical Journal. 2012; 17(2):55-66.
10. Srinivasa G et al. Runtime prediction framework for CPU intensive applications. U.S. Patent No. 7,168,074. 2007.
11. Tran VX, Tsuji H, Masuda R. A new QoS ontology and its QoS-based ranking algorithm for Web services. Simulation Modelling Practice and Theory. 2009; 17(8):1378-98.
12. Saaty TL. How to make a decision: the analytic hierarchy process. European Journal of Operational Research. 1990; 48(1):9-26.
13. Stevens-Navarro E, Wong VW. Comparison between vertical handoff decision algorithms for heterogeneous wireless networks. IEEE 63rd Vehicular Technology Conference (VTC 2006-Spring). 2006; 2:947-51.
14. Kalpana V, Meena V. Study on data storage correctness methods in mobile cloud computing. Indian Journal of Science and Technology. 2015 Mar; 8(6). DOI: 10.17485/ijst/2015/v8i6/70094.
15. Durairaj M, Manimaran A. A study on security issues in cloud based e-learning. Indian Journal of Science and Technology. 2015 Apr; 8(8). DOI: 10.17485/ijst/2015/v8i8/69307.
16. Shyamala K, Sunitha Rani T. An analysis on efficient resource allocation mechanisms in cloud computing. Indian Journal of Science and Technology. 2015 May; 8(9). DOI: 10.17485/ijst/2015/v8i9/50180.
17. Rajasekaran A, Kumar A. User preference based environment provisioning in cloud. Indian Journal of Science and Technology. 2015 Jun; 8(11). DOI: 10.17485/ijst/2015/v8i11/71781.
18. John SM, Mohamed M. Novel backfilling technique with deadlock avoidance and migration for grid workflow scheduling. Indian Journal of Science and Technology. 2015 Jun; 8(12). DOI: 10.17485/ijst/2015/v8i12/60755.
19. Bagheri R, Jahanshahi M. Scheduling workflow applications on the heterogeneous cloud resources. Indian Journal of Science and Technology. 2015 Jun; 8(12). DOI: 10.17485/ijst/2015/v8i12/57984.
20. Uddin M, Memon J, Alsaqour R, Shah A, Abdul Rozan MZ. Mobile agent based multi-layer security framework for cloud data centers. Indian Journal of Science and Technology. 2015 Jun; 8(12). DOI: 10.17485/ijst/2015/v8i12/52923.
21. Meghana Ramya Shri J, Subramaniyaswamy V. An effective approach to rank reviews based on relevance by weighting method. Indian Journal of Science and Technology. 2015 Jun; 8(11). DOI: 10.17485/ijst/2015/v8i11/61768.
22. Moayeri M, Shahvarani A, Behzadi MH, Hosseinzadeh-Lotfi F. Comparison of fuzzy AHP and fuzzy TOPSIS methods for math teachers selection. Indian Journal of Science and Technology. 2015 Jul; 8(13). DOI: 10.17485/ijst/2015/v8i13/5410.
23. Venkatesan N, ArunmozhiArasan K, Muthukumaran S. An ID3 algorithm for performance of decision tree in predicting student's absenteeism in an academic year using categorical datasets. Indian Journal of Science and Technology. 2015 Jul; 8(14). DOI: 10.17485/ijst/2015/v8i14/72730.
24. Garg SK, Versteeg S, Buyya R. SMICloud: a framework for comparing and ranking cloud services. 2011 Fourth IEEE International Conference on Utility and Cloud Computing (UCC). 2011. p. 210-8.
25. Annette R, Aisha Banu W. A service broker model for cloud based render farm selection. International Journal of Computer Applications. 2014; 96(24):11-4.