An Effective Approach To Find a Best Cloud Service Provider Using Ranked Voting Method

Vidyashree C. N., M.Tech, Dept. of CSE, PESCE, Mandya, vidyashreec.n57@gmail.com
V. Chetan Kumar, Asst. Prof., Dept. of CSE, PESCE, Mandya, write2chetankumar@rediffmail.com

ABSTRACT

Cloud computing is a rapidly rising field for provisioning computation resources. Today, a large number of cloud service providers are available in the cloud market, each claiming to offer its services in the best way. This growing number of providers with similar functionality poses a selection problem for cloud users: choosing the best cloud service provider for a user's requirements on the basis of QoS metrics is difficult. This paper provides a solution based on a standard approach. A broker system is employed to help the user, SMI-based metrics are used to capture QoS, and the ranked voting method is used to rank the service providers and choose the best one.

Keywords: cloud computing; quality of service; Service Measurement Index.

I. INTRODUCTION

Cloud computing, which emerged as a new paradigm for utility computing, is growing very rapidly and gaining attention not only from large organizations but also from academic, government, and small and medium organizations. It offers broadly three types of services, Software as a Service (SaaS), Platform as a Service (PaaS), and Infrastructure as a Service (IaaS), and three deployment models: public cloud, private cloud, and hybrid cloud.

Customers need to choose the appropriate cloud provider to fulfil their requirements, which means spending considerable time analyzing all the available providers. Customers also care about criteria such as minimum response time, efficiency, accuracy, and security, and some have constraints such as budget and deadline. Choosing an efficient and appropriate cloud provider based on all these parameters is therefore time consuming. A broker acting as middleware between the customer and the cloud service provider can solve this to an extent: the broker gathers the requirements from the customer and helps by listing suitable cloud providers. The cloud broker thus has an important role in finding the best and most cost-efficient provider; the major advantages are cost and time savings and a good match with an appropriate provider. Plenty of providers are available in the market now, and as their number grows it becomes a hectic task for organizations to find the best cloud service provider. This leads many organizations to depend on a broker, which lets them spend their time using the cloud rather than searching for the best cloud.

QoS metrics play an important role in the selection of the best cloud provider. For this purpose, the Cloud Services Measurement Initiative Consortium (CSMIC) was formed in 2010 at Carnegie Mellon University. CSMIC is a group of globally established organizations whose professionals have developed a standard measurement framework called the Service Measurement Index (SMI) [2]. SMI includes seven major characteristics, each with three or more attributes. SMI clearly defines each attribute, which helps decision makers measure the QoS requirements of customers, compare them with the offerings of different cloud service providers, and choose the best cloud service provider. In this work, metrics have been designed that consider not only the metrics defined by SMI but also other metrics found during analysis of QoS parameters.
Here, metrics are categorized into two parts, application-dependent metrics and user-dependent metrics, which makes it possible to quantify each metric separately. To rank the cloud providers, this work uses the ranked voting method: each identified metric acts as a voter and compares its required value with the value of that metric offered by each provider, ranking the providers accordingly. Metric values can be of different types, such as numeric or Boolean, so this work also gives rules for metric comparison. Finally, the best provider is obtained as per the requirements specified by the user.

II. LITERATURE REVIEW

QoS parameters play an important role in ranking service providers. A customer may require a service that is efficient, cost effective, and most suitable for the application, and since many providers offer the same kind of services at different costs, it is difficult for a customer to choose. To avoid this time consumption, this paper proposes a ranking mechanism for cloud services based on QoS parameters: after obtaining evaluation information from monitoring or benchmark tools, this information can be used to rank the cloud services in proportion to the user's requirements. This section reviews ranking approaches for cloud computing services based on quality of service.

A. Service Mapper Approach

This approach uses the Singular Value Decomposition (SVD) technique, a statistical technique, for cloud service ranking. It introduces a service mapper, called the cloud service provider mapper, that gives the user knowledge about the priority of the services so that a service can be selected more knowingly. However, this approach does not consider all of the qualitative attributes of services, and service comparison is based on limited attributes. Moreover, value decomposition on large matrices is time consuming and sometimes infeasible, so the algorithm may not reach the correct answer. A further limitation is that the approach is applicable only when the service providers are few and the comparison measures are limited and clear.

B. SRS Approach

This is a system called the Service Ranking System (SRS), which considers two states, static and dynamic, for cloud service ranking. In static ranking, all available cloud service providers are ranked without considering user requirements; in dynamic ranking, the system tries to find and rank suitable services based on user requirements. The algorithm offers a good method for static ranking of services, but it is not efficient for dynamic ranking because service comparison is performed on just seven key attributes. Its most important disadvantage is that it forces the user to enter attribute weights whose sum must equal one; if the user violates this condition, correct results will not be achieved.

C. SLA Matching Approach

This work aims to define the process of identifying a compatible cloud provider for given requirements by matching SLA parameters. The approach relies mainly on a model, and analyzing the model and maintaining it across the different steps is difficult.

D. Aggregation Approach

This approach is based on the usage of benchmarks and also applies user feedback for service ranking. It gets two types of information, from the user and from the benchmark, and aggregates them for comparing and ranking services. The information received from the benchmark is known as the objective assessment and the other information as the subjective assessment; a fuzzy method is therefore the only choice that can transform this information into scientific, comparable data. Like the previous approaches, it compares and ranks services with limited qualitative measures.

E. CloudRank Approach

In this approach, ranking is performed based on prediction of qualitative values. The work notes that the qualitative values of services should be measured before service comparison, a task that entails high time complexity and cost. Also, direct invocation usually does not yield a correct answer because of the Internet's unpredictable connections.
The most important limitation of this approach is its dependency on training data: if wrong data exists in the database, the system will always answer based on that wrong data, so the approach can be considered to achieve only an approximately correct answer. It also does not consider all of the qualitative values when comparing services.

F. SMI Cloud

In this approach, a framework is proposed for service ranking based on the analytic hierarchy process (AHP). It applies the standards proposed by the CSMIC consortium for extracting qualitative values.

Table 1: Comparison of existing approaches for ranking in cloud environment (n is the number of cloud services, m is the number of users). In Table 1, all the reviewed approaches are summarized and their features presented.

III. SERVICE MEASUREMENT INDEX

The Cloud Services Measurement Initiative Consortium (CSMIC) has determined a set of attributes for measuring cloud computing services and adopted these attributes in the form of the Service Measurement Index (SMI). The attributes are applied for comparing different cloud services and are designed in line with International Organization for Standardization (ISO) standards; they provide a standard model for measuring and comparing business services. In fact, this segmentation represents a common view of qualitative attributes. Generally, seven primary attributes exist for service comparison, each with a series of sub-attributes, as shown in Figure 1. The attributes are defined as follows.

A. Accountability
This category contains attributes used to measure properties of the service provider organization; these properties may be independent of the service being provided.

B. Agility
Indicates the impact of a service on a client's ability to change direction, strategy, or tactics quickly and with minimal disruption.

C. Assurance
This category contains the key attributes that indicate how likely it is that the service will be available as specified.

D. Financial
Indicates the amount of money spent on the service by the client.

E. Performance
This category covers the features and functions of the provided services.

F. Security and Privacy
This category includes attributes that indicate the effectiveness of a service provider's controls on access to services, service data, and the physical facilities from which services are provided.

G. Usability
Indicates the ease with which a service can be used.

Figure 1: Categories and Attributes of the Service Measurement Index (SMI)

IV. QOS METRICS

SMI has been developed to compare all types of service providers: IaaS, PaaS, and SaaS [2]. Note, however, that some metrics are application dependent (e.g., virtualization technique, platforms supported), some metrics (e.g., stability, suitability) can be measured only through the experience of existing users, and some metrics (e.g., cost, certifications) depend on sub-attributes. Thus, it is a complex task to create a list of metrics that covers all the discussed aspects. To tackle these issues, this work divides the metrics into two categories, application dependent and user dependent: application-dependent metrics are defined on the basis of the application's requirements, and user-dependent metrics on the basis of the user's requirements.

A. Application Dependent Metrics

Reliability: With the increase in software and hardware components, more failures are likely to occur in the system, so one should understand failure behavior in the cloud environment in order to better utilize cloud resources. In this work, MTBF (mean time between failures) is used as the reliability measure, and it is assumed that the service provider reports the MTBF for each cloud service. MTBF applies to a service that is going to be repaired and returned to operation, and is defined as

MTBF = Total time / Number of failures    (1)

Security: Cloud providers take different types of security measures, and different applications may have different requirements for them. Four types of security measures are considered: (i) crypto algorithms and key management, (ii) physical security support, (iii) network security support, and (iv) data security support. Very few providers offer all of these measures.

Data centers: Performance is highly affected by speed-of-light latency and TCP latency (both directly correlated with the circuit distance between user and files), as well as by packet loss. Placing the files closer to the user minimizes both speed-of-light latency and TCP latency; packet loss is also minimized, because the probability of packet loss increases with distance. So the number of data centers, their inter-distance, and the area they cover should be criteria for ranking service providers.
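As a minimal sketch of the MTBF computation in equation (1), assuming the provider exposes its total operating time and failure count (the function name and units are illustrative, not from the paper):

def mtbf(total_time_hours: float, num_failures: int) -> float:
    # Mean time between failures, per equation (1); an observation
    # window with no failures gives no finite estimate.
    if num_failures == 0:
        return float("inf")
    return total_time_hours / num_failures

# Example: one year of operation (8760 hours) with 4 failures.
print(mtbf(8760, 4))  # 2190.0 hours between failures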

Cost: Cost is an important factor in ranking providers. Basically, cost is a function of the resources an application requires, such as CPU, memory, storage, and data transfer. Generally, two types of pricing plan are used: bundled and unbundled. In an unbundled plan, a price is charged for each characteristic separately, e.g., $45 per CPU or $10 per 1 GB RAM. In a bundled plan, providers offer predefined bundles, e.g., a 2.4 GHz vCPU, 1 GB RAM, and 80 GB HDD.

Operating Systems Support: Providers support different operating systems such as Mac OS X, Windows, and openSUSE Linux. One provider may support some OSs while another supports others; for example, Windows Azure supports only the Windows operating system, while GoGrid supports Windows Server 2003/2008, Red Hat Linux 5.1/5.4, etc. Different applications require different OS support, so an application can rank providers according to whether the required platform is provided.

Platforms Supported: As with operating systems, different providers support different types of platforms; for example, CloudSigma supports Java, PHP, WinDev, and .NET. An application may rank providers in the same way it ranks them for operating systems.

Customer Support Facility: The type of support, the response time for support, and the charge for customer support are the important factors defining this metric. New users generally prefer a provider with a good support system. Some providers offer customer support free of charge, but most charge for it. GoGrid provides free 24/7 phone support and free 24/7 premium support; Amazon AWS provides premium support (Urgent: 1 hour, High: 4 business hours, Normal: 1 business day, Low: 2 business days). To measure this metric, it is modeled as an unordered set that may contain elements such as free 24/7 phone support, urgent support, basic support, low support, diagnostic tools, etc.

Response Time: Response time is the difference between the time a service is requested and the time the service becomes available. It depends on the infrastructure as well as on the application for which the request was generated.

Capacity: Capacity is the maximum amount of resources that a service provider can provide at peak times. It can be quantified by four metrics: CPU capacity (measured as the number of CPUs multiplied by the CPU frequency), memory capacity (measured in GB), storage capacity (measured in GB), and network capacity (measured by bandwidth).

B. User Dependent Metrics

Some of the user-dependent metrics are as follows.

Reputation: Reputation measures the trustworthiness of a cloud provider and is based on users' experience with its services. Different users may have different opinions of the same provider, so a provider's reputation can be calculated as the average of the ranks assigned to it by different users.

Client Interface: The client interface is another important criterion a user may consider when selecting a provider. Observation of different providers shows that they support different kinds of client interfaces, such as web access, API, FTP access, website, and management console. A user can rank providers on the basis of the client interface.
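Returning to the capacity metric above, a minimal sketch of how the four capacity metrics might be recorded and the CPU capacity computed; the class and field names are illustrative assumptions, not from the paper:

from dataclasses import dataclass

@dataclass
class Capacity:
    num_cpus: int          # number of CPUs
    cpu_freq_ghz: float    # frequency of each CPU, in GHz
    memory_gb: float       # memory capacity, in GB
    storage_gb: float      # storage capacity, in GB
    bandwidth_gbps: float  # network capacity (bandwidth), in Gbit/s

    def cpu_capacity(self) -> float:
        # CPU capacity = number of CPUs x frequency of CPU
        return self.num_cpus * self.cpu_freq_ghz

# Example: 4 vCPUs at 2.5 GHz give a CPU capacity of 10.0.
print(Capacity(4, 2.5, 16.0, 500.0, 10.0).cpu_capacity())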

Monitoring: Monitoring is a subjective metric, measured here at three levels: Poor, Average, and Extensive. Poor means the provider does not provide any monitoring tool and monitoring is the responsibility of the user. Average means simple integrated monitoring tools are provided, perhaps with a few indicators or alerts. Extensive means the provider supplies a complete integrated monitoring tool at no cost, so monitoring is not an issue for the user.

Free Trial: Some providers offer a free trial of their services, which is very beneficial for users: a user can test services before deployment. A provider with a free trial service will naturally get a higher rank.

API: Whether the provider offers an API to interact with the server is also an important parameter. It too is a subjective metric; as with monitoring, three levels are considered: None, Average, and Extensive.

Certification: Different providers hold different certifications for industry regulatory compliance; for example, Amazon has SAS 70 Type II, ISO 27001, and HIPAA certifications, while Windows Azure has Safe Harbor policy and ISO 27001 certifications. A user may prefer one certification over another, so the user can rank providers by the type of certification expected, assigning weights to certifications according to need (see the sketch after this list of metrics).

Sustainability: Sustainability has three dimensions: the economic, social, and environmental impact of the cloud service provider. Environmental impact can be measured using metrics such as carbon emission, water use, and resource consumption; economic impact using metrics such as the power cost of servers, storage and networking facility cost, and support cost; and social impact using metrics such as economic development and socio-political stability. Some common metrics are PUE (power usage effectiveness), CUE (carbon usage effectiveness), CUPS (compute units per second), CADE (corporate average data centre efficiency), DPPE (data centre performance per energy), and WUE (water use efficiency).

User Experience: A user about to take services from the cloud should consider the views of existing users, who can better describe the accuracy, stability, and transparency of a service. The experience of users with a service is therefore an important factor. For the proposed framework, it is assumed that existing users rate cloud services on a scale of 0 to 10, where a higher value indicates a better experience with that service.

Elasticity: Elasticity can be understood through its two fundamental elements, time and cost: time means how long a service provider takes to provision or de-provision resources, and cost means whether the service provider charges on a per-hour or per-minute basis.

Scalability: Scalability is a measure of an application system's ability to provide increased throughput, reduced response time, and/or support for more users when hardware resources are added cost-effectively and without modification. Its two basic dimensions are horizontal scaling (scale-out) and vertical scaling (scale-up); a provider offering vertical scaling gets a higher rank.
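As an illustrative sketch of the certification weighting described above; the certification names and weights are hypothetical assumptions, not values from the paper:

# Hypothetical user weights over expected certifications (sum to 1).
CERT_WEIGHTS = {"ISO 27001": 0.5, "SAS 70 Type II": 0.25, "HIPAA": 0.25}

def certification_score(provider_certs: set[str]) -> float:
    # Sum the weights of the expected certifications the provider
    # holds; providers can then be ranked by descending score.
    return sum(w for cert, w in CERT_WEIGHTS.items() if cert in provider_certs)

# A provider holding ISO 27001 and HIPAA scores 0.5 + 0.25 = 0.75.
print(certification_score({"ISO 27001", "HIPAA"}))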

V. PROPOSED SYSTEM

At present, there is a huge number of service providers in the cloud market, each advertising its services in its own manner, and user experience also plays an important role in choosing the best provider. The architecture presented here brings together all cloud providers and supports the implementation of the proposed work, as shown in Figure 2. The components of the architecture are as follows.

Figure 2: Proposed system architecture

A. Cloud Exchange
The cloud exchange acts as the central coordinator that brings together cloud providers, cloud coordinators, cloud users, and the cloud broker. It contains a directory that provides the broker with information about cloud services.

B. Directory
The directory contains all the information about service providers that is required to select the best one. It holds two types of information: the service catalogue and the service info.

Service Catalogue: Through the service catalogue, all service providers give details about their services in terms of the defined quality metrics.

Service Info: The service info is a repository containing a service log and a review log. The service log is a database that stores log records of all registered service providers; these records keep the history of each provider, i.e., response time, number of successful or unsuccessful transactions, availability, etc. The review log contains information about users' experience with a particular service, such as stability, transparency, reputation, feedback, and recommendations.

C. Cloud Broker
The cloud broker takes the user's requirements and the service providers' details from the directory and analyzes them with the proposed method in order to choose the best provider. It is assumed that the user has already informed the cloud broker of its essential QoS; if any essential QoS is not offered by a service provider, the broker does not consider that provider for comparison. Since the cloud broker interacts directly with users, it also collects information from them about their experience with the service providers they have used, provides the service info, and updates it when required.

D. Cloud Coordinator
The cloud coordinator acts as a representative of a cloud service provider. It is responsible for providing the service catalogue and the service log and for periodically updating both.
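A minimal sketch of the two kinds of directory records described in this section; the types and field names are illustrative assumptions, not defined in the paper:

from dataclasses import dataclass, field

@dataclass
class CatalogueEntry:
    # Service catalogue: a provider's self-declared values for the
    # defined quality metrics, keyed by metric name.
    provider: str
    metrics: dict[str, object] = field(default_factory=dict)

@dataclass
class ServiceInfo:
    # Service log: observed history records (response time,
    # transaction outcomes, availability, ...).
    service_log: list[dict] = field(default_factory=list)
    # Review log: user-experience ratings on the 0-10 scale.
    review_log: list[int] = field(default_factory=list)

# Example: the broker reads one catalogue entry from the directory.
entry = CatalogueEntry("ProviderA", {"free_trial": True, "mtbf_hours": 2190.0})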

VI. METRIC COMPARISON RULES

A metric can have different types of values: cost is a numeric value, scalability is a Boolean value, monitoring is a levelled value, and so on. This makes the quantification of metrics, which helps the voters rank providers in order of preference, a challenge. To handle this, metric comparison rules are proposed (rank 1 is considered the highest rank, and providers with the same value get the same rank).

A. Boolean
If a metric is of Boolean type, i.e., its value is yes or no, a provider with a positive response gets rank one and a provider with a negative response gets rank two. For example, in the case of the free trial metric, a provider answering yes gets rank one and a provider answering no gets rank two.

B. Range Type
Suppose Ri is the range offered by provider i and Ru is the range required by the user. For each provider i, calculate length(Ri − Ru), sort the lengths in ascending order, arrange the providers in the order of the sorted list, and rank them from high to low.

C. Numeric
Case 1: the metric's value should be as large as possible, e.g., availability. Suppose Ri is the numeric value of the metric for provider i and Ru is the value required by the user. Find all providers i such that Ri ≥ Ru and rank them 1; sort the remaining providers in descending order and rank them from high to low.
Case 2: the metric's value should be as low as possible, e.g., cost. With Ri and Ru as above, find all providers i such that Ri ≤ Ru and rank them 1; sort the remaining providers in ascending order and rank them from high to low.
Subjective metrics are also treated as numeric. For example, the three levels used to measure the monitoring metric (Extensive, Average, and Poor) can be ordered: Extensive is valued 1, Average 2, and Poor 3. This method is used for all subjective metrics, i.e., monitoring and API.

D. Unordered Set
For an unordered-set metric, it is assumed that the user gives not only the required set but also a weight for each element of the set, i.e., the importance of each element, and the sum of the weights should be 1.

VII. RANKED VOTING METHOD

In this work, the ranked voting method is used to find the best provider for a user. In a ranked voting system, each voter ranks the alternatives in order of preference [1]. To rank the cloud providers, each identified metric acts as a voter and each service provider acts as a candidate: a voter selects the candidates and ranks them in order of preference by comparing the metric's required value with the value each provider offers, where the values can be of different types such as numeric, Boolean, etc.

To select the best cloud service provider, let m be the number of cloud providers in the market and k the number of best cloud providers to be selected, with k ≤ m. Let vij be the number of jth-place votes for candidate i, where i = 1..m and j = 1..k, and let wj be the weight attached to a jth-place vote. The preference score Zi is then calculated as

Zi = Σ (j = 1..k) wj · vij    (2)

The provider with the highest normalized preference score is the winner.
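To make Sections VI and VII concrete, here is a minimal Python sketch of one voter applying the numeric comparison rule (Case 1 only, "larger is better") and of the preference score of equation (2). The providers, metric values, and place weights are hypothetical, and ties share a rank as described above:

def rank_providers(required, offered):
    # One voter (metric), numeric rule, Case 1: every provider meeting
    # the requirement gets rank 1; the rest are ranked by descending
    # value, with equal values sharing a rank.
    ranks = {}
    meeting = [p for p, v in offered.items() if v >= required]
    for p in meeting:
        ranks[p] = 1
    rest = sorted((p for p in offered if p not in meeting),
                  key=lambda p: offered[p], reverse=True)
    rank, prev = (1 if meeting else 0), None
    for p in rest:
        if offered[p] != prev:
            rank, prev = rank + 1, offered[p]
        ranks[p] = rank
    return ranks

def preference_scores(votes_by_metric, weights):
    # Z_i = sum over j of w_j * v_ij, where v_ij is the number of
    # voters (metrics) that placed provider i at rank j.
    scores = {}
    for ranks in votes_by_metric:
        for provider, j in ranks.items():
            if j <= len(weights):  # only the top k places carry weight
                scores[provider] = scores.get(provider, 0.0) + weights[j - 1]
    return scores

# Hypothetical example: two metrics vote over three providers.
availability = rank_providers(99.0, {"A": 99.9, "B": 99.5, "C": 98.0})
mtbf = rank_providers(2000, {"A": 2500, "B": 1500, "C": 1800})
weights = [3.0, 2.0, 1.0]  # assumed place weights, w1 >= w2 >= w3
scores = preference_scores([availability, mtbf], weights)
print(max(scores, key=scores.get))  # "A", the highest-scoring provider

Boolean, range, and unordered-set metrics plug into the same voting scheme once their values are mapped to ranks by the rules of Section VI.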
CONCLUSION

Cloud computing has become an important technology for many organizations to deliver different kinds of services, and cloud customers find it very difficult to choose the providers that can satisfy their requirements. This work therefore proposes an effective and efficient way to find the best CSP based on QoS parameters, greatly helping cloud users identify the best cloud provider without confusion. By evaluating the cloud metrics, a broker can effectively choose an efficient provider that satisfies the user's requirements, and enabling a ranking mechanism helps customers choose the provider whose service gives the best performance for their application.

In this paper, QoS metrics are defined to measure cloud services, and the proposed work makes various QoS attributes, such as data centre location, capacity, availability, and certification, efficiently computable. It lets providers specify their QoS offers and customers specify their QoS requirements at different quality levels, helping customers understand each service provider clearly and helping providers advertise their services in a better way; in turn, it also encourages healthy competition in the cloud market. The proposed work adopts the ranked voting method to formulate the best-provider selection problem and develops a flexible framework (the user can add or remove QoS metrics easily) for selecting the best provider, which can be used for different applications with different QoS requirements. Users can build their own voter lists as per their requirements, and a set of rules is defined to compare QoS values even when they are of different value types. Whether a user groups several applications to run on a single service provider or runs applications on different service providers, the proposed framework suggests a best provider. The ranked voting method does not use inefficient providers' information to discriminate among efficient providers, so the order of the efficient providers never changes when inefficient providers are added or removed.

REFERENCES
[1] T. Obata, H. Ishii, "A method for discriminating efficient candidates with ranked voting data," European Journal of Operational Research, vol. 151, 2003, pp. 233-237.
[2] J. Siegel, J. Perdue, "Cloud services measures for global use: the Service Measurement Index (SMI)," in Annual SRII Global Conference (SRII), IEEE, 2012, pp. 411-415.
[3] R. Buyya, C. S. Yeo, S. Venugopal, J. Broberg, I. Brandic, "Cloud computing and emerging IT platforms: vision, hype, and reality for delivering computing as the 5th utility," Future Generation Computer Systems, vol. 25, 2009, pp. 599-616.
[4] https://cloudsleuth.net/global-provider-view
[5] S. M. Habib, S. Ries, M. Muhlhauser, "Cloud computing landscape and research challenges regarding trust and reputation," in Ubiquitous Intelligence & Computing and 7th International Conference on Autonomic & Trusted Computing, 2010, pp. 410-415.
[6] www.nirvanix.com/downloads/analyst-reports/nirvanix-vs-in-housestorage.pdf
[7] S. E. Kihal, C. Schlereth, B. Skiera, "Price comparison for Infrastructure as a Service," in European Conference on Information Systems, Barcelona, Spain, 2012.
[8] http://www.cloudorado.com/
[9] G. Linden, B. Smith, J. York, "Amazon.com recommendations: item-to-item collaborative filtering," IEEE Internet Computing, vol. 7, no. 1, pp. 76-80, Jan./Feb. 2003.
[10] Z. Zheng, Y. Zhang, M. R. Lyu, "CloudRank: a QoS-driven component ranking framework for cloud computing," in Proc. Int'l Symp. Reliable Distributed Systems (SRDS '10), pp. 184-193, 2010.
[11] Z. Zheng, H. Ma, M. R. Lyu, I. King, "QoS-aware Web service recommendation by collaborative filtering," IEEE Trans. Services Computing, vol. 4, no. 2, pp. 140-152, Apr.-June 2011.
[12] P. Dhillon, V. Arora, "A compositional approach of reliable and efficient cloud service selection," International Journal of Advanced Research in Computer Science and Software Engineering, vol. 2, issue 8, Aug. 2012, ISSN: 2277-128X.