Proc. of Int. Conf. on Advances in Communication, Network, and Computing, CNC

Cloud Broker for Reputation-Enhanced and QoS based IaaS Service Selection

Ruby Annette J.¹, Dr. Aisha Banu W.² and Dr. Sriram³
¹ ² ³ B.S Abdur Rahman University, Chennai, India
Email: rubysubash2010@gmail.com, {aisha, sriram}@bsauniv.ac.in

Abstract — Cloud brokering is gaining momentum as a promising area of research and business opportunity. As cloud computing enables users to obtain the required infrastructure, platform and software as a service, there is no need to invest a huge amount of capital to run a business, and hence more and more companies are moving to the cloud. However, collecting information from multiple cloud service providers and comparing their various services in order to zero in on the right service provider is a time-consuming task, given the plethora and variety of cloud service offerings. As many cloud service providers enter the market, users look for cloud brokers who can help them in the dynamic selection and aggregation of services based on their Quality of Service (QoS) requirements and cost. The other important aspect considered by users while selecting a service provider is its reputation. This paper introduces a Cloud Broker Service (CBS) framework that facilitates the dynamic selection of IaaS service providers based on the QoS requirements, cost and reputation of the service providers. The proposed CBS framework provides a ranking mechanism that evaluates the services offered by various IaaS service providers, such as Amazon EC2 and Rackspace, against various Quality of Service requirements, based on the Service Measurement Index (SMI) and the reputation of the company. Evaluating the services using the Service Measurement Index proposed by CSMIC (Cloud Service Measurement Index Consortium), which identifies the measurement indexes that are important for evaluating a cloud service, makes it possible to zero in on the right service provider who can satisfy the user's requirements. The applicability of the ranking algorithm and the cloud broker framework is shown using a case study.

Index Terms — Cloud Broker Service, Service Measurement Index (SMI), Reputation, IaaS Services Ranking Algorithm.

I. INTRODUCTION

Elasticity is the beauty of the cloud computing paradigm. Cloud computing enables the user to scale resource utilization up or down as per the requirement of the hour. Thus, Small and Medium Enterprises (SMEs) no longer have to invest large capital in hardware to deploy their services, since the additional resources required can be acquired from public cloud providers on demand. Though there are many advantages in terms of CAPEX (Capital Expenditure) being converted to OPEX (Operating Expenses), one of the main challenges faced by SMEs is to select the right service provider who can fulfill their QoS requirements at minimum cost.

Dealing with a cloud service provider requires knowledge of its operating environment, security levels, data recovery approaches, and its service terms and conditions. Collecting this information for multiple cloud service providers is not an easy task for service consumers, given the plethora and variety of cloud service offerings. Thus, Cloud Service Brokers (CSBs) play a vital role in helping SMEs identify the right service providers.

The National Institute of Standards and Technology (NIST) identifies the Cloud Broker as the actor in charge of service intermediation, service aggregation, and service arbitrage in its Cloud Computing Reference Model [1], [2]. According to Gartner [3], "The future of cloud computing will be permeated with the notion of brokers negotiating relationships between providers of cloud services and the service customers. In this context, a broker might be software, appliances, platforms or suites of technologies that enhance the base services available through the cloud. Enhancement will include managing access to these services, providing greater security or even creating completely new services." Thus, Cloud Service Brokers (CSBs) are entities that act between the user and the service providers, offering their expertise and other value-added services, such as SLA negotiation and the monitoring and assessment of SLA implementation, to assist service consumers [4].

Various works have been carried out to enable service consumers to compare the cloud services on offer. For example, the Service Measurement Index (SMI) proposed by the Cloud Service Measurement Index Consortium (CSMIC) [5] identifies the measurement indexes that are important for evaluating a cloud service; customers can use these indexes to compare different cloud services. Other examples include Cloudstone [6], a multi-platform, multi-language benchmark and set of measurement tools for Web 2.0, CloudHarmony [7] and CloudCmp [8]. S. K. Garg et al. [9] proposed a framework for ranking cloud services named SMICloud, which enables users to compare different cloud offerings along several dimensions and priorities and to select the services appropriate to their needs. However, SMICloud does not consider the reputation of the service providers during service selection.

Reputation is the public's opinion about the character or standing (such as honesty, capability, reliability) of an entity, which could be a person, an agent, a product or a service. It is objective and represents the collective evaluation of a group of people or agents. Reputation-based service discovery is a widely explored topic in web services discovery [10], [11], [12], [17].

The Cloud Broker framework proposed in this work enables cloud service users to select the right service provider based on their QoS requirements and the provider's reputation. The Service Measurement Index (SMI) is used for comparing the QoS requirements of users with the services offered by the various cloud service providers, and the reputation of the cloud service providers is calculated from consumer feedback.
The proposed service matching, ranking and selection algorithm selects the prospective service providers whose services adhere to the QoS requirements of the users. The customer is given the option to select a service provider based on the dominant QoS requirement and reputation, or the service that offers a better QoS/Cost ratio. The service matching, ranking and selection algorithm is based on the matching algorithms proposed by Ziqiang Xu et al. [13] and Rajendran et al. [14]. However, since those algorithms were framed for the selection of web services, they do not address criteria unique to the cloud, such as elasticity and VM size, and so differ from the proposed work.

The rest of the paper is organized as follows: the next section presents an overview of the proposed cloud broker framework and explains in detail the algorithm used for ranking the services and the calculation of the reputation of the service providers. In Section III, a case study is explored in detail to determine the effectiveness of the proposed algorithm. A detailed discussion of the results obtained for various kinds of user requirements and the services selected accordingly is given in Section IV. The conclusion and future work are given in Section V.

II. CLOUD BROKER FRAMEWORK

In the proposed Cloud Broker Framework, the user selects the SMI attributes, which are the QoS requirements that are to be met by the cloud service providers. The SMI attributes are designed by the Consortium [5] based on International Organization for Standardization (ISO) standards. They consist of a set of business-relevant Key Performance Indicators (KPIs) that provide a standardized method for measuring and comparing business services. The advantage of using the SMI attributes is that they provide a holistic view of the QoS needed by customers for selecting a cloud service provider based on various measures: Accountability, Agility, Assurance of Service, Cost, Performance, Security and Privacy, and Usability.

The SMI attributes selected by the user are called the Key Performance Indicators (KPIs). In the proposed cloud broker framework, the SMI Calculator/Evaluator daemon evaluates the services of the various cloud service providers based on the KPIs chosen by the user and selects the prospective service providers. Once the service providers are evaluated by the SMI Evaluator, the QoS based ranking system ranks the potential service providers based on the user's KPIs and produces the list LS1. The proposed Cloud Broker Framework is given in Figure 1.

Figure 1. Cloud Broker Framework

The Reputation Manager in the cloud broker framework estimates the reputation of each service provider using feedback from consumers who have already used that provider's cloud services. The feedback from the users is collected and stored in a database; the Reputation Manager uses this information to calculate the reputation of each service provider and creates a ranking list LS2 of the service providers based on their reputation. The Service Selector daemon in the cloud broker framework gives the customer the choice to select the service provider based on the QoS ranking and reputation alone, or on the Quality to Cost ratio. The Quality to Cost ratio enables the user to select the service provider who provides the best service for the amount paid. The data flow diagram of the proposed framework is given in Figure 2.

Figure 2. Data Flow Diagram of the Cloud Broker Framework
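To make this data flow concrete, the following is a minimal Python sketch of the records and component wiring described above. All names (CloudService, Rating, CloudBroker, ls1, ls2) are illustrative assumptions rather than identifiers from the paper, and the KPI matching is simplified to threshold checks.

```python
from dataclasses import dataclass, field

@dataclass
class CloudService:
    """One IaaS offering with its SMI attribute values as evaluated by the broker."""
    name: str
    smi: dict                 # e.g. {"Performance": 0.8, "Cost": 0.04, ...}
    reputation: float = 0.0   # filled in by the Reputation Manager

@dataclass
class Rating:
    """One consumer feedback record held in the rating database."""
    service_id: str
    consumer_id: str
    value: int                # 1 (extreme dissatisfaction) .. 10 (extreme satisfaction)
    age_days: int             # derived from the stored timestamp

@dataclass
class CloudBroker:
    """Wiring of the components in Figures 1 and 2."""
    services: list = field(default_factory=list)
    ratings: list = field(default_factory=list)

    def ls1(self, kpis: dict, dominant: str) -> list:
        """SMI Evaluator + QoS ranking: providers meeting all KPIs, ranked.
        Assumes a monotonically increasing dominant attribute."""
        matched = [s for s in self.services
                   if all(s.smi.get(k, 0.0) >= v for k, v in kpis.items())]
        return sorted(matched, key=lambda s: s.smi[dominant], reverse=True)

    def ls2(self, candidates: list) -> list:
        """Reputation Manager: the same candidates ranked by reputation."""
        return sorted(candidates, key=lambda s: s.reputation, reverse=True)
```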

A. SMI Calculator/Estimator

As discussed earlier, the SMI attributes provide a holistic view of the QoS needed by customers for selecting a cloud service provider, based on Accountability, Agility, Assurance of Service, Cost, Performance, and Security and Privacy.

Accountability: used to measure provider-specific characteristics of the various cloud providers. This is important for building a customer's trust in a cloud provider.

Agility: measured in SMI as a rate-of-change metric covering elasticity, flexibility and adaptability. Elasticity, in turn, is the scalability of the resources during peak time, and is defined by two attributes: the mean time taken to expand or contract the service capacity, and the maximum capacity of the service. The capacity is the maximum number of compute units that can be provided at peak times.

Cost: depends on two attributes, acquisition and on-going cost. It is not easy to compare the prices of different services, as they offer different features and thus have many dimensions; therefore, the approach used in the SMICloud framework [9] is adopted in this work to calculate the cost as a volume-based metric, i.e. the cost of one unit of CPU, storage, RAM and network bandwidth. Therefore, if a VM with $cpu$ CPU units, $net$ network bandwidth, $data$ storage and $RAM$ memory is priced at $p$, then the cost of the VM is

$$Cost_{VM} = \frac{p}{a \cdot cpu + b \cdot net + c \cdot data + d \cdot RAM}$$

where $a$, $b$, $c$ and $d$ are weights for each resource attribute and $a + b + c + d = 1$. The weight of each attribute can vary from application to application. Generally, users also need to transfer data, which incurs cost; therefore, the total on-going cost can be calculated as the sum of the data communication, storage and compute usage costs for that particular cloud provider and service.

Performance: measured in terms of functionality and service response time in the cloud computing environment. The efficiency of service availability can be measured in terms of the response time, which depends on various sub-factors such as the average response time, the maximum response time promised by the service provider, and the percentage of time this response time level is missed. The average response time is given by

$$\bar{T} = \frac{\sum_{i=1}^{n} T_i}{n}$$

where $T_i$ is the time between when user $i$ requested an IaaS service and when it actually became available, and $n$ is the total number of IaaS service requests. The maximum response time is the maximum response time promised by the cloud provider for the service. The response time failure is the percentage of occasions on which the response time was higher than the promised maximum response time.

Assurance: indicates the likelihood of a cloud service performing as expected or as promised in the SLA. Reliability, availability and service stability are important factors in selecting cloud services. Availability is the percentage of time a customer can access the service, given by:

$$Availability = \frac{\text{(Total service time)} - \text{(Total time for which the service was not available)}}{\text{Total service time}}$$

Security and Privacy: data protection and privacy are important and include many attributes, such as protecting confidentiality and privacy, data integrity and availability.

The SMI Evaluator estimates the values of the KPIs selected by the user from the above SMI attributes for the various service providers. The QoS based ranking daemon uses this estimation to rank the service providers based on the QoS requirements of the users.
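The three quantitative metrics above translate directly into code. The following is a minimal sketch, where the function names and the sample figures are illustrative assumptions rather than values from the paper:

```python
def vm_cost_per_unit(price, cpu, net, data, ram, a=0.25, b=0.25, c=0.25, d=0.25):
    """Volume-based cost metric: price per weighted unit of resources.
    Weights a..d must sum to 1 and can vary from application to application."""
    assert abs(a + b + c + d - 1.0) < 1e-9
    return price / (a * cpu + b * net + c * data + d * ram)

def average_response_time(times):
    """Mean of T_i, the delay between an IaaS request and actual availability."""
    return sum(times) / len(times)

def availability(total_time, downtime):
    """Fraction of the period during which the service was reachable."""
    return (total_time - downtime) / total_time

# Illustrative usage with made-up numbers:
print(vm_cost_per_unit(price=0.10, cpu=2, net=1, data=160, ram=4))
print(average_response_time([1.2, 0.8, 1.0]))    # seconds
print(availability(total_time=720, downtime=3))  # hours in a 30-day month
```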
B. QoS Based Ranking System

The QoS based ranking system in the cloud broker lets the user select any one of the SMI attributes, such as Accountability, Agility, Assurance of Service, Cost, Performance, or Security and Privacy, as the dominant QoS attribute, along with the reputation score, based on which the services are to be ranked. If the customer specifies none, the average response time is taken as the default dominant QoS attribute. This type of ranking based on one dominant QoS attribute is more convenient and realistic for the users, as the user has to decide upon just one QoS attribute instead of giving separate weights to all the various QoS attributes. The QoS scores of the services are calculated based on the equation given below:

$$QoSScore_i = \begin{cases} \dfrac{DominantQoS_i}{BestDominantQoS} & \text{if the dominant attribute is monotonically increasing} \\[2ex] \dfrac{BestDominantQoS}{DominantQoS_i} & \text{if the dominant attribute is monotonically decreasing} \end{cases}$$

where $QoSScore_i$ is the QoS score of service $i$ ($i$ being the position of the service in the list of matched services), $DominantQoS_i$ is the value of the dominant QoS attribute of service $i$, and $BestDominantQoS$ is the highest or lowest value of the dominant QoS attribute among the matched services, depending on whether the dominant attribute is monotonically increasing or decreasing, respectively. A monotonically increasing QoS attribute is one where an increase in the value reflects an improvement in quality, while for a monotonically decreasing attribute a decrease in the value reflects an improvement in quality.

After the broker receives the KPIs from the user, it contacts the SMI calculator/estimator to find services that match the customer's functional requirements and retrieves their QoS information. The broker then uses the service matching, ranking and selection algorithm described in the Service Selector section to select the top M services (M is specified by the customer in the discovery request) to return to the customer. If the user specifies ranking based on QoS requirements only, the ranking is done based on the calculated $QoSScore_i$ according to the formula given above, and the resulting list of service providers is stored in LS1. If no service is found, the broker returns an empty result to the customer.

C. Reputation Manager

The Reputation Manager collects and processes service ratings from consumers, stores the service reputation scores in a Rating Database (Rating DB), and provides the scores when requested by the Service Selector. The Reputation Manager collects feedback regarding the QoS of the cloud services from the service consumers, calculates reputation scores, and updates these scores in the Rating DB. For this work, we assume that all ratings are available, objective and valid. Service consumers provide a rating indicating their level of satisfaction with a service after each interaction with it. A rating is simply an integer ranging from 1 to 10, where 10 means extreme satisfaction and 1 means extreme dissatisfaction. Our service rating storage system is similar to the one proposed by Ziqiang Xu et al. [13]. A local database contains the reputation information, which consists of a service ID, consumer ID, rating value and a timestamp. Only the most recent rating by a customer for a service is stored in the table; a new rating from the same customer for the same service replaces the older rating. The timestamp is used to determine the aging factor of a particular service rating. The reputation score $U$ of a service is computed as the weighted average of all ratings the service receives from customers:

$$U = \frac{\sum_{i=1}^{N} \lambda^{d_i} S_i}{\sum_{i=1}^{N} \lambda^{d_i}}$$

where $N$ is the number of ratings for the service, $S_i$ is the $i$th service rating, $\lambda$ is the inclusion factor, $0 < \lambda < 1$, and $d_i$ is the age of the $i$th service rating in days. The inclusion factor $\lambda$ is used to adjust the responsiveness of the reputation score to changes in service activity. A smaller $\lambda$ means that the more recent ratings have a larger impact on the reputation score, while a larger $\lambda$ means that more of the ratings affect the score. Reputation is built from the aggregation of consumer ratings of a service based on historic transaction records. New services with no reputation are endorsed by trustworthy service providers or consumers until their reputation is established.
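A minimal sketch of this time-decayed reputation score follows, assuming ratings are kept as (value, age-in-days) pairs as described above; the function name and sample data are illustrative.

```python
def reputation_score(ratings, lam=0.9):
    """Weighted average of ratings S_i with decay weight lam**d_i,
    where d_i is the rating's age in days and 0 < lam < 1."""
    weights = [lam ** age for _, age in ratings]
    return sum(w * s for w, (s, _) in zip(weights, ratings)) / sum(weights)

# Illustrative usage: three ratings of 9, 7 and 4, aged 1, 10 and 60 days.
# The 60-day-old rating barely counts, so the score (~8.4) leans toward
# the recent 9 rather than the plain mean of 6.67.
print(reputation_score([(9, 1), (7, 10), (4, 60)]))
```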

D. The Service Selector

The Service Selector enables the customer to specify the request criteria: the service consumer can specify only QoS requirements in the request, or both QoS and reputation requirements. The Service Selector uses the services matching, ranking and selection algorithm to select the appropriate service provider. The proposed services matching, ranking and selection algorithm is given below in Figure 3.

Figure 3. Services Matching, Ranking and Selection Algorithm

When the broker receives a request, it executes findservices (line 1) to identify the services that meet the QoS or KPI requirements. If KPI requirements are specified, KPIMatch (line 3) is executed next on the set of services; it returns the subset of services LS1 that meet the KPI requirements. selectservices (line 5) always returns a list of M services to the customer, where M denotes the maximum number of services to be returned. If KPI requirements are not specified, selectservices returns M randomly selected services. If only one service satisfies the selection criteria, that service is returned to the customer. In case no reputation requirement is specified, qosrank (line 10) calculates the QoS scores of the services and returns a list of services LS1 sorted in descending order of their QoS scores only. The QoS score is calculated in the range of 0 to 1 for each service based on the dominant QoS attribute value; the service with the best dominant QoS value is assigned a score of 1. From LS3, selectservices (line 12) returns the top M services to the customer. If M is not specified, one service whose QoS score is greater than the user-specified threshold LowLimit is randomly selected and returned from LS3. For example, if LowLimit is 0.9, all services whose QoS score is greater than 0.9 will be considered in the random selection. The random selection prevents the service with the highest QoS score from always being selected, and thus helps to balance the workload among the services that provide the same functionality and similar QoS. In the case where a reputation requirement is specified, reputationrank (line 7) calculates the reputation scores of the services and returns a filtered list of services LS2 containing only those services that have a reputation score equal to or above the specified required value.
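Since Figure 3 is available only as a caption, the following Python sketch reconstructs the control flow from the prose description above. It is an interpretation, not the authors' exact pseudocode: the helper names echo the steps cited by line number, the dict keys are assumptions, and LS3 is taken to be the reputation-filtered, QoS-ranked candidate list.

```python
import random

def findservices(request, services):
    """Line 1: functional matching; here simplified to a service-type check."""
    return [s for s in services if s["type"] == request["type"]]

def kpi_match(kpis, services):
    """Line 3: LS1, the services meeting every KPI threshold."""
    return [s for s in services
            if all(s["attributes"].get(k, 0.0) >= v for k, v in kpis.items())]

def qos_rank(services, dominant, increasing=True):
    """Line 10: score each service in [0, 1]; the best dominant value scores 1."""
    values = [s["attributes"][dominant] for s in services]
    best = max(values) if increasing else min(values)
    for s in services:
        v = s["attributes"][dominant]
        s["qos_score"] = v / best if increasing else best / v
    return sorted(services, key=lambda s: s["qos_score"], reverse=True)

def broker_select(request, services, M=None, low_limit=0.9):
    """Overall flow of the matching, ranking and selection algorithm (Figure 3)."""
    candidates = findservices(request, services)
    if not request.get("kpis"):                      # line 5: no KPIs given,
        random.shuffle(candidates)                   # return M random services
        return candidates[:M]
    candidates = kpi_match(request["kpis"], candidates)
    if len(candidates) <= 1:                         # zero or one match: return as-is
        return candidates
    if request.get("min_reputation") is not None:    # line 7: LS2 reputation filter
        candidates = [s for s in candidates
                      if s["reputation"] >= request["min_reputation"]]
        if not candidates:
            return []
    # Default dominant attribute is the average response time (Section II-B).
    ranked = qos_rank(candidates, request.get("dominant_qos", "avg_response_time"))
    if M is not None:
        return ranked[:M]                            # line 12: top M of LS3
    # No M given: pick randomly among services scoring above LowLimit,
    # which balances load across similar-quality services.
    eligible = [s for s in ranked if s["qos_score"] > low_limit]
    return [random.choice(eligible)] if eligible else []
```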

E. QoS/Cost Ratio

As cost is one of the major criteria to be considered when selecting a service, the service consumer is also given a QoS/Cost ratio based ranking of the service providers. The QoS/Cost ratio helps to filter the service providers who provide a better quality of service for a much lower price than their competitors. The list LS4 gives the set of services ranked based on the QoS/Cost ratio.

III. CASE STUDY

In this section, we present a case study example of the ranking mechanism presented in the previous section. The QoS data is collected from various evaluation studies for three IaaS cloud providers: Amazon EC2 (S1), Windows Azure (S2) and Rackspace (S3) [8], [15], [16]. Unavailable data, such as the security level, is randomly assigned to each cloud service. User weights are also randomly assigned to each QoS service attribute. The top-level QoS groups are Accountability, Agility, Assurance, Cost, Performance, Security and Reputation; the reputation score is obtained from the Reputation Manager. Table I gives the attributes and values used in the case study for the above SMI attributes. The users are requested to select their required KPIs from these SMI attributes, and the potential service providers are identified by the SMI Evaluator.

TABLE I. ATTRIBUTES AND VALUES USED IN THE CASE STUDY

TABLE II. CONSUMER/USER REQUIREMENTS

Table II gives the details of the user requirements (KPIs) selected by the users. Each customer is assigned a customer ID and is requested to select the desired dominant QoS attribute from the SMI attributes given in Table I; the required reputation level is also obtained from the user. The value of each KPI given by the user is matched against the SMI attribute values evaluated for each service provider by the SMI Evaluator and the Reputation Manager. The potential service providers are then selected based on the results of the IaaS services matching, ranking and selection algorithm, as explained in the previous section.

Figure 4. Service Response Time of the Cloud Service Providers

Figure 5. Upload Time of the Cloud Service Providers

Figure 6. Elasticity of the Cloud Service Providers

Figure 7. Security of the Cloud Service Providers

Figure 8. Reputation of the Cloud Service Providers

Figure 9. Quality/Cost Ratio of the Cloud Service Providers

The graphs above give the SMI attributes evaluated by the SMI Evaluator for the three service providers considered in the case study, namely Amazon EC2 (S1), Windows Azure (S2) and Rackspace (S3). Examples of the SMI attributes evaluated are the service response time (Fig. 4), upload time (Fig. 5), elasticity (Fig. 6), security (Fig. 7), reputation (Fig. 8) and the Quality/Cost ratio (Fig. 9). The services that best match the QoS requirements of the user get a score of one.

A comparison of the service response times of the three service providers is given in Fig. 4, from which it can clearly be seen that Rackspace (S3) offers a better service response time than Amazon EC2 (S1) and Windows Azure (S2), whereas Amazon EC2 (S1) offers the lowest upload time and is faster than the other two service providers. From Fig. 6, one can conclude that Amazon EC2 (S1) is not only faster but also has higher elasticity than Windows Azure (S2) and Rackspace (S3). Thus, users who require high speed and elasticity will prefer Amazon EC2 (S1) over the other two services. However, users who prioritize security over speed and elasticity would prefer Windows Azure (S2), since its security level is the highest according to Fig. 7. Fig. 8 gives a comparison of the reputation of the three service providers: both Windows Azure (S2) and Rackspace (S3) score higher in reputation than Amazon EC2 (S1). Note that Windows Azure (S2) scores high in both security and reputation. The Quality/Cost ratio helps the user to identify the service provider who gives the best value for money; from Fig. 9, it can clearly be seen that Rackspace (S3) has the highest Quality/Cost ratio compared to Amazon EC2 (S1) and Windows Azure (S2).

To conclude, the service provider with the lowest service response time is S3, and the service that takes the least time to upload files, and hence offers good speed, is S1. It can also be clearly seen that the service with the highest scalability/elasticity is S1 and the one with the highest security is S2. Both S2 and S3 have high reputation scores, and S3 has the highest Quality/Cost ratio. The various QoS requirements given by the users and the corresponding services matching, ranking and selection process according to the proposed algorithm are discussed briefly in the next section.

IV. DISCUSSION

Based on the service consumers' requirements given in Table II and the QoS, reputation and Quality/Cost ratio computations given in the graphs above, the services that would be selected for each customer are determined as follows. Since customer C1 has not specified any dominant QoS or reputation value, all the service providers, namely S1, S2 and S3, will be selected, and the user will be given the option to choose any of the three. Customer C2 has Cost as the dominant QoS attribute and no reputation preference; hence S1, the provider with the least cost, and S3, the provider with a better Quality to Cost ratio, are the promising candidates, and the user is given the option to choose between S1 (least cost) and S3 (better Quality to Cost ratio). The dominant QoS attribute and reputation value of customer C3 are Performance and 8; thus, the S3 service provider will be selected for C3. C4 has Elasticity as the dominant QoS attribute but no reputation preference; hence, S1 and S3 are the ideal services.
The last customer, C5, wants Performance and a service provider with a reputation value of 9; thus, the S3 service provider will be selected for C5.

V. CONCLUSION AND FUTURE WORK

Cloud computing has become an important paradigm for acquiring the required resources on demand. Currently, many cloud providers offer different cloud services with different QoS and SLAs, and finding the best cloud services is the biggest challenge faced by cloud customers. The comparison of the various cloud service providers based on certain QoS parameters is a time-consuming job, and users should be able to compare providers dynamically, according to their changing QoS demands. One of the important steps in this direction is the SMI framework proposed by the Cloud Service Measurement Index Consortium (CSMIC); the aim of this consortium is to define each of the QoS attributes given in the framework and provide a methodology for computing a relative index for comparing different cloud services. In this context, this work presents a cloud broker framework to systematically measure the QoS attributes proposed by CSMIC and rank the cloud services based on these attributes. We proposed a service matching, ranking and selection mechanism that can evaluate cloud services for different applications depending on their QoS requirements. Our proposed mechanism also addresses the challenge of evaluating the reputation of the service providers, and thus enables the user to select the service provider based on three different kinds of requirements, namely: dominant QoS requirements only; dominant QoS requirements and reputation; and the Quality/Cost ratio. According to Gartner, Cloud Brokering Services (CBS) are a booming business, and efficient service ranking and selection algorithms, along with value-added services that could bring new dimensions to service provision, are the need of the hour. The proposed cloud broker framework is one initiative in that line of thought. In the future, we plan to enhance the framework with various other value-added services, such as auditing and security.

REFERENCES

[1] R. Bohn, J. Messina, F. Liu, and J. Tong, "NIST Cloud Computing Reference Architecture," in Proc. of the 2011 IEEE World Congress on Services (SERVICES 2011), pp. 594-596, 2011.
[2] P. Mell and T. Grance, "The NIST Definition of Cloud Computing," National Institute of Standards and Technology, Special Publication 800-145, 2011.
[3] Gartner, "Gartner Says Cloud Consumers Need Brokerages to Unlock the Potential of Cloud Services," 2009. http://www.gartner.com/newsroom/id/1064712
[4] R. Buyya, C. Yeo, S. Venugopal, J. Broberg, and I. Brandic, "Cloud computing and emerging IT platforms: vision, hype, and reality for delivering computing as the 5th utility," Future Generation Computer Systems, 2009.
[5] Cloud Service Measurement Index Consortium (CSMIC), SMI framework. http://betawww.cloudcommons.com/servicemeasurementinex
[6] W. Sobel, S. Subramanyam, A. Sucharitakul, J. Nguyen, H. Wong, S. Patil, A. Fox, and D. Patterson, "Cloudstone: multi-platform, multi-language benchmark and measurement tools for web 2.0," in Proc. of Cloud Computing and its Applications, Chicago, USA, 2008.
[7] CloudHarmony, February 2012. http://cloudharmony.com/
[8] A. Li, X. Yang, S. Kandula, and M. Zhang, "CloudCmp: comparing public cloud providers," in Proc. of the 10th Annual Conference on Internet Measurement, Melbourne, Australia, 2010.
[9] S. K. Garg et al., "SMICloud: A framework for comparing and ranking cloud services," in Proc. of the 2011 Fourth IEEE International Conference on Utility and Cloud Computing (UCC), pp. 210-218, 2011.
[10] S. Majithia, A. Shaikhali, O. Rana, and D. Walker, "Reputation-based Semantic Service Discovery," in Proc. of the 13th IEEE Intl. Workshops on Enabling Technologies: Infrastructure for Collaborative Enterprises (WETICE), pp. 297-302, Modena, Italy, 2004.
[11] E. M. Maximilien and M. P. Singh, "Reputation and Endorsement for Web Services," ACM SIGecom Exchanges, vol. 3, no. 1, pp. 24-31, 2002.
[12] R. Wishart, R. Robinson, J. Indulska, and A. Josang, "SuperstringRep: Reputation-enhanced Service Discovery," in Proc. of the 28th Australasian Conf. on Computer Science, vol. 38, pp. 49-57, 2005.
[13] Z. Xu, P. Martin, W. Powley, and F. Zulkernine, "Reputation-Enhanced QoS-based Web Services Discovery," in ICWS, pp. 249-256, 2007.
[14] T. Rajendran and P. Balasubramanie, "Efficient Approach towards an Agent-Based Dynamic Web Service Discovery Framework with QoS Support," International Symposium on Computing, Communication, and Control, 2009; Proc. of CSIT, vol. 1, Singapore, 2011.
[15] J. Schad, J. Dittrich, and J. Quiane-Ruiz, "Runtime measurements in the cloud: observing, analyzing, and reducing variance," Proceedings of the VLDB Endowment, vol. 3, no. 1-2, pp. 460-471, 2010.
[16] A. Iosup, N. Yigitbasi, and D. Epema, "On the performance variability of production cloud services," in Proc. of the IEEE/ACM International Symposium on Cluster, Cloud, and Grid Computing, CA, USA, 2011.
[17] L. Vu, M. Hauswirth, and K. Aberer, "QoS-based service selection and ranking with trust and reputation management," in Proc. of the Intl. Conf. on Cooperative Information Systems (CoopIS), Agia Napa, Cyprus, 2005.