Quality Evaluation within Service-Oriented Software: A Multi-Perspective Approach

Ali Owrak, Abdallah Namoun, Nikolay Mehandjiev Manchester Business School

The University of Manchester Manchester, M15 6PB, UK

[email protected]

Abstract— In the original service-oriented view of software provision, loosely coupled services are brought together at the time of need and unbound immediately following execution, allowing service procurers to focus on selecting services that best correspond to their evolving requirements. This just-in-time approach requires the assessment of quality properties of both the software and the service provision activity in order to judge candidate services. In this paper, we propose and evaluate a multi-perspective quality evaluation model tuned to the needs of this "just-in-time" service provision vision. The proposed model uses a hierarchical structure of the quality features that characterize both the software and its provision arrangements from the perspectives of the different stakeholders in the service provisioning and consumption process. The development and evaluation reported here took place in two phases: a "role playing" user study involving 15 participants to elicit the suitability, applicability and measurability of quality characteristics; and a contextual interview study involving 24 users (12 software professionals and 12 general users) to uncover their mental models of quality and evaluate the characteristics identified in the first study. Our findings were twofold. Firstly, we show that a broader range of considerations encompassing both service quality and quality of service must be accounted for when dealing with software services (e.g. service functionality and service responsiveness). Secondly, we identify and explore the users' mental model of quality within the service-oriented paradigm.

Keywords— service quality, quality of service, service-oriented software, software as a service, service-level agreement

I. INTRODUCTION

In this paper, we present a quality evaluation model capable of assessing Service-Oriented Software (SOS) from a multi-perspective viewpoint. It has been developed under the assumption that SOS, where software services are discovered and bound at the point of need, executed and then disbanded [16], will be increasingly used in response to the growing dynamism of contemporary organizations and the changes in their need for software support. One example illustrating the concept of SOS is the ad-hoc creation of a personal travel guide to help commuters decide how to travel to work on a particular day. Such a composite application brings together a weather forecast service, a route planning service, a real-time road status update service and a public transport schedule service to provide personalized information on the morning of travel. Updating any of these services also updates the composite application, i.e. the guide.
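For concreteness, the sketch below shows one way such a just-in-time composite could be assembled; the endpoints and the fetch_json helper are purely illustrative assumptions and are not prescribed by the SOS vision or by this paper.

# Hypothetical sketch of a just-in-time composite "travel guide" service.
# The URLs and the fetch_json() helper are illustrative assumptions only.
import json
import urllib.request

def fetch_json(url: str) -> dict:
    """Invoke a (hypothetical) RESTful service and parse its JSON reply."""
    with urllib.request.urlopen(url, timeout=5) as resp:
        return json.load(resp)

def build_travel_guide(city: str) -> dict:
    """Bind the four constituent services at the point of need and
    aggregate their answers into the personalised guide. Nothing is
    installed or cached: every request re-binds the services, so
    updating any constituent service updates the composite guide."""
    weather = fetch_json(f"https://example.org/weather?city={city}")
    route = fetch_json(f"https://example.org/route?to=work&city={city}")
    roads = fetch_json(f"https://example.org/road-status?city={city}")
    transport = fetch_json(f"https://example.org/transit-schedule?city={city}")
    return {
        "forecast": weather,
        "suggested_route": route,
        "incidents": roads,
        "public_transport": transport,
    }

if __name__ == "__main__":
    print(build_travel_guide("Manchester"))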

In contrast to SOS, conventional software engineering methods are driven by technological advances and oriented towards developing stable software systems which are delivered and installed as software applications. This product-based software model is widely held to stifle software change, leaving software permanently "out of sync" with organizational needs. SOS can address this, at least in its "pure" version, where a new, up-to-date request is made every time the support system's functionality is required.

Making this vision a reality requires advances on many fronts: ultra-late binding and dynamic composition should be complemented by seamless approaches to service specification and selection. These need a clear model of quality of software services. Indeed, assessing the quality properties required or associated with the service is a crucial enabler to judging the suitability of candidate services and the effects of their composition on the quality of the aggregate service. Current quality evaluation models do not address the issues presented by the SOS approach in their entirety, although various dimensions of quality are recognized independently [1][2].

This paper addresses this gap by proposing a multi-perspective quality model of software services. The model covers the perspectives of the different stakeholders in the process of procuring, assembling and using software services, and includes attributes related to the software provision process as well as to the software delivering the service itself. In formulating our service quality evaluation model, we adopted an iterative user-centric process in which the model was initially constructed, then extended, refined and evaluated, as depicted in Figure 1. In the first phase, quality characteristics judged to be relevant to service-oriented architecture were synthesized from existing product and service quality models to form an initial set of quality characteristics. This set was then presented to five role-playing discussion groups, each comprising three participants undertaking the roles of service procurer, service provider and independent quality evaluator. The role-playing exercise narrowed the original list of quality characteristics into a more informed and concise set of quality criteria and enabled their clustering into coherent categories. Finally, the refined model was evaluated by 24 users to validate its characteristics and to provide a deeper understanding of user perspectives on quality within SOS.

This paper is organized as follows. Section II presents SOS and explores the premise of quality as an approach to differentiating amongst candidate services. Section III presents the formulation process applied to develop the SOS quality model. Section IV details our multi-perspective quality model. Section V presents an evaluation of the proposed quality model through a contextual interview study. Finally, Section VI concludes and discusses the implications of our model and future work.

Figure 1. Approach to Formulating SOS Quality Evaluation Model.

II. SERVICE-ORIENTED SOFTWARE

Inspired by the distributed nature of today's software and tracing its roots to Object-Oriented Programming [3], Component-Based Software Development [4] and Client-Server Architectures [5], the SOS idea emphasizes the rapid composition of distributed software applications [6] from platform-agnostic modular elements. The term SOS appears in different contexts with varying connotations, lending itself to various concepts [7] and models [8]. In this article, we follow the early academic perspective of the Pennine Group [16] and define SOS as a concept in which software is configured to meet a specific set of requirements at a point in time, executed and then disengaged. Although the process may be tied to a physical product, the performance is essentially intangible and does not result in ownership of any factors of production [9]. The fundamental element of this view is the utilization of services for the development of applications and solutions to IT problems. This view of provision implies a greater emphasis on software providing a service, in contrast to the traditional view of software provision, which focuses solely on the provision of software as a product.

Several streams of research concentrate on delivering an environment which supports the concepts advocated by the SOS paradigm [23][24]. Industry has primarily focused on technological drivers [10][11] as enablers of the development and delivery of the SOS model. A key approach is commonly referred to as Service-Oriented Architecture (SOA), defined as an approach used to build distributed systems that deliver software application functionality as services to end-user applications, or to build other added-value services [12]. SOA-based approaches focus on the configuration of entities (services, registries and contracts) to maximize loose coupling and reuse [6]. However, SOA as a current technological movement is still far from the SOS vision articulated above. Indeed, the "just-in-time" dynamic composition behind the SOS vision requires cross-organizational service integration, where solutions must also cope with non-functional requirements such as reliability, security, availability, cost and convenience [13]. The consequence is that any future approach to the development and deployment of software must be interdisciplinary, so that non-technical issues such as supply contracts, terms and conditions, and certification become an integral part of the new technology [14].

Figure 2. SOS Provision Model.

A snapshot of this interdisciplinary SOS provision model is presented in Figure 2. The user makes a request for a service, selects from a number of service providers who can provide that service (possibly negotiating over non-functional aspects of provision such as price, performance and quality), and invokes the service under the agreed terms, formalized within a Service Level Agreement [15]. The service provider may provide the service atomically (i.e. they have all the functionality in-house) or may sub-contract to obtain some or all of the component services. If this happens, the service provider becomes the user of another service and the above process is repeated. This forms a supply chain from atomic service providers at the bottom to the end-user of the software at the top. The iterative aggregation and recursive nature of this process require a clear and explicit model of quality to be used as a reference by the multiplicity of stakeholders involved. The model should accommodate differences in the concerns of the different stakeholder groups, and cover both the quality of the software service itself and the quality of the software service provision process.
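As a rough illustration of this provision loop, the following sketch models provider selection against negotiated SLA terms; the SLA and Provider classes, their fields and the 0.99 availability floor are hypothetical names and values introduced only for this example.

# Minimal sketch of the provision model described above. The class and
# field names (SLA, Provider) are our own illustration, not an API
# defined by the paper.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class SLA:
    """Agreed non-functional terms, formalised after negotiation."""
    price: float
    max_response_ms: int
    availability: float          # e.g. 0.999

@dataclass
class Provider:
    name: str
    subcontractors: List["Provider"] = field(default_factory=list)

    def offer(self, request: str) -> SLA:
        # In reality this would be the outcome of negotiation; here we
        # return a fixed, illustrative offer.
        return SLA(price=10.0, max_response_ms=500, availability=0.99)

def select_provider(request: str, candidates: List[Provider]) -> Optional[Provider]:
    """Pick the cheapest offer that meets a hypothetical availability floor.
    A provider that sub-contracts becomes, in turn, the user of another
    service, so the same selection step recurs down the supply chain."""
    offers = [(p, p.offer(request)) for p in candidates]
    acceptable = [(p, o) for p, o in offers if o.availability >= 0.99]
    if not acceptable:
        return None
    return min(acceptable, key=lambda po: po[1].price)[0]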

A. The Need to Differentiate

Enabling the evaluation of the qualitative properties inherent within software services will allow service procurers to make informed decisions regarding their suitability. The process of service selection is initiated once the service procurer applies specific search parameters within a mature marketplace. The results may present the user with a number of candidate services that on the surface appear to satisfy the outlined request whilst in actuality failing to exhibit basic requirements. A further instance of this scenario is how the service procurer can differentiate amongst a large number of candidate services. Presented with search results, service procurers will ask five fundamental questions before selecting a service [25]:

1. What evidence is there that the software service is likely to operate accurately?
2. Has the software service been tested in a way that is relevant to my intended use?
3. What other usage has this software service had in areas similar to my intended use?
4. What will be the performance and implications of using this software service?
5. Is the service provider a trustworthy and competent candidate?

A potential resolution to differentiating among software services is the evaluation of quality properties embedded within the provision and application of software services. The development of a quality model specific to these needs is an appealing way of structuring these descriptions. Such a model would need to satisfy all of the usual characteristics that exist within software product evaluation, whilst addressing the specific needs that are essential to the provision and delivery of services.

III. FORMULATING QUALITY WITHIN SOS

A. Current Approaches to Quality

Modeling quality is regarded as a difficult process [20]. This is often attributed to the lack of structured and widespread descriptions of the application domain, and to the suitability of generic quality models [1][21], which can hamper accurate descriptions and precise statements of quality requirements. Methodologies for building structured quality models help to address these drawbacks [17][18]. Currently there are two distinct approaches to representing quality, often referred to as software product-oriented and service-oriented models. Software product-oriented quality models [34] are generally hierarchical in nature and aim to attain tangible measures. They are increasingly prominent within manufacturing and, in particular, the field of software engineering [1][20]. This is in contrast to service quality models, which work on the principle of expectation [2][19][22] and are applied principally in the field of marketing. Independently, these quality models have been developed for specific domains with rigid boundaries and do not lend themselves to the dynamic environment of SOS. Taken together, however, both approaches to modeling quality provide contributions that correspond to the needs of the SOS paradigm. The challenging problem is to develop a quality model capable of linking tangible product characteristics to intangible service properties.

B. Quality Issues within SOS

For a quality evaluation model to be effective within the premise of SOS, it must be capable of providing tangible results and be generically applicable. Designing and developing such a model involves the consideration of a wide range of quality issues. The shift from ownership to use removes many quality issues that were once relevant to software products. As a consequence, the need to evaluate traditional quality attributes is either eliminated or superseded by specific quality attributes that address both Service Quality (SQ) and Quality of Service (QoS), where:

- SQ refers to the attributes of the software service, independent of or prior to delivery, which are related to quality (e.g. reliability); and
- QoS refers to the attributes of service provision and delivery that are related to quality (e.g. reputation).

In the following discussion we raise the implications that arise when undertaking quality evaluation within the SOS paradigm. The result of this discussion, coupled with the review of related literature, was compiled into a list of characteristics (available at [35]) that were presented to the participants of user study one.

Traditional quality issues ignored. With the shift of emphasis from ownership to use, issues of product quality that were once relevant to the consumer become considerably less important. Services are bound, executed and disengaged following execution in order to permit the evolution of the system. In this view, the issue of maintainability becomes the concern of the service developer and not the service user [16]. In addition, SOS asserts the need for services to be distributed across different platforms and geographical locations [16]. The capability for services to co-exist independently of platform and environment also removes the need to measure portability.

Importance of documentation. One consequence of the shift from software ownership to service use concerns documentation. The need to provide technical documentation is replaced by the need to produce user documentation detailing the operational procedures of the invoked system. The quality of the documentation provided will be particularly valuable, as the invocation of a service may require the user to navigate through a vast array of differing service interfaces.

User involvement. The intricate nature of quality assessment may require users to engage independent quality specialists to assist in the evaluation of services. Nevertheless, the process of quality evaluation must be quick and simple to undertake. In addition, the SOS paradigm presents the user with the prospect of influencing the QoS associated with specific service providers. Although services disengage following use, service providers remain constant. This permits service users to rate the level of quality they perceive to have attained from service providers. Satisfaction ratings can help differentiate amongst service providers and facilitate the selection of software services.

Development cultures. The SOS paradigm advocates the development of services from atomic sources. This form of development poses an interesting challenge in evaluating the quality of software services: conducting quality evaluation may require access to restricted information from a number of atomic service elements whose owners may not wish, or be able, to provide it. Failing to obtain comprehensive data from even one atomic source will negatively affect the overall quality assessment attained.

Evaluation process. A further factor arising from the above is the need for a process to guide the evaluation of software services. Quality evaluation must be subject to specific rules and regulations, since assessing the quality of software services will differ depending on the access and restrictions imposed by service providers and atomic service developers. Differing circumstances will require specific ways in which software services can be evaluated.

C. User Study One: Exploration Phase

To explore user views on quality within SOS and to refine the list of characteristics obtained from the literature review, we conducted a user study involving 15 users. Our choice of research method was motivated by the need to elicit and communicate user insights and needs about service quality from different perspectives. Participants were divided into 5 separate groups of 3 people, where each person was explicitly asked to play a specific role. Role-play is a technique used in various fields such as training [27] and psychotherapy [26], where participants are instructed to change their behavior by acting out an adopted role consciously or filling a social role unconsciously [28]. The use of role-play extends to HCI areas such as the design process of interactive systems [29]. It is regarded as a narrative approach for understanding users and evoking their views and experience [28], allowing designers to see their products from the target user's perspective [29]. In our study, we combined role-play with brainstorming, a research design method [30], for an improved understanding of user perspectives on service quality.

For the purpose of our study we devised three roles (adopted from the SOS lifecycle) within the context of realistic user scenarios, as follows:

1. Service procurer: you will request a payment processing service capable of generating invoices via a report, and would like this information to be delivered via email and printed.
2. Service provider: you will develop a payment processing service capable of generating invoices via a report. As a priority, this service needs to print address labels and have addressing components that can validate and improve the accuracy of invoice addressing.
3. Third-party service evaluator: you will evaluate the quality of the payment processing service as requested by both the service provider and the procurer.

In each group, we asked one person to fulfill each role. It is worth noting that prior to the study we recruited and assigned participants to the aforementioned roles (i.e. service procurer, provider, and evaluator) based on background profiles matching the role they were to play. For instance, software developers with at least 5 years of programming experience were assigned the role of service provider. Participants with an intermediate software development background were assigned the role of service evaluator. Finally, participants with no programming background were assigned the role of service procurer. For consistency, the same facilitator briefed the 5 groups to explain the purpose of the study and instructed each group to perform the following tasks:

- Task 1. Discuss what is meant by quality within the SOS context and infer any associated issues.
- Task 2. Select, from an extended list of quality characteristics collated from the quality model literature, the relevant characteristics according to the role played.

The tasks aimed at capturing the diverse views held towards the quality of software services, the identification of appropriate characteristics and their categorization. The facilitator moderated and controlled the flow of discussions to ensure participants focused on the tasks outlined above. The study lasted approximately 1 hour and the group discussions were recorded for follow-up analysis. The role-play tasks generated a wealth of qualitative data encapsulating diverse user views about what SOS means and what constitutes quality. Transcriptions of the discussions were placed into a spreadsheet in the form of statements. Next, the thematic analysis technique [31] was used to analyze the data. First, the transcribed statements were coded to describe their precise meaning. These codes were then categorized and grouped into general themes.

D. User Study One: Results

A total of 156 statements relating to the quality of software services emerged from the data. Service providers generated the highest number of these statements (39.75%), followed by service evaluators (32.69%) and service procurers (27.56%). The group discussions yielded a myriad of definitions pointing to various service characteristics. The categorization of these definitions generated 24 general themes for service providers, 22 themes for service evaluators, and 20 themes for service procurers. We calculated the recurrence of these themes, ranked them, and extracted the top 5 emerging themes, as illustrated in TABLE I. All groups linked quality to the reliability and functionality of the service. However, service procurers focused more on user satisfaction, the ability to function in the expected way, and usability as the primary criteria for high-quality software services.
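The counting and ranking step described above amounts to a simple frequency analysis; the sketch below reproduces it on toy data (the theme labels are taken from TABLE I, the statement list itself is invented for illustration).

# Illustrative sketch of the theme ranking used in the analysis:
# count how often each coded theme recurs per role, take the top 5,
# and report what share of all statements they cover.
from collections import Counter

def top_themes(coded_statements, k=5):
    """coded_statements: list of theme labels, one per transcribed statement."""
    counts = Counter(coded_statements)
    top = counts.most_common(k)
    share = sum(n for _, n in top) / len(coded_statements) * 100
    return top, share

# Toy data only; the real study coded 156 statements across three roles.
procurer_statements = ["Satisfaction", "Functionality", "Usability",
                       "Responsiveness", "Reliability", "Satisfaction",
                       "Usability"]
print(top_themes(procurer_statements))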

One notable strategy service providers followed when defining the quality of software services was to go beyond definitions and describe how certain characteristics could be measured. For instance, one developer reported: "I could measure reliability through a tracking system". Service developers also discussed how and why certain characteristics should not be evaluated by certain end users, for instance: "responsiveness is very difficult for the evaluator to judge; it is the role of the consumer" and "satisfaction is very subjective and can only be stated and assessed by the consumer". Another interesting point raised is that some quality characteristics are not instantaneous but are acquired over time via the active involvement of the right target group, for instance: "satisfaction is something that builds up over time through consumer reviews" and "security is sometimes difficult for the evaluator to assess, as the provider would probably not want the information revealed".

TABLE I. TOP 5 THEMES FROM USER DEFINITIONS OF QUALITY WITHIN SOS (SHADES REFER TO AGREEMENT BETWEEN DIFFERENT GROUPS)

                        Provider          Evaluator        Procurer
Top 5 emerging themes   Reliability       Reliability      Satisfaction
                        Functionality     Functionality    Functionality
                        Requirements      Usability        Usability
                        Responsiveness    Security         Responsiveness
                        Security          Documentation    Reliability
Overall # of themes     24                22               20
% of top 5 themes       33.87%            41.17%           48.84%

Evaluators emphasized the need to understand the requirements and views of the service procurer in order to successfully match them to appropriate candidate services, for instance: "we need to work out the procurer profile and make sure that this is fulfilled". This view is reflected in TABLE I, where evaluators indicated that the presence of adequate documentation is quite important to service quality. Service procurers were primarily concerned with whether the service works and meets their needs, and were not interested in the inner workings of the software service. One service procurer indicated that "certification and following standards" is one key feature when choosing a service. On certain occasions, reliability was directly linked to trust, indicating that trusting the service provider is central to establishing reliability.

In the second task, each participant was given a list of 25 quality characteristics and their corresponding definitions, and was instructed to select, independently of the other group members, only those relevant to the quality of SOS. As in the first task, we counted the participants' selections of each quality characteristic and calculated and ranked the percentage of selection for each role. Service procurers' selections, reported in TABLE II, were more consistent with their study-one themes than those of the provider and evaluator groups, with satisfaction, usability and responsiveness again chosen as key elements of quality within SOS. The other two groups focused more on the reliability, accessibility and availability of services. The average selection rate of the top 5 characteristics exceeded 84% for each role.

TABLE II. TOP 5 CHARACTERISTICS RELEVANT TO SOS AS SELECTED BY PARTICIPANTS

                                 Provider         Evaluator        Procurer
Top 5 selected characteristics   Reliability      Accessibility    Satisfaction
                                 Accessibility    Availability     Reputation
                                 Availability     Regulatory       Responsiveness
                                 Efficiency       Security         Usability
                                 Security         Reliability      Empathy
Average % of selection           84.43%           84.43%           88.88%

IV. QUALITY MODEL

In response to the findings outlined in the literature review and user study one, we propose a quality evaluation process that is open to individuals with different skills, expectations and motivations. This permits different access constraints, technical concerns and regulatory conditions to be applied to the evaluation of software services, and makes the approach effective in dealing with the dynamic needs of the SOS paradigm. Representing quality evaluation through different perspectives clarifies the constraints imposed on the different parties that wish to undertake quality evaluation. For example, the assessment of complex and important software services may place greater emphasis on quality features that a service procurer cannot evaluate due to access constraints. In such circumstances, it is beneficial for the service procurer to be aware of the quality features they are permitted to evaluate. This approach organizes the quality characteristics, sub-characteristics, attributes and metrics applicable to assessing software services within three distinct evaluation perspectives. The first administers quality evaluation from the perspective of the service procurer and, as a result, does not undertake measures that relate to the internal aspects of SOS. The second focuses on the quality evaluation process from the perspective of the service provider, and has been constructed to exclude the views of the user. The third and final perspective views quality evaluation from the viewpoint of an independent evaluator and is consequently comprehensive in its representation of quality. Our approach provides a systematic way to undertake quality evaluation, enabling service procurers to make objectively informed decisions regarding the suitability of candidate services. We have developed the Service-Oriented Software Quality Evaluation Model (SOSQEM), presented in Figure 3, a unifying approach that enables the evaluation of the multiple quality characteristics inherent within the provision and application of software services.


Figure 3. SOS Quality Evaluation Model.

The architecture of SOSQEM, identifying the relationships between the high-level quality characteristics and their corresponding quality sub-characteristics, items, attributes, metrics and evaluation perspectives, is presented in its entirety at [35] (due to its size), whilst TABLE III. presents a summary overview of the primary quality characteristics.

TABLE III. QUALITY CHARACTERISTICS OF SOSQEM

Service Quality Characteristics:
- Functionality: refers to the capability of a service to provide functions which meet stated needs. Sub-characteristics: suitability, accuracy and security.
- Reliability: refers to the capability of a service to maintain a specified level of performance. Sub-characteristics: maturity, fault tolerance and recoverability.
- Usability: refers to the capability of a service to be understood, learned, used and attractive when utilized. Sub-characteristics: understandability, learnability, operability, attractiveness and documentation.
- Efficiency: refers to the capability of a service to provide appropriate levels of performance when invoked. Sub-characteristics: time behavior and resource utilization.

Quality of Service Characteristics:
- Service Responsiveness: refers to the capability of the service provider to demonstrate willingness to assist and provide prompt service.
- Service Reputation: refers to the credibility of the service provider.
- Service Assurance: refers to the capability of the service provider to inspire trust and confidence.
- Service Empathy: refers to the capability of the service provider to deliver individualized attention to the user.

The SQ features map onto the objectively assessable properties that exist within service applications. These properties include the characteristics and attributes of the service which demonstrate quality prior to service delivery. Figure 4 presents a partial top-down view of the quality characteristic Usability. The SQ features applied in our model have been adapted from ISO/IEC 9126 and tailored to meet the needs presented by the SOS paradigm. They provide a solid foundation for, and understanding of, what constitutes SQ. With the shift of emphasis from ownership to use, issues of product quality that were once relevant to the consumer are considerably less important, and as a result a number of modifications have been made (e.g. removal of maintainability and portability).

Figure 4. Partial Top-Down Structure of Usability.
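One plausible way to represent such a hierarchy in code, together with the evaluation perspectives SOSQEM attaches to it, is sketched below; the attribute and metric names are our own illustrative assumptions, while the characteristic and sub-characteristic names follow TABLE III and Figure 4.

# Sketch of the hierarchical SQ side of SOSQEM: characteristic ->
# sub-characteristics -> metrics, each tagged with the evaluation
# perspectives permitted to assess it. Metric names are illustrative.
from dataclasses import dataclass, field
from typing import List

PROCURER, PROVIDER, EVALUATOR = "procurer", "provider", "evaluator"

@dataclass
class Metric:
    name: str
    perspectives: List[str]          # who may collect this measure

@dataclass
class SubCharacteristic:
    name: str
    metrics: List[Metric] = field(default_factory=list)

@dataclass
class Characteristic:
    name: str
    sub: List[SubCharacteristic] = field(default_factory=list)

usability = Characteristic("Usability", sub=[
    SubCharacteristic("Understandability",
        [Metric("time to first successful invocation", [PROCURER, EVALUATOR])]),
    SubCharacteristic("Learnability",
        [Metric("documentation coverage", [EVALUATOR])]),
    SubCharacteristic("Operability"),
    SubCharacteristic("Attractiveness"),
    SubCharacteristic("Documentation"),
])

def visible_metrics(char: Characteristic, perspective: str):
    """Filter the model down to what a given stakeholder may evaluate."""
    return [(s.name, m.name) for s in char.sub
            for m in s.metrics if perspective in m.perspectives]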

In contrast, QoS looks beyond the technical features of a software service and measures its provision. Evaluation stems from the comparison of what service users feel should be provided (i.e. their expectations) with their perceptions of the service provider's performance. In line with the conventional approach to QoS evaluation, perceived quality within the premise of SOS can be viewed as the degree and direction of discrepancy between the perception and expectation of the user [2]. This aspect of the quality model is represented as four quality dimensions, each with multiple items that have been recast into two statements. One statement corresponds to expectation measures regarding the service provider being evaluated, whilst the second measures the perception of the particular service provider whose service quality is being assessed. As an example, Figure 5 presents a top-down view of service reputation and its corresponding sets of statements. Service perception measures can be obtained through an evaluator's experience of a service (e.g. a trial invocation) or from the service provision history of the service provider. As service providers are constant entities within the SOS marketplace, it is assumed that past perception measures will be made available to the service evaluator.
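A minimal sketch of this expectation/perception comparison is given below, assuming SERVQUAL-style scoring in which each item's gap is perception minus expectation; the dimension groupings shown and all scores are invented for illustration.

# Sketch of the gap computation implied above: each QoS item is scored
# twice (expectation E and perception P) and perceived quality is the
# signed discrepancy P - E, averaged per dimension.
def gap_scores(items):
    """items: {dimension: [(expectation, perception), ...]} on a 1-7 scale."""
    return {
        dim: sum(p - e for e, p in pairs) / len(pairs)
        for dim, pairs in items.items()
    }

example = {
    "Service Reputation":     [(6, 5), (7, 6)],   # provider falls short
    "Service Responsiveness": [(5, 6)],           # provider exceeds expectation
}
print(gap_scores(example))   # negative => expectation not met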

SOSQEM combines the properties that exhibit quality within traditional software applications with those that are specific to service industries to provide a multi-perspective approach for the evaluation of software services. This is an innovative solution that caters to the ethos of SOS. It reduces the complexity of undertaking evaluation, as the accessibility and openness of assessment are derived through evaluation perspectives. Defining an evaluation perspective enables evaluators to identify and tailor their quality requirements so as to obtain subjective measures within objective parameters.


Figure 5. Top-Down Structure of Service Reputation.

V. USER STUDY TWO: EVALUATION PHASE

The second study took the form of a contextual interview [32] and aimed to collect the characteristics and features participants rely on when selecting services within a realistic context. To this end, we recruited 12 software professionals and 12 general users to search for and select three target services using the SOA4All [23] visual service search engine (Figure 6). Using the think-aloud protocol [33], participants were encouraged to verbalize their thoughts and selection criteria during the tasks. In this study we were not concerned with performance data (e.g. task completion time) but rather with how participants selected services and what issues they encountered during the search and selection process.

Figure 6. Service Search Engine.

The transcription of user thoughts yielded 32 statements from the developers and 40 statements from the general users. Developers raised the difficulty of interpreting and understanding the quality rating scale in the intended manner. They therefore felt that services should be supplied with sufficient documentation describing the functionality of the service, to ease the selection process (18.75% of the themes). Developers also highlighted that usability and interactivity are important features of software services (18.75%). Concerns raised by general users, in contrast, centered on their inability to differentiate between the quality of candidate services and to make correct judgments regarding their suitability. From a general-user perspective, reading consumer reviews and experiences is an important factor in making the right decision (17.5% of the themes). Another key criterion is the provision of a general/overall rating and a detailed rating of all characteristics relevant to the service (17.5% of the themes). Usability and the level of service complexity also impact the quality of software services (12.5%).

Following the service search task, we asked the participants to rate the importance of the 8 quality characteristics that form SOSQEM when selecting software services on a 7-point Likert scale, where 7 = very important and 1 = not important at all. Statistical analysis (presented in Figure 7) shows no significant difference between the consumers' and developers' ratings of the selected quality characteristics. This indicates that, regardless of users' technical background, our choice of quality characteristics fits both user profiles. The average rating of 4 characteristics exceeded 6 out of 7 (i.e. above "important"), in the following order: reliability, functionality, usability, and efficiency. These results correspond with those of study one reported in Table II. Only the average importance rating of service empathy fell below 5 (mean = 4.79).

Figure 7. Average Rating of Importance of SOSQEM Characteristics.
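The group comparison summarized in Figure 7 can be reproduced along the following lines; the paper does not state which statistical test was applied, so a Mann-Whitney U test is shown here as one reasonable choice for ordinal Likert ratings, and the scores below are invented.

# Sketch of a developer-vs-consumer comparison of importance ratings
# for a single characteristic on a 7-point scale (toy data only).
from scipy.stats import mannwhitneyu

developers = [7, 6, 6, 7, 5, 6, 7, 6, 6, 7, 5, 6]   # e.g. "reliability"
consumers  = [6, 7, 6, 6, 7, 5, 6, 6, 7, 6, 6, 5]

stat, p = mannwhitneyu(developers, consumers, alternative="two-sided")
print(f"U={stat:.1f}, p={p:.3f}")   # p > 0.05 -> no significant difference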

VI. IMPLICATIONS OF MODEL AND FUTURE WORK

We have presented a quality evaluation model within a demand-led service-based architecture capable of assessing the value of software services. In formulating this model, we adopted an iterative user-centric process in which the model was initially constructed, then extended, refined and evaluated through two user studies. Our findings were twofold. Firstly, they demonstrate how and why service quality criteria can enable service procurers to make informed decisions regarding the suitability of candidate services. Secondly, the results show that whilst service procurers emphasized the importance of user satisfaction and the usability of services, service providers placed greater emphasis on low-level quality features such as reliability and functionality. Finally, we conclude that whilst many quality issues are the same for both products and services, a broader range of considerations must be accounted for when dealing with software services, as presented in our model. A shift to this approach has a number of implications for both service procurers and service providers, as summarized in TABLE IV. As part of our future work we aim to produce a quality assessment framework that encompasses everything from requirements capture to evaluation and result quantification, allowing this to be translated into a format capable of performing quality assessment semi-automatically. We expect that developing a high-level representation of requirements will be a major research issue, in conjunction with producing a lightweight yet satisfactory process for eliciting and quantifying them. In addition, we intend to apply this quality evaluation framework to the problem of composing travel information services into a personalized commuter guide, and to test this within the context of the EC-funded project MODUM, which aims to optimize commuter patterns by promoting sustainable modes of travel.

TABLE IV. BENEFITS OF THE SOS QUALITY EVALUATION MODEL

Service Providers:
- Benchmark for the development and provision of software services
- Provides evidence of fitting software service operation and provision
- Service providers are able to focus efforts and resources on characteristics relevant to their target end users
- Allows the identification of complementary services, which enhances the composition process

Service Procurers:
- Facilitates service selection by empowering service procurers to make informed decisions
- Increases service procurers' satisfaction and enriches the service interaction experience
- Builds trust and loyalty towards service providers, a very important principle for business excellence and continuation
- Broadens understanding and increases awareness of valuable SOS properties

VII. ACKNOWLEDGEMENTS

This study was partially funded by the EC project MODUM (http://modum-project.eu/).

REFERENCES

[1] ISO/IEC PDTR 9126-1: Software engineering – product quality – part 1: quality model, International standards organisation, 2002.

[2] A. Parasuraman, V.A. Zeithaml, and L.L. Berry, More on Improving Service Quality Measurement, Journal of retailing, vol. 69, p. 140-147, 1993.

[3] R. A. Johnson, B.C. Hardgrave, E.R. Doke, An industry analysis of developer beliefs about object-oriented systems development, ACM SIGMIS database, vol. 30, p. 47-64, 1999.

[4] I. Crnkovic, Component-based software engineering – new challenges in software development, Software focus, vol. 2, p. 127-133, Springer, 2001.

[5] A. Berson, Client/server architecture, Mcgraw-hill series on computer communications, 1992.

[6] T. Erl, Service-oriented architecture – concepts, technology, and design, Pearson education inc., 2005.

[7] L. Prechelt, et al., A controlled experiment in maintenance comparing design patterns to simpler solutions, IEEE transactions on software engineering, vol. 27, p. 1134-1144, 2001.

[8] J. O’Sullivan, D. Edmond, and A.T. Hofstede, Formal description of non-functional service properties, Technical FIT-TR-2005-01, Queensland university of technology, Brisbane (Australia), 2005.

[9] C. Lovelock, S. Vandermerwe, and B. Lewis, Services marketing, Prentice hall Europe, 1996.

[10] E. Newcomer, and G. Lomow, Understanding SOA with Web Services, Pearson education inc., 2005.

[11] S. J. Vaughan-Nichols, Web Services: Beyond the Hype, vol. 35, p. 18-21, 2002.

[12] M. Colan, Characteristics of service-oriented architecture, IBM technical report, 2004.

[13] M. Stal, Web services: beyond component-based computing, Communications of the ACM, vol. 45, p. 71-76, 2002.

[14] N. Mehandjiev, P. Layzell, and P. Brereton, Unravelling the complexities of interdisciplinary software engineering, Proceedings of the 11th annual international workshop on software technology and engineering practice, p. 214-221, 2003.

[15] J.J.M. Trienekens, J.J. Bouman, and M.V.D. Zwan, Specification of service level agreements: problems, principles and practices, Software quality journal, vol. 12, p. 43-57, 2004.

[16] K. Bennett, et al., An architectural model for service-based flexible software, Proceedings of the 25th international computer software and applications conference on invigorating software development, p. 137-142, 2001.

[17] J.P. Carvallo, X. Franch, and C. Quer, Building and using quality models for complex software domains, Technical report LSI-03-28-R, Dept. LSI (UPC), 05.2003.

[18] J.P. Carvallo, et al., COSTUME: a method for building quality models for composite cots-based software systems, Proceedings of the 4th international conference on quality software, p. 214 – 221, 2004.

[19] H. Blake, Measuring service quality, Proceedings of the 12th annual CSU-POM conference, Sacramento (USA), 2000.

[20] R.G. Dromey, A model for software product quality, IEEE transactions on software engineering, vol. 21, p. 146-162, 1995.

[21] R. Hendriks, R.V. Vonderen, and E.V. Veenendaal, Measuring software product quality, Software quality professional, 2002.

[22] H. Kang, Measuring the service performance of information technology departments: an internal service management approach, Proceeding of the 10th australasian conference on information systems, p.462-473, 1999.

[23] M. Zuccala, SOA4All in action: enabling a web of billions of services, Towards a service-based internet – lecture notes in computer science, vol. 6481, 2010

[24] D. Hall, et al., Taverna: a tool for building and running workflows of services, Nucleic acids research, vol. 34 (web service issue), 2006.

[25] A. Owrak., Quality evaluation within the premise of SOS, PhD thesis, Manchester Business School, 2009.

[26] A. S. Bellack, R. L. Morrison, D. Lamparski, Role play for assessing the social competence of psychiatric patients, Psychological Assessment: A Journal of Consulting and Clinical Psychology, vol 2(3), p. 248-255, 1990.

[27] M. Halapi and D. Saunders, Language teaching through role-play: a Hungarian view, Simulation and Gaming 33, p. 169-178, 2002.

[28] P. Wright and J. McCarthy, Empathy and experience in HCI, Proceedings of the twenty-sixth annual SIGCHI conference on Human factors in computing systems (CHI '08), ACM, New York, NY, USA, p. 637-646, 2008.

[29] B. Medler and B. Magerko, The implications of improvisational acting and role-playing on design methodologies, Proceedings of the 28th international conference on Human factors in computing systems (CHI '10), ACM, New York, NY, USA, p.483-492, 2010.

[30] T. Kramer, G. P. Fleming, and S. M. Mannis, Improving Face-To-Face Brainstorming Through Modelling and Facilitation, Small Groups Research, 32, p.533-557, 2001.

[31] V. Braun and V. Clarke, Using thematic analysis in psychology, Qualitative Research in Psychology, vol 3, p. 77-101, 2006.

[32] H. Beyer, and K. Holtzblatt, Contextual Design: defining customer-centered systems, Morgan Kaufmann Publishers, 1998.

[33] C. H. Lewis, Using the thinking aloud method in cognitive interface design, IBM Research Report, RC-9265, 1982.

[34] B.W. Boehm, Characteristics of Software Quality, North-Holland Publishing Company, 1978.

[35] http://130.88.6.246/SCC2012/index.html
