WISE 2010, 13 Dec 2010, Hong Kong 1
On Identifying and Reducing Irrelevant Information in Service Composition and Execution

Hong-Linh Truong1, Marco Comerio2, Andrea Maurino2, Schahram Dustdar1, Flavio De Paoli2, Luca Panziera2

1Distributed Systems Group, Vienna University of Technology
2Department of Informatics, Systems and Communication, University of Milano-Bicocca

[email protected]
http://www.infosys.tuwien.ac.at/Staff/truong
Overview
- Motivation
- Irrelevant information problems in service composition and execution
- Enhancing context and quality support for information about services and their data
- Reducing services and resources by qualifying non-functional information
- Conclusions and future work
Motivating examples (1)
- In the Semantic WS Challenge 2009: http://sws-challenge.org/wiki/index.php/Scenario:_Logistics_Management
  - Logistics operators offer shipping services, each characterized by a set of NFPs (e.g., payment method, payment deadline, base price)
  - Assume 100 equivalent services, each offering 5 service contracts
- Without quality-of-information metrics for service contracts (e.g., completeness and timeliness):
  - Detecting irrelevant contracts is a time-consuming task
  - Missing automatic detection of irrelevant contracts potentially leads to wrong decisions
→ irrelevant service information in service composition
Motivating examples (2)
- In many cases, service composition is conducted before the composition is executed
- Because of the temporal distance between composition time and execution time, information about services in the composition may have become irrelevant by the time the composition is executed
  - QoS-based service adaptation is only one particular example
→ irrelevant service information in service execution
Motivating examples (3)
- Let's compose services (e.g., Flickr and YouTube) given context constraints (e.g., free for non-commercial purposes, country location) and quality of data and services
  - Missing or unstructured copyright policy and data licensing information cannot be used for automatic service selection
  - Without quality and context associated with provided data, the data cannot be selected and filtered automatically
→ irrelevant information in service usage
Our goals
- Examine possible irrelevant information problems in the context of service composition and execution
  - The typical lifecycle of service composition and execution
  - Information about services and their data: service description, quality of service, service context, quality of data, etc.
  - "Context" in our work: data/service usage rights/licensing and law enforcement, plus traditional context (e.g., location)
- Focus on data-intensive services (data-as-a-service)
  - Not just a service as a whole: the service provider may not be the data provider
- Common topics with the Web information community
Limitation of existing work
- Generic information-overload solutions are not targeted to information about services
- Tag cloud-based service filtering: unclear how tag clouds can describe the quality of information about services and their data
- Context- and QoS-based service selection approaches:
  - Do not consider quality of data
  - Do not consider context, QoS, and QoD together
  - Often assume high-quality service information
  - Do not distinguish the service level from the data resource level
Service composition & execution
Information types exchanged during composition and execution (from the figure):
- Type A: requirements about service and data schemas, NFPs, documentation, service contracts, and provenance information
- Type B: service and data schemas, NFPs, documentation, service contracts, and provenance information
- Type C: information about the composite service and data provided by the composite service
- Type D: data delivered by data-intensive services
- Type E: information about data requested by the consumer
Irrelevant information problems
- Context and quality information models
  - Little support for data licensing/rights and quality of data (QoD) associated with services and data resources
- Context and quality information access APIs
  - No or limited description of data and service usage
  - No separate APIs for retrieving quality and context information of services and their data
  - No quality and context information associated with the requested data
- Context and quality evaluation techniques
  - Missing compatibility evaluation techniques for context and quality of composite services
Current research focus and practices

- Context and quality information models: often use only a fraction of context, provide little information about QoD, and rely on unstructured context and quality descriptions
- Access APIs: mainly static publishing; mainly QoS metrics at runtime, but typically at the level of the service as a whole
- Adaptive and context-aware algorithms: mainly for adapting individual services in a composition based on QoS and "traditional" context
  - Either for the consumer-to-service flow or the composite-service-to-service flow
  - The role of data concerns? Context and quality associated with data resources? → a common topic of Web services and Web resources
Our suggested roadmap
- Develop meta-level and domain-dependent semantic representations for quality and context information
  - Enrich traditional QoS and context parameters with data-specific parameters using the linked data model
- Develop context and quality information that can be accessed via open APIs for services and data resources
  - Support on-the-fly access to such information
  - Not just for the well-known "broken SOA triangle"
- Develop techniques for context and quality compatibility evaluation
  - Focus more on data/service licensing and QoD for service composition, based on their data and control dependencies
Reducing services and resources by qualifying non-functional information
- A particular solution for dealing with untrusted or low-quality information about services
  - Applies to information exchanged between services/service information systems, composition engines, composition tools, and developers
- Our initial solutions:
  - Use quality-of-data metrics to characterize service information (we currently utilize only some basic metrics)
  - Filter service information based on consumers' requests
  - Could be integrated with other solutions
Some QoD metrics (1)
Completeness = 1 − ‖NFP_min \ NFP_p‖ / ‖NFP_min‖
             (equivalently, ‖NFP_p ∩ NFP_min‖ / ‖NFP_min‖)

Timeliness = 1 − min(Age / ExpectedLifetime, 1)

- Completeness is one minus the ratio of missing values in the provided NFP information, NFP_p, relative to the expected minimum set of NFPs, NFP_min
- Timeliness specifies how current a non-functional property is
- Note: these metrics can also be associated with data provided by services and used for reducing irrelevant results
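The two metrics above can be sketched in a few lines of Python; the NFP names and the numbers below are illustrative only, not taken from the experiments:

```python
def completeness(nfp_provided: set, nfp_min: set) -> float:
    """1 - |NFP_min \\ NFP_p| / |NFP_min|: share of the expected
    minimum NFP set that the provided information covers."""
    if not nfp_min:
        return 1.0
    return 1 - len(nfp_min - nfp_provided) / len(nfp_min)

def timeliness(age: float, expected_lifetime: float) -> float:
    """1 - min(Age / ExpectedLifetime, 1): 1 means fresh, 0 means expired."""
    return 1 - min(age / expected_lifetime, 1)

# A contract providing 2 of the 4 expected NFPs:
print(completeness({"price", "payment_method"},
                   {"price", "payment_method", "payment_deadline", "insurance"}))  # 0.5
print(timeliness(age=30, expected_lifetime=120))  # 0.75
```

Both metrics stay in [0, 1], so they can be compared directly against the thresholds used later for filtering.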
Some QoD metrics (2)
Interpretability = Σ_i (score(category_i) × w_i) / Σ_i w_i

- Interpretability specifies the availability of documentation and metadata for the correct interpretation of service information

| Category      | Service information                       | Examples                                       |
|---------------|-------------------------------------------|------------------------------------------------|
| schema        | conceptual service and data schemas       | WSDL, SAWSDL, pre/post conditions, data models |
| documentation | documents                                 | API explanations, best practices               |
| NFP           | non-functional properties                 | categorization, location, QoS information      |
| contract      | service contracts and contract templates  | service level agreements based on NFPs         |
| provenance    | provenance information                    | versioning of schemas, NFPs, contracts         |

- Note: the evaluation of this metric requires different techniques for different categories
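As a sketch, the weighted average above can be computed as follows; the per-category scores and weights are made-up examples, since the slide does not prescribe concrete values:

```python
def interpretability(scores: dict, weights: dict) -> float:
    """sum(score(category_i) * w_i) / sum(w_i) over the assessed categories."""
    return sum(scores[c] * w for c, w in weights.items()) / sum(weights.values())

# Hypothetical per-category scores in [0, 1] and weights:
scores  = {"schema": 1.0, "documentation": 0.5, "NFP": 0.8, "contract": 0.0, "provenance": 0.2}
weights = {"schema": 3,   "documentation": 1,   "NFP": 2,   "contract": 2,   "provenance": 1}
print(round(interpretability(scores, weights), 2))  # 0.59
```

How each score(category_i) is obtained is exactly where the note applies: schema scores might come from parsing WSDL/SAWSDL, documentation scores from coverage checks, and so on.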
Filtering mechanisms
- Two types of filtering: interpretability-based and NFP-based
- NFP-based filtering:
  - Step 1: Extract and establish NFPmin and ExpectedLifetime from the developer's requirement
  - Step 2: Evaluate QoD metrics, e.g., Completeness and Timeliness
  - Step 3: Establish filtering thresholds based on QoD metrics
  - Step 4: Eliminate services whose information does not meet the conditions set up in Step 3
  - Step 5: Refine the filtering by repeating from Step 3
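The steps above can be sketched as follows; the service records, NFP names, and threshold values are hypothetical, and the metric formulas are the ones defined earlier:

```python
def nfp_filter(services, nfp_min, expected_lifetime, min_completeness, min_timeliness):
    """Steps 2-4: evaluate Completeness and Timeliness per service and keep
    only services whose information meets both thresholds."""
    kept = []
    for s in services:
        comp = 1 - len(nfp_min - s["nfps"]) / len(nfp_min)
        tml = 1 - min(s["age"] / expected_lifetime, 1)
        if comp >= min_completeness and tml >= min_timeliness:
            kept.append(s["name"])
    return kept

# Step 1: NFPmin and ExpectedLifetime come from the developer's requirement.
services = [
    {"name": "ShipFast", "nfps": {"price", "payment_method", "payment_deadline"}, "age": 10},
    {"name": "ShipOld", "nfps": {"price"}, "age": 200},
]
# Steps 3-5: pick thresholds, filter, then refine by re-running with new thresholds.
print(nfp_filter(services, {"price", "payment_method", "payment_deadline"}, 120, 0.6, 0.2))
# ['ShipFast']
```

"ShipOld" is eliminated in Step 4: its completeness (1/3) falls below the 0.6 threshold, so its contracts never reach the ranking phase.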
Experiment: filtering service contracts
http://bit.ly/ewsYZC
- Analyze the time required for ranking 500 WSML (Web Service Modeling Language) contracts:
  - without filters;
  - applying a filtering phase on completeness;
  - applying a filtering phase on timeliness;
  - applying a filtering phase on both completeness and timeliness.
- We performed two different experiments using an Intel(R) Core(TM)2 CPU T5500 1.66GHz with 2GB RAM and Linux kernel 2.6.33 (64-bit).
Experiments: filtering evaluation
Performance evaluation with thresholds: Completeness ≥ 0.6 (Filter 1) and Timeliness > 0.2 (Filter 2).
Experiments: filtering evaluation
With thresholds = {0, 0.2, 0.4, 0.6, 0.8, 1}, which correspond to {not required, optional, preferred, strongly preferred, required, strictly required}
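A minimal sketch of such a label-to-threshold mapping (the function name and dictionary layout are ours; the values and labels are the ones listed above):

```python
# Qualitative requirement labels mapped to the numeric filtering thresholds.
THRESHOLDS = {
    "not required": 0.0,
    "optional": 0.2,
    "preferred": 0.4,
    "strongly preferred": 0.6,
    "required": 0.8,
    "strictly required": 1.0,
}

def threshold_for(label: str) -> float:
    """Translate a developer's qualitative requirement into a QoD threshold."""
    return THRESHOLDS[label]

print(threshold_for("preferred"))  # 0.4
```

This lets developers state requirements qualitatively while the filter operates on numeric QoD values.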
Conclusions and future work (1)
- We have identified several irrelevant information problems in the lifecycle of service composition and execution
- We proposed 3 topics for enhancing context and quality support for information about services; a particular solution based on information quality metrics was illustrated
- Future work:
  - Systematically extend and evaluate specific QoD metrics for service information
  - Integrate our QoD-based solution with existing service selection and composition techniques/tools
Conclusion and future work (2)
http://www.dbai.tuwien.ac.at/sodp/

In order to reduce irrelevant information for data services, data-as-a-service publishing needs to combine forces from (Web) data management, SOC, and cloud/grid computing.