Page 1: Run-time Monitoring-based Evaluation and Communication Integrity Validation of Software Architectures

Run-time Monitoring-based Evaluation and

Communication Integrity Validation of Software

Architectures

Ana Dragomir

02.12.2014

Motivation

State of the art vs. state of the practice

Goals

Approach

Evaluation

Summary and outlook

Page 2: Run-time Monitoring-based Evaluation and Communication Integrity Validation of Software Architectures

The ARAMIS Project

Motivation

“You don’t need architecture to build a dog kennel, but you’d better have some for a skyscraper”

Software architecture (SA) description is useful to understand

and meaningfully evolve a software system but…

“Is it just a shared hallucination?”

…the architecture drifts from its description!

The description:

is no longer as useful

can lead to misunderstandings

causes a domino effect

Page 3: Run-time Monitoring-based Evaluation and Communication Integrity Validation of Software Architectures

Problem Statement

SHOULD vs. IS

Page 4: Run-time Monitoring-based Evaluation and Communication Integrity Validation of Software Architectures

Stakeholders

Architects

Software Landscape Architects

Are the systems of the landscape interacting as prescribed?

Which systems are the hubs and sinks?

Software Architects

Is the system built according to its description?

Which architecture units are the hubs and sinks?

Which architecture units are “too complex”?

Software Developers

If I need to change a requirement, how can I quickly find out

how it is currently implemented?

What architecture/software units are interacting & how?

Page 5: Run-time Monitoring-based Evaluation and Communication Integrity Validation of Software Architectures

State of the Art

Related Work

Page 6: Run-time Monitoring-based Evaluation and Communication Integrity Validation of Software Architectures

State of the Art

Related Work

Page 7: Run-time Monitoring-based Evaluation and Communication Integrity Validation of Software Architectures

State of the Practice

Our Experience: Company 1

CMMI Level 3

More than 1000 IT Employees

Large, heterogeneous systems (Java EE, COBOL, …)

System architecture: decisions and architecture are documented, though separately

Code-quality monitors (e.g., SonarQube) are used

Enterprise architecture: inter-system information flow diagrams

Manual process

No architecture reconstruction tools! “Read the documentation, then start making phone calls”

Reconstruction is cumbersome and must be supplemented with support for evolution

Page 8: Run-time Monitoring-based Evaluation and Communication Integrity Validation of Software Architectures

State of the Practice

Our Experience: Company 2

Medium Enterprise

More than 500 IT Employees

Mainly Java-based systems

System architecture

“The Developer Handbook”

Low-level descriptions

Abstract view of the system missing

Joint attempt with SWC to reconstruct the architecture

Purchased Sonargraph Architect

Encountered terminology differences

Page 9: Run-time Monitoring-based Evaluation and Communication Integrity Validation of Software Architectures

State of the Art

Flaws

The reconstruction occurs on very different abstraction levels

Structural view: layers, modules, subsystems, etc.

Behavioral view: objects, methods, etc.

Heterogeneity of terminology is not addressed

Reconstruction tools have rigid meta-models

The architects expect results that conform to their terminology

Heterogeneity of systems is not properly addressed

The interplay of heterogeneous systems is important

Reconstruction is not a goal in itself

Are the units behaving as expected? What are the deviations?

Evolution support is needed!

Page 10: Run-time Monitoring-based Evaluation and Communication Integrity Validation of Software Architectures

Goals

Improve the traceability between usage scenarios, implementation

and architecture documentation

Develop/Use a minimally intrusive technical solution for run-time

monitoring and real-time visualization

Provide a means to evaluate if the predetermined architecture

rules have been respected

Provide an easily extensible solution

Page 11: Run-time Monitoring-based Evaluation and Communication Integrity Validation of Software Architectures

Multi-level Behavior Monitoring

[Diagram: multi-level behavior monitoring. Monitored calls (m1–m10) between low-level units (A, B, X, Y, …) grouped into the architecture units AA, XX, and MM are recorded with their From, To, Operation Name, and Parameters and forwarded from the monitored systems to the monitoring analysis. The same trace can then be filtered by interest, e.g. the communication between AA, XX, and MM; the communication within AA; or high-level violations (here involving AA and MM).]
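To make the recorded information concrete, here is a minimal sketch of what a single monitored call event could look like (Java, matching the Java-based systems discussed later). The class and field names are illustrative assumptions, not the actual ARAMIS data model.

// Illustrative sketch of one monitored call event, assuming each probe
// records the calling unit, the called unit, the operation name and the
// parameters of the call (as shown in the diagram above).
public final class MonitoredCall {
    private final String fromUnit;                     // e.g., a low-level unit such as "A"
    private final String toUnit;                       // e.g., a low-level unit such as "X"
    private final String operationName;                // e.g., "m2"
    private final java.util.List<String> parameters;   // textual form of the call arguments

    public MonitoredCall(String fromUnit, String toUnit, String operationName,
                         java.util.List<String> parameters) {
        this.fromUnit = fromUnit;
        this.toUnit = toUnit;
        this.operationName = operationName;
        this.parameters = java.util.List.copyOf(parameters);
    }

    public String getFromUnit() { return fromUnit; }
    public String getToUnit() { return toUnit; }
    public String getOperationName() { return operationName; }
    public java.util.List<String> getParameters() { return parameters; }
}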

Page 12: Run-time Monitoring-based Evaluation and Communication Integrity Validation of Software Architectures

Architecture

ARAMIS for Communication Integrity Checking and Evaluation

Page 13: Run-time Monitoring-based Evaluation and Communication Integrity Validation of Software Architectures

Architecture Meta-model

Page 14: Run-time Monitoring-based Evaluation and Communication Integrity Validation of Software Architectures

Metrics @ ARAMIS

Goal: find weak points in the analyzed behavior

Step 1: Characterize behavior

Step 2: Compare the result of Step 1 with the architects’ expectations

Metrics Categories

Behavioral coupling and cohesion metrics

Behavior hotspots

Violations-based metrics

Page 15: Run-time Monitoring-based Evaluation and Communication Integrity Validation of Software Architectures

ARAMIS

Coupling and Cohesion Metrics

Behavioral Coupling (BCo)

Behavioral Cohesion (BCh)

Scenario-based Unit Behavior Metric (SUB)

Cohesion vs. Coupling of an Architecture Unit

SUB = BCh / (BCh + BCo)

SUB Characterization (SUBC)

High coupling/low cohesion; SUB ∈ [0, 0.5)

Mid coupling/mid cohesion; SUB ∈ [0.5, 0.66)

Low coupling/high cohesion; SUB ∈ [0.66, 1]
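To illustrate how SUB and SUBC could be computed from the monitored calls, a minimal sketch follows, assuming BCh counts the calls that stay inside an architecture unit and BCo the calls that cross its boundary; the class, method, and enum names are hypothetical, not taken from ARAMIS.

// Hypothetical helper: computes SUB = BCh / (BCh + BCo) for one architecture
// unit and maps the value to the three SUBC ranges listed above.
public final class SubMetric {

    public enum Subc {
        HIGH_COUPLING_LOW_COHESION,   // SUB in [0, 0.5)
        MID_COUPLING_MID_COHESION,    // SUB in [0.5, 0.66)
        LOW_COUPLING_HIGH_COHESION    // SUB in [0.66, 1]
    }

    // bch: monitored calls staying inside the unit (behavioral cohesion, assumed)
    // bco: monitored calls crossing the unit boundary (behavioral coupling, assumed)
    public static double sub(long bch, long bco) {
        if (bch + bco == 0) {
            throw new IllegalArgumentException("unit has no monitored calls");
        }
        return (double) bch / (bch + bco);
    }

    public static Subc characterize(double sub) {
        if (sub < 0.5)  return Subc.HIGH_COUPLING_LOW_COHESION;
        if (sub < 0.66) return Subc.MID_COUPLING_MID_COHESION;
        return Subc.LOW_COUPLING_HIGH_COHESION;
    }
}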

Page 16: Run-time Monitoring-based Evaluation and Communication Integrity Validation of Software Architectures

Evaluation

Phase 1: validation of communication rules

The MosAIC software system

Incentive: an initial architecture description was available and there was

interest in evaluating its conformance

111753 LOC

116 classes, 10 packages

Defined 4 top-level units & 16 inner units

20 allowed rules

>100000 monitored calls

3 distinct architecture violations (frequencies: 1, 2, 22 resp.)

The result was used to improve the architecture

Increased confidence in MosAIC’s quality
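The check behind these numbers can be pictured roughly as follows: every monitored call between two different architecture units is looked up in the set of allowed communication rules, and calls with no matching rule are counted as violations. This is only a sketch under that assumption; the rule representation and the names below are illustrative, not the ARAMIS implementation.

// Illustrative check of monitored unit-to-unit calls against allowed rules.
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Set;

public final class RuleChecker {

    public record Call(String fromUnit, String toUnit) {}
    public record Rule(String fromUnit, String toUnit) {}   // "fromUnit may call toUnit"

    // Counts how often each disallowed unit-to-unit interaction occurred
    // (the slide above reports three such interactions, with frequencies 1, 2 and 22).
    public static Map<String, Long> violations(List<Call> calls, Set<Rule> allowed) {
        Map<String, Long> result = new HashMap<>();
        for (Call c : calls) {
            if (c.fromUnit().equals(c.toUnit())) continue;   // assumption: calls within a unit need no rule
            if (!allowed.contains(new Rule(c.fromUnit(), c.toUnit()))) {
                result.merge(c.fromUnit() + " -> " + c.toUnit(), 1L, Long::sum);
            }
        }
        return result;
    }
}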

Page 17: Run-time Monitoring-based Evaluation and Communication Integrity Validation of Software Architectures

Evaluation

Phase 2: relevance of the SUB and SUBC metrics

The JHotDraw framework

Incentive: presumably a well-designed architecture

126068 LOC

529 classes, 38 packages

Defined 12 top-level units but only 7 were used at run-time

No rules

> 150000 monitored calls

Page 18: Run-time Monitoring-based Evaluation and Communication Integrity Validation of Software Architectures

Evaluation

JHotDraw results

Page 19: Run-time Monitoring-based Evaluation and Communication Integrity Validation of Software Architectures

Conclusions

ARAMIS aims to:

Support the understanding of software systems

Validate their communication integrity

Assess their behavior

The current results are promising

Evaluated two projects

MosAIC

JHotDraw

Page 20: Run-time Monitoring-based Evaluation and Communication Integrity Validation of Software Architectures

Outlook

Develop relevant multi-level visualizations

Experiment with monitoring the interplay of heterogeneous systems

Model evolution scenarios

Simulation & Impact analysis

Explore further relevant metrics to capture the quality of the

architecture

Flexible quality model, tailorable according to the specific needs of the

architects

Page 21: Run-time Monitoring-based Evaluation and Communication Integrity Validation of Software Architectures

Summary

ARAMIS

Flexible approach towards

Monitoring the architecture

Validating its communication integrity according to specified rules

Assessing its quality

Supporting its meaningful evolution

