Optimizing Monitorability of Multi-cloud Applications


Optimizing Monitorability of Multi-cloud Applications

E. Fadda, P. Plebani, M. Vitali

Politecnico di Milano, Italy
Politecnico di Torino, Italy

Multi-cloud applications

[Figure: a multi-cloud application developer deploying VMs across several cloud providers]


Optimal deployment strategies usually take into account the performance and capabilities of cloud providers

MOTIVATION

Developers want to collect information about the behaviour of their applications deployed in clouds

GOAL: Deployment optimization based on the capabilities, quality, and cost of application monitoring data

Information on behaviour is obtained by gathering monitoring data. Not all cloud providers offer the same monitoring capabilities.

The approach

Monitorability: the possibility to measure and assess the behaviour of the deployed application

[Figure: the application developer asks for monitorability; the cloud providers offer monitorability]


Monitorability

Request (from the application developer):
● Requested list of dimensions: e.g., availability, CPU load
● Sampling time (not always)

+ capabilities and constraints

+ budget

Offer (from the cloud provider):
● Offered list of dimensions: e.g., availability, CPU load
● Sampling time

+ capabilities and constraints

+ cost

We want more

Usability
● Application developers can easily define their requirements
● Technical details should be hidden from the user

Extensibility

● Offering includes monitored dimensions
● … but also estimated (E) dimensions
● … and on-demand (M) dimensions
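To make the request and offer sides concrete, here is a minimal sketch of how they could be represented as data structures. The class and field names, the DimensionKind labels, and the example values are illustrative assumptions rather than the authors' actual model.

```python
# Minimal sketch of a monitorability request and offer (illustrative only).
from dataclasses import dataclass, field
from enum import Enum, auto

class DimensionKind(Enum):
    MONITORED = auto()   # directly measured by the provider
    ESTIMATED = auto()   # derived from other monitored metrics
    ON_DEMAND = auto()   # activated only when requested

@dataclass
class MonitorabilityRequest:             # attached to each VM by the developer
    dimensions: list[str]                # e.g. ["availability", "cpu_load"]
    sampling_time: float | None = None   # seconds; not always specified
    constraints: dict[str, str] = field(default_factory=dict)
    budget: float = 0.0

@dataclass
class MonitorabilityOffer:               # published by a cloud provider site
    dimensions: dict[str, DimensionKind]  # dimension -> how it is obtained
    sampling_time: float = 60.0
    constraints: dict[str, str] = field(default_factory=dict)
    cost: float = 0.0

req = MonitorabilityRequest(dimensions=["availability", "cpu_load"],
                            sampling_time=30.0, budget=10.0)
offer = MonitorabilityOffer(dimensions={"availability": DimensionKind.MONITORED,
                                        "cpu_load": DimensionKind.ESTIMATED},
                            sampling_time=60.0, cost=2.5)
```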

Approach feasibility

Different cloud providers can provide a different set of metrics.

Some cloud providers offer metrics with higher accuracy at a cost (e.g., Amazon CloudWatch, Paraleap CloudMonix)

Some monitoring systems can be extended with custom metrics (e.g., Nagios, PCMONS, Sensu)

Matchmaking

Offerings and Requests are submitted to a Cloud Broker in charge of finding the best deployment



Maximizing

● Dimensions coverage

● Quality of monitoring

Minimizing

● Cost

Example

● Number of VMs and metrics of interest
● Constraints on VM deployment
● Metrics offered by cloud providers

Additional information is required

Knowledge Base

● Dimensions: abstract information the user wants to collect
● Metrics: used to assess the dimension of interest
● Metric measurements: used to compose the metric and provided by probes
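As an illustration of how the knowledge base could relate dimensions, metrics, and metric measurements, here is a minimal sketch assuming a simple nested mapping; the dimension, metric, and probe names are invented placeholders.

```python
# Minimal sketch of the knowledge base: dimension -> metrics -> metric
# measurements (probes). All names are invented for illustration.
KNOWLEDGE_BASE = {
    "availability": {                                  # dimension
        "uptime_ratio": ["ping_probe"],                # metric -> measurements
    },
    "cpu_load": {
        "avg_cpu_utilization": ["cpu_probe", "hypervisor_counter"],
    },
}

def metrics_for_dimensions(dimensions):
    """Expand requested dimensions into the metrics (and the metric
    measurements behind them) stored in the knowledge base."""
    return {d: KNOWLEDGE_BASE[d] for d in dimensions if d in KNOWLEDGE_BASE}

print(metrics_for_dimensions(["availability", "cpu_load"]))
```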

Metrics estimation

Estimation is used to provide trends of a metric without the need to measure it.

Analysis of stored data to find relations between metrics. Represented through a Bayesian Network.

Vitali, Pernici, and O’Reilly, “Learning a goal-oriented model for energy efficient adaptive applications in data centers,” Information Sciences, 2015
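To make the estimation idea concrete, here is a toy sketch standing in for the learned Bayesian network of the cited work: a single hand-built conditional probability table relates a monitored metric (request rate level) to an unmeasured one (CPU load level), so the latter can be estimated without a dedicated probe. The variable names and probabilities are invented for illustration.

```python
# Toy stand-in for the learned Bayesian network: one conditional
# probability table P(cpu_load_level | request_rate_level).
CPT = {
    # request_rate_level: {cpu_load_level: probability}
    "low":  {"low": 0.8, "medium": 0.15, "high": 0.05},
    "high": {"low": 0.1, "medium": 0.30, "high": 0.60},
}

LOAD_VALUE = {"low": 0.2, "medium": 0.5, "high": 0.9}  # representative CPU loads

def estimate_cpu_load(request_rate_level: str) -> float:
    """Estimate the expected CPU load from a monitored request rate,
    without measuring CPU load directly."""
    dist = CPT[request_rate_level]
    return sum(prob * LOAD_VALUE[level] for level, prob in dist.items())

print(estimate_cpu_load("high"))  # ~0.71 with the toy numbers above
```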

Running optimization

STEP 1: The user specifies, for each VM, the dimensions or metrics they are interested in collecting, together with the desired accuracy

STEP 2: The set of metrics is extracted from the knowledge base starting from the requested dimensions

STEP 3: The optimization algorithm, a multi-objective MILP, is executed to find the set of feasible solutions, estimating the accuracy of each metric in each configuration

The optimization function

Assign VMs to sites to maximize:

monitored(m,s,v) + Δon_demand(m,s,v) + Δestimated(m,s,v)

and minimize cost

… and constraints
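As a concrete, simplified illustration of the matchmaking MILP (the later slides name Gurobi as the solver), here is a scalarized sketch using gurobipy: binary variables assign each VM to one site, a budget constraint bounds the monitoring cost, and a weighted objective trades monitoring coverage against cost. The coverage scores, costs, weight, and budget are invented placeholders, not the authors' formulation.

```python
# Scalarized sketch of the VM-to-site assignment MILP; assumes gurobipy
# is installed. All data values are invented placeholders.
import gurobipy as gp
from gurobipy import GRB

vms = ["vm1", "vm2"]
sites = ["siteA", "siteB"]
coverage = {("vm1", "siteA"): 0.9, ("vm1", "siteB"): 0.6,  # monitoring coverage/quality
            ("vm2", "siteA"): 0.4, ("vm2", "siteB"): 0.8}
cost = {"siteA": 3.0, "siteB": 1.0}                         # monitoring cost per VM
budget = 5.0
alpha = 0.5                                                 # weight between coverage and cost

model = gp.Model("monitorability_matchmaking")
x = model.addVars(vms, sites, vtype=GRB.BINARY, name="x")   # x[v, s] = 1 if VM v goes to site s

# each VM is deployed on exactly one site
model.addConstrs((x.sum(v, "*") == 1 for v in vms), name="assign")
# total monitoring cost within the developer's budget
model.addConstr(gp.quicksum(cost[s] * x[v, s] for v in vms for s in sites) <= budget)

# maximize coverage minus weighted cost: one scalarization of the
# multi-objective goal on the previous slides
model.setObjective(
    gp.quicksum(coverage[v, s] * x[v, s] for v in vms for s in sites)
    - alpha * gp.quicksum(cost[s] * x[v, s] for v in vms for s in sites),
    GRB.MAXIMIZE,
)
model.optimize()

for v in vms:
    for s in sites:
        if x[v, s].X > 0.5:
            print(v, "->", s)
```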

Accuracy computation

For monitored and on_demand metric measurements (mm), accuracy is:

accuracy(mm) = sensor sampling time / desired sampling time

For estimated metric measurements (mm), accuracy is:

accuracy(mm) = min ( sensor sampling time / desired sampling time )

∀ mm parents of the estimated mm

Accuracy computation

The accuracy of a metric (m) is:

accuracy(m) = min( accuracy(mm1), …, accuracy(mmn) )

∀ mm contributing to m
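The accuracy rules on these two slides can be written in a few lines. Below is a minimal sketch implementing exactly the ratios and min rules above; the function and parameter names and the example sampling times are illustrative assumptions.

```python
# Minimal sketch of the accuracy rules on the previous slides.

def mm_accuracy(sensor_period: float, desired_period: float) -> float:
    """Monitored / on-demand metric measurement: ratio between the
    sensor sampling time and the desired sampling time (as on the slide)."""
    return sensor_period / desired_period

def estimated_mm_accuracy(parent_periods: list[float], desired_period: float) -> float:
    """Estimated metric measurement: minimum of the same ratio over all
    metric measurements that are parents of the estimated one."""
    return min(mm_accuracy(p, desired_period) for p in parent_periods)

def metric_accuracy(mm_accuracies: list[float]) -> float:
    """Metric accuracy: minimum accuracy over all metric measurements
    contributing to the metric."""
    return min(mm_accuracies)

# Example: a metric composed of one monitored measurement (sampled every
# 30 s against a desired 60 s) and one estimated measurement whose
# parents are sampled every 30 s and 120 s.
acc = metric_accuracy([
    mm_accuracy(30, 60),
    estimated_mm_accuracy([30, 120], 60),
])
print(acc)
```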

Performance evaluation

Performance depends on the number of servers, the number of VMs, and the number of metrics per VM

Solver: Gurobi

Servers: Intel Core i7-5500U, 8 GB RAM

Validation

Sites: 7, VMs: 4, Metrics: 7
Response time: 19.2 sec


Future steps

● Improving accuracy evaluation
● Considering server capability in MILP
● New multi-objective goal: integrating performance

Optimizing Monitorability of Multi-cloud Applications

E. Fadda, P. Plebani, M. Vitali

Politecnico di Milano, Italy
Politecnico di Torino, Italy