How A Consumer Can Measure Elasticity for Cloud Platforms

Mind Map by helainelins, created over 5 years ago

Year: 2012. Authors: Sadeka Islam, Kevin Lee, Anna Liu (National ICT Australia (NICTA) / University of New South Wales); Alan Fekete (University of Sydney / NICTA, Australia)

How A Consumer Can Measure Elasticity for Cloud Platforms
1 Introduction
1.1 IT Infrastructure
1.1.1 Cloud
1.1.1.1 Low-cost
1.1.1.2 Availability

Annotations:

  • Availability = the proportion of time a system is in a functioning condition. http://en.wikipedia.org/wiki/Availability
1.1.1.3 Elasticity

Annotations:

  • NIST definition: Capabilities can be rapidly and elastically provisioned, in some cases automatically, to quickly scale out, and rapidly released to quickly scale in. To the consumer, the capabilities available for provisioning often appear to be unlimited and can be purchased in any quantity at any time.
1.1.1.3.1 Pay only for what you need
1.1.1.3.2 Automatic provisioning
1.1.1.3.3 Quick scale
1.1.1.3.4 Unlimited
1.1.1.3.5 Any quantity
1.1.1.3.6 Any Time
1.1.1.3.7 Pay-as-you-grow
1.1.1.3.8 Elasticity ≠ Availability
1.1.1.4 How elastic is each system?
1.1.1.4.1 Benchmark
1.1.1.4.1.1 Measures
1.1.1.4.1.1.1 Existing today
1.1.1.4.1.1.1.1 Not explicit measurement of elasticity
1.1.1.4.1.1.2 Need to develop
1.1.1.4.1.1.2.1 Appropriate measures
1.1.1.4.1.1.2.1.1 QoS requirements
1.1.1.4.1.1.2.1.2 Contributions
1.1.1.4.1.1.2.1.2.1 Framework to measure Elasticity
1.1.1.4.1.1.2.1.2.2 Case studies
1.1.1.4.1.1.2.1.2.3 Insights that impact Elasticity
1.1.1.4.1.1.2.1.2.4 Understand Elasticity Behavior
1.1.1.4.1.1.2.1.3 Compare offerings
1.1.1.4.1.1.2.1.3.1 Benchmark
1.1.1.4.1.1.2.1.3.2 According to need
2 Related Work
2.1 Definition And Characteristics
2.1.1 Armbrust et al. - the value of Elasticity

Annotations:

  • M. Armbrust, A. Fox, R. Griffith, A. Joseph, R. Katz, A. Konwinski, G. Lee, D. Patterson, A. Rabkin, I. Stoica and M. Zaharia - A view of Cloud Computing. Communications of the ACM, 2010
2.1.2 NIST - rapidly (de)provisioning
2.1.3 David Chiu and Ricky Ho - (de)commission immediately

Annotations:

  • David Chiu - ACM Crossroads, Vol. 16, No. 3 (2010), pp. 3-4. Ricky Ho - http://horicky.blogspot.com/2009/07/between-elasticity-and-scalability.html
2.2 Elasticity Measurement Model
2.2.1 Weinman measurement of elasticity

Annotations:

  • J. Weinman - Time is Money: The Value of "On-Demand" www.joeweinman.com/Resources/Joe_Weinman_Time_Is_Money.pdf Jan/2011
2.2.1.1 Demand curve (D)
2.2.1.1.1 Time (t)
2.2.1.1.2 Resource (R)
2.2.1.2 Situations
2.2.1.2.1 Perfect Elasticity: R(t) = D(t)
2.2.1.2.2 Overprovisioning: R(t) > D(t)
2.2.1.2.3 Underprovisioning: R(t) < D(t)
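Weinman's model compares the resource curve R(t) with the demand curve D(t) at each instant. A minimal sketch of that comparison (the sample curves are hypothetical, chosen only to illustrate the three situations):

```python
def classify_provisioning(resources, demand):
    """Classify each time step per Weinman's model:
    R(t) = D(t) -> perfect, R(t) > D(t) -> over, R(t) < D(t) -> under."""
    labels = []
    for r, d in zip(resources, demand):
        if r == d:
            labels.append("perfect")
        elif r > d:
            labels.append("over")
        else:
            labels.append("under")
    return labels

# Hypothetical curves: resource allocation lags a rising demand.
R = [2, 2, 4, 4, 6]
D = [2, 3, 4, 5, 6]
print(classify_provisioning(R, D))
# ['perfect', 'under', 'perfect', 'under', 'perfect']
```

The paper's framework refines this binary view with penalties, since the consumer cares about how much each deviation costs, not only that it occurred.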
2.2.2 Proposed Modifications
2.2.2.1 Real data Workload
2.2.2.2 Include penalties (unsatisfactory performance)
2.2.2.3 QoS based
2.2.2.4 Allow SLA
2.2.2.5 allocated resources x charged resources
2.2.2.6 Unified metric to summarize
2.3 Cloud Performance and Benchmarks
2.3.1 Existent Works
2.3.1.1 Stantchev et al. - generic benchmark to evaluate nonfunctional properties (cost-benefit)

Annotations:

  • V. Stantchev - Performance evaluation of cloud computing offerings.  IEEE AdvComp 2009
2.3.1.2 Dejun et al. and Schad et al. - evaluate performance characteristics of cloud infrastructure (without variation during provisioning)

Annotations:

  • J. Dejun, G. Pierre, and C. Chi - EC2 performance analysis for resource provisioning of service-oriented applications. ICSOC Workshops 2009. J. Schad, J. Dittrich, and J.-A. Quiané-Ruiz - Runtime measurements in the cloud: observing, analyzing, and reducing variance. PVLDB 2010
2.3.1.3 HP Labs - measurements of quality features (cloud platforms perspective)

Annotations:

  • C. Bash, T. Cader, Y. Chen, D. Gmach, R. Kaufman, D. Milojicic, A. Shah, and P. Sharma - Cloud sustainability dashboard, dynamically assessing sustainability of data centers and clouds. HP Labs Technical Report HPL-2011-148, 2011
2.3.1.4 Srinivasan et al. and Huang et al. - compare data center migration techniques

Annotations:

  • K. Srinivasan, S. Yuuw and T. Adelmeyer - Dynamic VM migration: assessing its risks & rewards using a benchmark. ICPE 2011. D. Huang, D. Ye, Q. He, J. Chen, and K. Ye - Virt-LM: a benchmark for live migration of virtual machine. ICPE 2011
2.3.1.5 Yigitbasi et al. - evaluate performance overheads and scaling latency of VM instances

Annotations:

  • N. Yigitbasi, A. Iosup, D. Epema, and S. Ostermann - C-Meter: a framework for performance analysis of computing clouds. CCGrid 2009
2.3.1.6 Li et al. - propose CloudCmp: user perceived performance and cost effectiveness with fine granularity

Annotations:

  • A. Li, X. Yang, S. Kandula, and M. Zhang - CloudCmp: comparing public cloud providers. IMC 2010
2.3.1.7 YCSB - evaluate performance of cloud databases (workloads and elasticity; does not evaluate de-provisioning and resource-granularity aspects; captures traditional performance but not financial implications)

Annotations:

  • B. Cooper, A. Silberstein, E. Tam, R. Ramakrishnan, and R. Sears - Benchmarking cloud serving systems with YCSB. SoCC 2010
2.3.1.8 Kossmann et al. - compare with a set of performance and cost metrics: throughput, performance/cost ratio, and cost predictability (omits the speed of responding to change; does not consider workload shrink and grow)

Annotations:

  • D. Kossmann, T. Kraska, and S. Loesing - An evaluation of alternative architectures for transaction processing in the cloud SIGMOD 2010
2.3.2 Proposed Work
2.3.2.1 Evaluate Elasticity from user perspective
2.3.2.2 Impact of Imperfection of Elasticity based on consumers' business situation.
2.3.2.3 Evaluate perceived performance and cost effectiveness with coarse granularity

Annotations:

  • Could impact the metric's expression.
3 Elasticity Measurement
3.1 Framework
3.1.1 (sum) Penalties
3.1.1.1 Workload Penalties
3.1.1.1.1 overprovisioning faults
3.1.1.1.2 underprovisioning faults
3.2 Penalty model
3.2.1 Identify resources
3.2.2 Identify resources metrics
3.2.3 Identify QoS metrics
3.2.4 over-provisioning penalties: R(t) > D(t)
3.2.5 under-provisioning penalties: R(t) < D(t)
3.2.6 Execution total penalty rate
3.3 Single Figure of Merit for Elasticity
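The penalty model above sums over-provisioning penalties (paying for idle capacity) and under-provisioning penalties (QoS/SLA violations) over a run, then condenses the result into a single figure of merit. A hypothetical sketch of such a computation; `over_cost`, `under_cost`, and `worst_case_rate` are illustrative parameters, not the paper's exact formulation:

```python
def total_penalty_rate(resources, demand, over_cost=1.0, under_cost=2.0):
    """Per-step penalties: idle capacity when R(t) > D(t),
    shortfall (QoS violation) when R(t) < D(t), averaged over the run."""
    penalty = 0.0
    for r, d in zip(resources, demand):
        if r > d:
            penalty += over_cost * (r - d)   # charged for unused resources
        elif r < d:
            penalty += under_cost * (d - r)  # penalty for unmet demand
    return penalty / len(demand)             # total penalty rate per step

def elasticity_score(resources, demand, worst_case_rate):
    """Single figure of merit: 1.0 = perfectly elastic (zero penalty),
    0.0 = as bad as a worst-case (e.g. static) provisioning policy."""
    rate = total_penalty_rate(resources, demand)
    return max(0.0, 1.0 - rate / worst_case_rate)
```

Under-provisioning is weighted more heavily here (`under_cost > over_cost`) to reflect the intuition that lost requests usually cost a consumer more than idle instances; the paper derives such weights from the consumer's business situation and SLA.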
3.4 Choices for an Elasticity Benchmark
3.4.1 Elasticity Score
3.4.1.1 SLA objectives
3.4.1.1.1 F. Nah Study and Weinman

Annotations:

  • F. Nah - A study on tolerable waiting time: how long are web users willing to wait? Behaviour & Information Technology 2004. J. Dejun, G. Pierre, and C. Chi - EC2 performance analysis for resource provisioning of service-oriented applications. ICSOC 2009
3.4.1.2 Metrics
3.4.1.2.1 EC2 - CPU capacity
3.4.1.3 Workloads
3.4.1.3.1 Sinusoidal
3.4.1.3.2 Sinusoidal with Plateau
3.4.1.3.3 Exponentially Bursting
3.4.1.3.4 Linearly Growing
3.4.1.3.5 Random
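The five workload shapes above can be generated synthetically for benchmarking. A sketch of such generators; the function names, parameters, and defaults are illustrative assumptions, not the paper's actual driver:

```python
import math
import random

def sinusoidal(n, base=50, amp=30, period=24):
    """Periodic demand, e.g. a daily usage cycle."""
    return [base + amp * math.sin(2 * math.pi * t / period) for t in range(n)]

def sinusoidal_with_plateau(n, base=50, amp=30, period=24, cap=70):
    """Periodic demand whose peaks flatten at a saturation level."""
    return [min(v, cap) for v in sinusoidal(n, base, amp, period)]

def exponential_burst(n, base=10, rate=0.2):
    """Rapidly bursting demand, e.g. a flash crowd."""
    return [base * math.exp(rate * t) for t in range(n)]

def linearly_growing(n, base=10, slope=2):
    """Steady growth at a constant rate."""
    return [base + slope * t for t in range(n)]

def random_workload(n, low=10, high=90, seed=0):
    """Unpredictable demand with no periodicity or trend."""
    rng = random.Random(seed)
    return [rng.uniform(low, high) for _ in range(n)]
```

These shapes exercise the workload characteristics listed below (periodicity, growth and decay rate, randomness), so an elasticity score can be reported per shape rather than as a single average.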
3.4.1.4 Penalties
3.4.1.4.1 Latency
3.4.1.4.2 Availability
3.4.1.5 Workload characteristics
3.4.1.5.1 Periodicity
3.4.1.5.2 Growth and decay rate
3.4.1.5.3 Randomness
3.5 Implementation
3.5.1 Experimental Setup