In agile, fast, and continuous development lifecycles, software performance analysis is fundamental to confidently releasing continuously improved software versions. Researchers and industry practitioners have identified the importance of integrating performance testing into agile development processes in a timely and efficient way. However, existing techniques are fragmented and are not integrated in a way that accounts for the heterogeneous skills of users developing polyglot distributed software, or for their need to automate performance practices so that they fit into the whole lifecycle without breaking its intrinsic velocity. In this paper we present our vision for holistic continuous software performance assessment, which is being implemented in the BenchFlow tool. BenchFlow enables performance testing and analysis practices to be pervasively integrated into continuous development lifecycle activities. Users specify performance activities (e.g., standard performance tests) by relying on an expressive Domain Specific Language for objective-driven performance analysis. Collected performance knowledge can thus be reused to speed up performance activities throughout the entire process.
My talk from The International Workshop on Quality-aware DevOps (QUDOS 2017). Cite us: http://dl.acm.org/citation.cfm?id=3053636
15. Limitations of Current Solutions
- Rarely automating the end-to-end process
- Not integrated in CSA (e.g., they do not leverage continuous feedback)
(CSA: Continuous Software Assessment)
17. DevOps: Professional Profiles Perspective
Different professional profiles (e.g., developers, Q/A and operations engineers) bring heterogeneous and cross-cutting skills.
38. Objectives Taxonomy
- Base Objectives (Test Types): standard performance tests, e.g., load test, stress test, spike test, and configuration test
- Objectives: specific types of performance engineering activities, e.g., capacity planning and performance optimisations
- Meta-Objectives: defined from already collected performance knowledge, e.g., comparing different systems using a benchmark
(A sketch of this taxonomy as a data model follows.)
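One way to read this taxonomy is as a small data model. The Go sketch below is purely illustrative; the types and names are ours, not BenchFlow's API. Base objectives are the test-type vocabulary, objectives compose them, and meta-objectives build on previously collected results.

```go
package objectives

// BaseObjective enumerates the standard performance test types.
type BaseObjective string

const (
	LoadTest          BaseObjective = "load"
	StressTest        BaseObjective = "stress"
	SpikeTest         BaseObjective = "spike"
	ConfigurationTest BaseObjective = "configuration"
)

// Objective is a performance engineering activity built on base
// objectives, e.g., capacity planning or performance optimisation.
type Objective struct {
	Name  string          // e.g., "capacity-planning" (illustrative)
	Tests []BaseObjective // the test types the activity relies on
}

// MetaObjective is defined on top of already collected performance
// knowledge, e.g., benchmarking different systems.
type MetaObjective struct {
	Name       string
	Objectives []Objective // the objectives whose results it reuses
}
```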
48. Objectives
- Capacity planning, also based on some constraints (e.g., CPU, RAM)
  Why? The cost of resources is important for the business.
- Performance optimisation based on some (resource) constraints, e.g., which configuration is optimal?
  Why? Responsiveness is important for the user.
(A constraint-aware capacity-planning sketch follows.)
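To make the capacity-planning objective concrete, here is a minimal Go sketch: given configurations already measured in past tests, pick the cheapest one that sustains a target load within the CPU/RAM constraints. The data model and the cost field are our assumptions, not BenchFlow's.

```go
package capacity

// Config is one measured deployment configuration: its resources and the
// maximum throughput it sustained in a past test (illustrative model).
type Config struct {
	CPUs       int     // cores
	RAMGiB     int     // memory
	Throughput float64 // requests/s sustained within the SLO
	CostPerH   float64 // resource cost, the business-relevant quantity
}

// Cheapest returns the lowest-cost configuration that sustains the target
// load while respecting the CPU/RAM constraints; ok is false if none does.
func Cheapest(measured []Config, target float64, maxCPUs, maxRAMGiB int) (best Config, ok bool) {
	for _, c := range measured {
		if c.Throughput < target || c.CPUs > maxCPUs || c.RAMGiB > maxRAMGiB {
			continue
		}
		if !ok || c.CostPerH < best.CostPerH {
			best, ok = c, true
		}
	}
	return best, ok
}
```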
55. Meta-Objectives
- Regression: is the performance, capacity, or scalability still the same as previous tests show?
- What-If Analysis: what do we expect to happen to the output/dependent variables if we change some of the input/independent variables?
- Before-and-After: how has the performance changed given that some features have been added?
- Benchmarking: how does the performance of different systems compare?
(A regression-check sketch follows.)
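As an illustration of the regression meta-objective, a minimal Go sketch that compares a fresh run against KPIs stored in the performance knowledge base. The choice of KPIs and the tolerance policy are assumptions, not BenchFlow's actual rules.

```go
package regression

// Baseline holds KPIs from previous test runs stored in the performance
// knowledge base (illustrative model).
type Baseline struct {
	Throughput float64 // requests/s
	LatencyP95 float64 // seconds
}

// Regressed reports whether the current run deviates from the baseline by
// more than the given tolerance (e.g., 0.05 for 5%).
func Regressed(base Baseline, throughput, latencyP95, tolerance float64) bool {
	if throughput < base.Throughput*(1-tolerance) {
		return true // throughput dropped
	}
	if latencyP95 > base.LatencyP95*(1+tolerance) {
		return true // tail latency grew
	}
	return false
}
```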
58. Different Types of Fast Feedback
- Evaluating if the system is ready for the defined performance test objectives and reaches the expected state (readiness sketch below)
- Reusing collected performance knowledge
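A minimal sketch of the first kind of feedback, assuming the SUT exposes an HTTP health endpoint (the endpoint and polling interval are our assumptions): poll it until the system reaches the expected state, and fail fast otherwise, so no load is generated against a system that is not ready.

```go
package readiness

import (
	"context"
	"net/http"
	"time"
)

// WaitReady polls the SUT's health endpoint until it answers 200 OK or the
// context expires, giving fast feedback before any load is generated.
func WaitReady(ctx context.Context, url string) error {
	ticker := time.NewTicker(2 * time.Second)
	defer ticker.Stop()
	for {
		resp, err := http.Get(url)
		if err == nil {
			resp.Body.Close()
			if resp.StatusCode == http.StatusOK {
				return nil // SUT reached the expected state
			}
		}
		select {
		case <-ctx.Done():
			return ctx.Err() // not ready in time: fail fast
		case <-ticker.C:
		}
	}
}
```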
68. Highlights
- Current Solutions + Limitations: rarely automating the end-to-end process; not integrated in CSA (e.g., they do not leverage continuous feedback). CSA: Continuous Software Assessment.
- Approach Overview: 3 main features, i.e., objective-driven tests, fast performance feedback, and CSA integration (DSL), built around collected performance knowledge and hooked into the repo and CI server.
- Approach Details: in-depth details.
72. Docker Performance [IBM '14]
W. Felter, A. Ferreira, R. Rajamony, and J. Rubio. An updated performance comparison of virtual machines and Linux containers. IBM Research Report, 2014.
"Our results show that containers result in equal or better performance than VMs in almost all cases."
"Although containers themselves have almost no overhead, Docker is not without performance gotchas. Docker volumes have noticeably better performance than files stored in AUFS. Docker's NAT also introduces overhead for workloads with high packet rates. These features represent a tradeoff between ease of management and performance and should be considered on a case-by-case basis."
BenchFlow configures Docker for performance by default.
79. BenchFlow: System Under Test
[Diagram: Docker Machine provides the servers; Docker Engine manages the containers; Docker Swarm and Docker Compose run on top of the engine; the SUT's deployment configuration is deployed onto this stack. A verification sketch follows.]
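As a sketch of how one might verify the deployment, the Docker Engine Go SDK can list the containers that actually came up after deploying the SUT's configuration. Exact SDK types vary across versions; the helper below is an illustration under those assumptions, not BenchFlow code.

```go
package sut

import (
	"context"
	"fmt"

	"github.com/docker/docker/api/types"
	"github.com/docker/docker/client"
)

// Running lists the names of the containers currently managed by the local
// Docker Engine, e.g., to verify that every service in the SUT's
// deployment configuration actually came up.
func Running(ctx context.Context) ([]string, error) {
	cli, err := client.NewClientWithOpts(client.FromEnv, client.WithAPIVersionNegotiation())
	if err != nil {
		return nil, fmt.Errorf("connecting to Docker Engine: %w", err)
	}
	defer cli.Close()

	containers, err := cli.ContainerList(ctx, types.ContainerListOptions{})
	if err != nil {
		return nil, err
	}
	names := make([]string, 0, len(containers))
	for _, c := range containers {
		names = append(names, c.Names[0]) // names carry a leading "/"
	}
	return names, nil
}
```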
83. Server-side Data and Metrics Collection
[Diagram: the test execution harness drives Faban drivers against the SUT (WfMS, Web Service, DBMS), which runs in containers on servers; a CPU Monitor and a DB Monitor are attached to it.]
Monitors' characteristics:
- RESTful services
- Lightweight (written in Go)
- As unintrusive to the SUT as possible
Examples of monitors:
- CPU usage
- Database state
(A monitor sketch follows.)
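A minimal monitor sketch in Go, matching the stated characteristics (RESTful, lightweight, off the SUT's critical path). The /status endpoint, the port, and the use of the gopsutil library for CPU sampling are our assumptions.

```go
package main

import (
	"encoding/json"
	"log"
	"net/http"
	"time"

	"github.com/shirou/gopsutil/v3/cpu"
)

// A lightweight RESTful CPU monitor: it samples only on demand, keeping
// its footprint on the SUT small.
func main() {
	http.HandleFunc("/status", func(w http.ResponseWriter, r *http.Request) {
		// Sample total CPU utilisation over one second.
		percents, err := cpu.Percent(time.Second, false)
		if err != nil {
			http.Error(w, err.Error(), http.StatusInternalServerError)
			return
		}
		json.NewEncoder(w).Encode(map[string]any{
			"timestamp":   time.Now().UTC(),
			"cpu_percent": percents[0],
		})
	})
	log.Fatal(http.ListenAndServe(":8080", nil))
}
```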
87. Server-side Data and Metrics Collection
[Diagram: a Stats Collector and a DB Collector are attached to the SUT; the collected data feed a Minio instance, a database, and the analyses.]
Collectors' characteristics:
- RESTful services
- Lightweight (written in Go)
- Two types: online and offline
- Buffer data locally
Examples of collectors:
- Container stats (e.g., CPU usage)
- Database dump
- Application logs
(A collector sketch follows.)
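A minimal offline-collector sketch in Go, matching the stated characteristics: it exposes a REST endpoint and buffers pushed data locally until the run ends, instead of writing to remote storage during the test. The endpoint, port, and buffer path are illustrative assumptions.

```go
package main

import (
	"io"
	"log"
	"net/http"
	"os"
	"sync"
)

// An offline collector: monitors or the harness POST payloads to /data,
// and the collector appends them to a local buffer file for later upload.
func main() {
	buf, err := os.OpenFile("/tmp/collector.buffer", os.O_APPEND|os.O_CREATE|os.O_WRONLY, 0o644)
	if err != nil {
		log.Fatal(err)
	}
	defer buf.Close()
	var mu sync.Mutex // serialise concurrent appends

	http.HandleFunc("/data", func(w http.ResponseWriter, r *http.Request) {
		if r.Method != http.MethodPost {
			http.Error(w, "POST only", http.StatusMethodNotAllowed)
			return
		}
		mu.Lock()
		defer mu.Unlock()
		// Buffer the payload locally, keeping the footprint on the SUT low.
		if _, err := io.Copy(buf, r.Body); err != nil {
			http.Error(w, err.Error(), http.StatusInternalServerError)
			return
		}
		buf.Write([]byte("\n"))
		w.WriteHeader(http.StatusNoContent)
	})
	log.Fatal(http.ListenAndServe(":8081", nil))
}
```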
92. Performance Metrics and KPIs
Test execution coordinates data collection and data transformation through a high-throughput distributed messaging system.
[Diagram: the test execution triggers the collectors (collect); the collectors publish the data to the messaging system; the analyses subscribe to it, and results are read from the Minio instance and the database. A publish sketch follows.]
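The slides describe the messaging layer only as a high-throughput distributed messaging system. Assuming a Kafka-style broker, the collectors' publish step might look like the sketch below, using the segmentio/kafka-go client; the client choice, broker address, topic name, and payload shape are all our assumptions.

```go
package main

import (
	"context"
	"log"
	"time"

	"github.com/segmentio/kafka-go"
)

// A collector publishes one collected stats record so that downstream
// transformers and analyses can subscribe to it.
func main() {
	w := &kafka.Writer{
		Addr:     kafka.TCP("localhost:9092"),
		Topic:    "benchflow.stats",
		Balancer: &kafka.LeastBytes{},
	}
	defer w.Close()

	ctx, cancel := context.WithTimeout(context.Background(), 10*time.Second)
	defer cancel()

	err := w.WriteMessages(ctx, kafka.Message{
		Key:   []byte("stats-collector"),
		Value: []byte(`{"container":"wfms","cpu_percent":42.0}`),
	})
	if err != nil {
		log.Fatal(err)
	}
}
```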