Software performance testing is an important activity for ensuring quality in continuous software development environments. Current performance testing approaches are mostly based on scripting languages and frameworks in which users implement, in a procedural way, the performance tests they want to issue against the system under test. However, existing solutions lack support for explicitly declaring the goals and intents of a performance test. Thus, while it is possible to express how to execute a performance test, its purpose and applicability context remain implicit. In this work, we propose a declarative domain-specific language (DSL) for software performance testing and a model-driven framework that is programmed using this language and drives the end-to-end process of executing performance tests. Users of the DSL and the framework specify their performance intents in a goal-oriented language, where standard (e.g., load tests) and more advanced (e.g., stability boundary detection and configuration tests) performance tests can be specified starting from templates. The DSL and the framework are designed to be integrated into a continuous software development process, and are validated through extensive use cases that illustrate the expressiveness of the goal-oriented language and the control it enables over the end-to-end performance test execution to determine how to reach the declared intent.
My talk from the 9th ACM/SPEC International Conference on Performance Engineering (ICPE 2018). Cite us: https://dl.acm.org/citation.cfm?id=3184417
A Declarative Approach for Performance Tests Execution in Continuous Software Development Environments
1. Università della Svizzera italiana
A Declarative Approach for Performance Tests Execution in Continuous Software Development Environments
Vincenzo Ferme, Software Institute, Faculty of Informatics, USI Lugano, Switzerland
Cesare Pautasso
7–8. Continuous Software Development Environments
[Diagram: Developers, Testers, Architects commit Continuous Changes to the Repo; the CI Server drives Continuous Test Execution: Load Test, Spike Test, Capacity Test, Configuration Test]
11. State of Practice
[Landscape of Performance Tests Execution Automation tools, from Script/UI Based to Integrated in CI Pipelines: Google Vizier, AutoPerf, DataMill, CloudPerf; End-to-end Tests against REST APIs]
13–18. Define a Configuration Test
Test Definition: Configuration Parameters Change, Simulated Users
System Under Test, 3rd-party tools
Test Execution, Data Collection, Metrics Computation
In the state of practice:
- Test Goals are not Explicit in the Definition
- No Semantic Validation (Runtime Errors)
- Scarce Visibility and Control on the Test Execution Lifecycle
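Later in the deck the same kind of test is expressed in the proposed declarative DSL. As a hedged preview, a configuration test definition might look roughly like the sketch below; the key names and values (observe, exploration parameters, jvm_heap, my-service) are illustrative assumptions, not the framework's exact syntax:

configuration:
  goal:
    type: configuration
    observe: throughput            # assumed: metric observed across configurations
    exploration:
      parameters:                  # hypothetical: configuration parameters to change
        jvm_heap: [2g, 4g, 8g]
workload:
  users: 100                       # the simulated users issuing the workload (illustrative)
sut:
  name: my-service                 # hypothetical system under test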
19–20. Declarative Performance Engineering (DPE)
“Enabling the performance analyst to declaratively specify what performance-relevant questions need to be answered without being concerned about how they should be answered.” [Walter et al.]
Jürgen Walter, André van Hoorn, Heiko Koziolek, Dušan Okanović, and Samuel Kounev. Asking “What”?, Automating the “How”? - The Vision of Declarative Performance Engineering. In Proc. of ICPE 2016, pp. 91–94.
Developers, Testers, Architects, Performance Analysts …
21. DPE: Approaches for Performance Testing
“Enabling the performance analyst to declaratively specify what performance-relevant questions need to be answered without being concerned about how they should be answered.” [Walter et al.]
[Omar et al.] Towards an automated approach to use expert systems in the performance testing of distributed systems.
[Westermann] Deriving Goal-oriented Performance Models by Systematic Experimentation.
[Scheuner et al.] Cloud WorkBench - Infrastructure-as-Code Based Cloud Benchmarking.
57–59. Approach: Summary
- Explicit Test Goals in the Definition
- Visibility and Control on the Test Execution Lifecycle
- Static Syntactic and Semantic Validation
- Extensible for different Test Goals
[State diagram of the test execution lifecycle: while Running, the framework determines the exploration strategy, determines and executes experiments (experiment lifecycle), and handles experiment results as they become available; Validate Termination Criteria evaluates the guards [can reach goal] / [cannot reach goal], [experiments remaining] / [all experiments executed], user terminated OR [max_time], and [number_of_experiments]; Check Quality Gates evaluates [quality gates pass] / [failed quality gates]; the Terminating state ends in Terminated with outcome Goal Reached, Partially Complete, or Completed with Failure]
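The bracketed guards in the lifecycle map back to fields the user declares in the test definition. A minimal sketch, assuming illustrative key names and values (the DSL's exact syntax may differ):

configuration:
  termination_criteria:
    max_time: 2h                   # corresponds to the 'user terminated OR [max_time]' guard
    number_of_experiments: 20      # bounds '[experiments remaining] / [all experiments executed]'
  quality_gates:
    throughput: ">= 500 req/s"     # evaluated in 'Check Quality Gates'
# Depending on which guards fire, the test terminates as Goal Reached,
# Partially Complete, or Completed with Failure.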
61–63. Future Work
Apply to real-world use cases:
- Apply the approach in real-world contexts
Add more goals:
- Regression
- Assess different microservices deployment alternatives
Leverage performance test execution data:
- Prioritise performance test execution
- Skip execution of tests
Integrate with other frameworks:
- ContinuITy framework, to leverage production data
64. Highlights
State of Practice
Declarative DSL: Declarative Test Definition
configuration:
  goal:
  load_function:
  quality_gates:
  termination_criteria:
data_collection:
workload:
sut:
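To make the skeleton concrete, a plain load test might be declared as follows; the specific values and some key spellings (users, ramp_up, steady_state, the percentile gate) are assumptions for illustration rather than the DSL's exact syntax:

configuration:
  goal:
    type: load                     # one of the available goals (LOAD)
  load_function:
    users: 100                     # assumed: number of simulated users
    ramp_up: 60s                   # assumed: ramp-up period
    steady_state: 10m              # assumed: steady-state duration
  quality_gates:
    response_time_p95: "<= 500ms"  # assumed notation for a latency gate
  termination_criteria:
    max_time: 30m                  # upper bound on the overall test execution
data_collection:
  metrics: [throughput, response_time]  # assumed metric names
workload:
  driver: http                     # assumed: workload driver
sut:
  name: my-service                 # hypothetical system under test
  version: 1.0.0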
Model-driven Framework: BenchFlow
[Architecture diagram: BenchFlow framework components integrated with the Repo and the CI Server]
Extending the Proposed Approach
configuration:
  goal:
    type: configuration
    observe:
    exploration:
…
configuration:
  goal:
    type: stability_boundary
    observe:
    exploration:
…
Available Goals: LOAD, CONFIGURATION, STABILITY_BOUNDARY
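Following the same pattern, one of the goals listed under Future Work (regression) could plausibly be added as a new goal type. The sketch below is purely hypothetical; none of these keys are confirmed to exist in the current DSL:

configuration:
  goal:
    type: regression               # hypothetical future goal type
    observe: response_time         # assumed: metric compared across versions
    exploration:
      baseline: v1.0.0             # hypothetical: version to compare against
      candidate: v1.1.0            # hypothetical: version under test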