Continuous Performance Testing for Microservices
Vincenzo Ferme
Reliable Software Systems
Institute for Software Technologies
University of Stuttgart, Germany
HPI Future SOC Lab Day
info@vincenzoferme.it
Outline

1. Continuous Software Development Environments
2. Continuous Performance Testing of Microservices:
   a. Challenges
   b. Our Approach
   c. Preliminary Results
   d. BenchFlow Automation Framework
3. Approach Extensions:
   a. ContinuITy Framework
4. Future Work
Continuous Software Development Environments

[Figure: Developers, Testers, and Architects push changes to a Repo; a CI Server builds and tests them; a CD Server deploys them to Production.]
Challenges

The system changes frequently:
- Need to keep the test definition updated (manual updates are not an option)

Too many tests to execute:
- Need to reduce the number of executed tests

Need to integrate tests in delivery pipelines:
- Need complete automation of test execution
- Need clear KPIs to be evaluated in the pipeline
Our Approach

[Figure: (1) Observed load situations in Production (load level over time) are summarized into an empirical distribution of load situations (relative frequency per load level). (2) The distribution is aggregated onto a small set of sampled load levels (aggregated relative frequency per sampled load level), which define the Sampled Load Tests. (3) For a given Deployment Configuration, a Baseline Test and the Sampled Load Tests are executed and evaluated against the Scalability Criteria, yielding per-test results (e.g., 0.12, 0.14, 0.20, 0.16, 0.11). (4) The test results are combined into a single Domain Metric (e.g., 0.73).]
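The following sketch illustrates steps 1 and 2 under one possible interpretation: relative frequencies are computed per observed load level, and each observed level is mapped to the closest sampled level to obtain the aggregated relative frequencies. The mapping rule and the example data are assumptions for illustration, not details taken from the talk.

```python
from collections import Counter

def empirical_distribution(observed_load_levels):
    # Step 1: relative frequency of each load level observed in production.
    counts = Counter(observed_load_levels)
    total = sum(counts.values())
    return {level: count / total for level, count in counts.items()}

def aggregate_to_sampled_levels(distribution, sampled_levels):
    # Step 2: fold the relative frequencies onto a small set of sampled load
    # levels; here each observed level is assigned to the closest sampled
    # level (an assumed aggregation rule), yielding the aggregated relative
    # frequency that weights each sampled load test.
    aggregated = {level: 0.0 for level in sampled_levels}
    for level, freq in distribution.items():
        closest = min(sampled_levels, key=lambda s: abs(s - level))
        aggregated[closest] += freq
    return aggregated

# Example with made-up observations and the sampled levels used later in the talk.
observed = [48, 95, 140, 150, 160, 190, 210, 240, 260, 300]
distribution = empirical_distribution(observed)
print(aggregate_to_sampled_levels(distribution, [50, 100, 150, 200, 250, 300]))
```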
Preliminary Experiments

Production: 12 microservices, exercised with a custom operation mix
Sampled Load Tests (steps 1, 2): empirical distribution of load situations, sampled at 6 load levels of 50, 100, 150, 200, 250, and 300 users
Deployment Config. (step 3): 10 configurations, varying replicas, CPU, and RAM
Stability criterion: Stab = avg + 3σ
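A minimal sketch of how the Stab = avg + 3σ stability criterion could be applied, assuming it is derived from response times measured in the Baseline Test and checked against the response times observed at each sampled load level; the metric, the check, and the numbers are illustrative assumptions.

```python
import statistics

def stability_threshold(baseline_response_times):
    # Stab = avg + 3*sigma, computed from the Baseline Test measurements.
    avg = statistics.mean(baseline_response_times)
    sigma = statistics.stdev(baseline_response_times)
    return avg + 3 * sigma

def passes_scalability_criterion(load_test_response_times, threshold):
    # One possible check: an operation passes at a load level if all of its
    # observed response times stay below the baseline-derived threshold.
    return max(load_test_response_times) <= threshold

# Hypothetical response times in milliseconds (not measured values from the talk).
baseline = [110, 118, 122, 115, 120, 117]
threshold = stability_threshold(baseline)
print(passes_scalability_criterion([125, 130, 128], threshold))
```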
Experiments Infrastructure

Virtual Users: 32 GB RAM, 24 cores
System Under Test: 896 GB RAM, 80 cores
Preliminary Experiments Results
Deployment Configuration: 1 GB RAM, 0.25 CPU, 1 Replica

Custom Op. Mix, scalability criteria per API:
  GET /        PASS
  GET /cart    PASS
  POST /item   FAIL

Aggregated relative frequency per load level:
  Users   Aggr. Rel. Freq.
  50      0.10582
  100     0.18519
  150     0.22222
  200     0.22222
  250     0.20370
  300     0.06085

Contribution to the Domain Metric for one highlighted load level (250 users):
  Max:    0.20370
  Actual: 0.13580
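A sketch of how a single level's contribution to the Domain Metric could be computed from the PASS/FAIL criteria above, assuming the contribution is the level's aggregated relative frequency scaled by the weighted fraction of passing operations; equal weights over the three APIs are assumed here, whereas the talk may instead weight by the custom operation mix.

```python
def level_contribution(aggr_rel_freq, criteria_passed, weights=None):
    # Contribution of one sampled load level to the Domain Metric:
    # aggregated relative frequency * weighted fraction of passing operations.
    if weights is None:
        # Equal weight per operation (an assumption made for this sketch).
        weights = [1.0 / len(criteria_passed)] * len(criteria_passed)
    pass_fraction = sum(w for w, passed in zip(weights, criteria_passed) if passed)
    return aggr_rel_freq * pass_fraction

# 250-user level: GET / and GET /cart pass, POST /item fails.
print(round(level_contribution(0.20370, [True, True, False]), 5))  # ~0.1358, cf. the reported 0.13580
```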
Preliminary Experiments Results
Deployment Configuration: 1 GB RAM, 0.25 CPU, 1 Replica

Contribution to the Domain Metric per load level:
  Users   Contribution
  50      0.10582
  100     0.18519
  150     0.22222
  200     0.07999
  250     0.13580
  300     0.04729

Domain Metric (step 4): Max 1, Actual 0.77631
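Summing the per-level contributions reproduces the reported Domain Metric for this deployment configuration; the quick check below uses only the numbers from the slide.

```python
# Per-level contributions from the table above.
contributions = {50: 0.10582, 100: 0.18519, 150: 0.22222,
                 200: 0.07999, 250: 0.13580, 300: 0.04729}
domain_metric = sum(contributions.values())
print(round(domain_metric, 5))  # 0.77631, out of a maximum of 1
```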
BenchFlow Automation Framework

BenchFlow (BF) DSL:
- Workload Modeling
- SUT Configuration
- SUT Deployment
- BenchFlow Adapters

Test Execution: Faban
Data Collection: Minio
Data Analysis

https://github.com/benchflow
BenchFlow Automation Framework

[Figure: test lifecycle across three phases, Generation, Execution, and Analysis. Test Generation produces a Test Bundle; Experiment Generation derives an Experiment Bundle from it; Experiment Execution runs the experiment and reports Success or Failures; Result Analysis turns successful runs into Metrics.]

[Ferme et al.] Vincenzo Ferme and Cesare Pautasso. A Declarative Approach for Performance Tests Execution in Continuous Software Development Environments. In Proc. of ICPE 2018, 261-272.
Approach: Summary

[Figure: Production load is summarized into an empirical distribution of load situations and a set of sampled load levels with aggregated relative frequencies (steps 1, 2); BenchFlow (BF) runs the Baseline Test and the Sampled Load Tests for a given Deployment Configuration against the Scalability Criteria (step 3); the results are combined into the Domain Metric, e.g., 0.73 (step 4).]

Examples of Use of the Domain Metric KPI:
1. Decide if the system is ready for deployment
2. Assess different deployment configurations
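As an illustration of use 1, a delivery pipeline could gate deployments on the Domain Metric KPI. The threshold below is an arbitrary assumption for the example, not a value from the talk.

```python
def deployment_gate(domain_metric, threshold=0.75):
    # PASS/FAIL decision on the Domain Metric KPI inside a CI/CD pipeline.
    # The threshold is an assumed, project-specific value.
    return "PASS: ready for deployment" if domain_metric >= threshold else "FAIL: not ready"

print(deployment_gate(0.77631))  # configuration from the preliminary results
print(deployment_gate(0.73))     # example value from the approach overview
```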
Approach: Summary

[Figure: contribution to the Domain Metric for each sampled load test, compared against the maximum possible contribution, per deployment configuration.]
Approach: Summary (continued)

Examples of Use of the Domain Metric KPI:
3. Configure scaling plans and autoscalers
Approach: Extensions

[Figure: the approach overview (Production → observed load situations → empirical distribution of load situations → sampled load levels → Baseline Test and Sampled Load Tests with Scalability Criteria and a Deployment Conf. → Test Results → Domain Metric, steps 1-4), repeated as the reference point for the extensions discussed next.]
ContinuITy Framework

[Figure: production data is used to derive a Workload Model + APIs Specification + Annotations Model, from which the Sampled Load Tests (load levels, API calls, operation mix) are generated.]

[Schulz et al.] Henning Schulz, Tobias Angerstein, and André van Hoorn. Towards Automating Representative Load Testing in Continuous Software Engineering. In Proc. of ICPE 2018 Companion, 123-126.
Future Work

Automated update of test definitions:
- Automatically update load levels, API calls, and the operation mix

Declarative Definitions for Load Testing:
- Make performance engineering activities more accessible

Declarative Performance Test Regression:
- Extend the proposed approach to continuously assess performance regressions and to PASS/FAIL CI/CD pipelines accordingly

Prioritisation and Selection of Load Tests:
- Exploit performance models to further reduce the number of tests
Highlights

[Recap of the key slides: Challenges, Our Approach, Preliminary Experiments, and Extensions and Future Work.]

More Related Content

What's hot

Automated Performance Testing With J Meter And Maven
Automated  Performance  Testing With  J Meter And  MavenAutomated  Performance  Testing With  J Meter And  Maven
Automated Performance Testing With J Meter And MavenPerconaPerformance
 
Using Jenkins and Jmeter to build a scalable Load Testing solution
Using Jenkins and Jmeter to build a scalable Load Testing solutionUsing Jenkins and Jmeter to build a scalable Load Testing solution
Using Jenkins and Jmeter to build a scalable Load Testing solutionRuslan Strazhnyk
 
Jmeter Tester Certification
Jmeter Tester CertificationJmeter Tester Certification
Jmeter Tester CertificationVskills
 
Turbocharge Your Automation Framework to Shorten Regression Execution Time
Turbocharge Your Automation Framework to Shorten Regression Execution TimeTurbocharge Your Automation Framework to Shorten Regression Execution Time
Turbocharge Your Automation Framework to Shorten Regression Execution TimeJosiah Renaudin
 
Self healing test automation with Healenium and Minimization of regression su...
Self healing test automation with Healenium and Minimization of regression su...Self healing test automation with Healenium and Minimization of regression su...
Self healing test automation with Healenium and Minimization of regression su...Dmitriy Gumeniuk
 
Performance testing interview questions and answers
Performance testing interview questions and answersPerformance testing interview questions and answers
Performance testing interview questions and answersGaruda Trainings
 
Ginsbourg.com - Performance and Load Test Report Template LTR 1.5
Ginsbourg.com - Performance and Load Test Report Template LTR 1.5Ginsbourg.com - Performance and Load Test Report Template LTR 1.5
Ginsbourg.com - Performance and Load Test Report Template LTR 1.5Shay Ginsbourg
 
Integrate UFT with Jenkins Guide
Integrate UFT with Jenkins GuideIntegrate UFT with Jenkins Guide
Integrate UFT with Jenkins GuideYu Tao Zhang
 
Apache Jmeter 3.2 Performance & Load Testing 2017
Apache Jmeter 3.2 Performance & Load Testing 2017Apache Jmeter 3.2 Performance & Load Testing 2017
Apache Jmeter 3.2 Performance & Load Testing 2017Shay Ginsbourg
 
Introduction to JMeter
Introduction to JMeterIntroduction to JMeter
Introduction to JMeterGalih Lasahido
 
A Test Automation Framework
A Test Automation FrameworkA Test Automation Framework
A Test Automation FrameworkGregory Solovey
 
Ginsbourg.com presentation of performance and load testing 2020
Ginsbourg.com presentation of performance and load testing 2020Ginsbourg.com presentation of performance and load testing 2020
Ginsbourg.com presentation of performance and load testing 2020Shay Ginsbourg
 
Ginsbourg.com - Performance and load test report template ltr 2.0
Ginsbourg.com - Performance and load test report template ltr 2.0Ginsbourg.com - Performance and load test report template ltr 2.0
Ginsbourg.com - Performance and load test report template ltr 2.0Shay Ginsbourg
 
JMeter - Performance testing your webapp
JMeter - Performance testing your webappJMeter - Performance testing your webapp
JMeter - Performance testing your webappAmit Solanki
 
QA Fest 2019. Анна Чернышова. Self-healing test automation 2.0. The Future
QA Fest 2019. Анна Чернышова. Self-healing test automation 2.0. The FutureQA Fest 2019. Анна Чернышова. Self-healing test automation 2.0. The Future
QA Fest 2019. Анна Чернышова. Self-healing test automation 2.0. The FutureQAFest
 
Jmeter interviewquestions
Jmeter interviewquestionsJmeter interviewquestions
Jmeter interviewquestionsgirichinna27
 
Jenkins/Jmeter Configuration - Colombo Performance Test Meetup - 2016 April
Jenkins/Jmeter Configuration - Colombo Performance Test Meetup - 2016 AprilJenkins/Jmeter Configuration - Colombo Performance Test Meetup - 2016 April
Jenkins/Jmeter Configuration - Colombo Performance Test Meetup - 2016 Aprilnmadusanka
 
Performance testing with JMeter
Performance testing with JMeterPerformance testing with JMeter
Performance testing with JMeterMikael Kundert
 
Keyword-driven Test Automation Framework
Keyword-driven Test Automation FrameworkKeyword-driven Test Automation Framework
Keyword-driven Test Automation FrameworkMikhail Subach
 

What's hot (20)

Automated Performance Testing With J Meter And Maven
Automated  Performance  Testing With  J Meter And  MavenAutomated  Performance  Testing With  J Meter And  Maven
Automated Performance Testing With J Meter And Maven
 
Using Jenkins and Jmeter to build a scalable Load Testing solution
Using Jenkins and Jmeter to build a scalable Load Testing solutionUsing Jenkins and Jmeter to build a scalable Load Testing solution
Using Jenkins and Jmeter to build a scalable Load Testing solution
 
Jmeter Tester Certification
Jmeter Tester CertificationJmeter Tester Certification
Jmeter Tester Certification
 
Turbocharge Your Automation Framework to Shorten Regression Execution Time
Turbocharge Your Automation Framework to Shorten Regression Execution TimeTurbocharge Your Automation Framework to Shorten Regression Execution Time
Turbocharge Your Automation Framework to Shorten Regression Execution Time
 
Self healing test automation with Healenium and Minimization of regression su...
Self healing test automation with Healenium and Minimization of regression su...Self healing test automation with Healenium and Minimization of regression su...
Self healing test automation with Healenium and Minimization of regression su...
 
Performance testing interview questions and answers
Performance testing interview questions and answersPerformance testing interview questions and answers
Performance testing interview questions and answers
 
Ginsbourg.com - Performance and Load Test Report Template LTR 1.5
Ginsbourg.com - Performance and Load Test Report Template LTR 1.5Ginsbourg.com - Performance and Load Test Report Template LTR 1.5
Ginsbourg.com - Performance and Load Test Report Template LTR 1.5
 
Integrate UFT with Jenkins Guide
Integrate UFT with Jenkins GuideIntegrate UFT with Jenkins Guide
Integrate UFT with Jenkins Guide
 
Apache Jmeter 3.2 Performance & Load Testing 2017
Apache Jmeter 3.2 Performance & Load Testing 2017Apache Jmeter 3.2 Performance & Load Testing 2017
Apache Jmeter 3.2 Performance & Load Testing 2017
 
Introduction to JMeter
Introduction to JMeterIntroduction to JMeter
Introduction to JMeter
 
A Test Automation Framework
A Test Automation FrameworkA Test Automation Framework
A Test Automation Framework
 
JMeter Database Performace Testing - Keytorc Approach
JMeter Database Performace Testing - Keytorc ApproachJMeter Database Performace Testing - Keytorc Approach
JMeter Database Performace Testing - Keytorc Approach
 
Ginsbourg.com presentation of performance and load testing 2020
Ginsbourg.com presentation of performance and load testing 2020Ginsbourg.com presentation of performance and load testing 2020
Ginsbourg.com presentation of performance and load testing 2020
 
Ginsbourg.com - Performance and load test report template ltr 2.0
Ginsbourg.com - Performance and load test report template ltr 2.0Ginsbourg.com - Performance and load test report template ltr 2.0
Ginsbourg.com - Performance and load test report template ltr 2.0
 
JMeter - Performance testing your webapp
JMeter - Performance testing your webappJMeter - Performance testing your webapp
JMeter - Performance testing your webapp
 
QA Fest 2019. Анна Чернышова. Self-healing test automation 2.0. The Future
QA Fest 2019. Анна Чернышова. Self-healing test automation 2.0. The FutureQA Fest 2019. Анна Чернышова. Self-healing test automation 2.0. The Future
QA Fest 2019. Анна Чернышова. Self-healing test automation 2.0. The Future
 
Jmeter interviewquestions
Jmeter interviewquestionsJmeter interviewquestions
Jmeter interviewquestions
 
Jenkins/Jmeter Configuration - Colombo Performance Test Meetup - 2016 April
Jenkins/Jmeter Configuration - Colombo Performance Test Meetup - 2016 AprilJenkins/Jmeter Configuration - Colombo Performance Test Meetup - 2016 April
Jenkins/Jmeter Configuration - Colombo Performance Test Meetup - 2016 April
 
Performance testing with JMeter
Performance testing with JMeterPerformance testing with JMeter
Performance testing with JMeter
 
Keyword-driven Test Automation Framework
Keyword-driven Test Automation FrameworkKeyword-driven Test Automation Framework
Keyword-driven Test Automation Framework
 

Similar to Continuous Performance Testing for Microservices

Issre2010 malik
Issre2010 malikIssre2010 malik
Issre2010 malikSAIL_QU
 
Testing using load runner performance testing
Testing using load runner  performance testingTesting using load runner  performance testing
Testing using load runner performance testingSivaprasanthRentala1975
 
How To Combine Back-End 
 & Front-End Testing with BlazeMeter & Sauce Labs
How To Combine Back-End 
 & Front-End Testing with BlazeMeter & Sauce LabsHow To Combine Back-End 
 & Front-End Testing with BlazeMeter & Sauce Labs
How To Combine Back-End 
 & Front-End Testing with BlazeMeter & Sauce LabsSauce Labs
 
Gatling workshop lets test17
Gatling workshop lets test17Gatling workshop lets test17
Gatling workshop lets test17Gerald Muecke
 
Performancetestingjmeter 131210111657-phpapp02
Performancetestingjmeter 131210111657-phpapp02Performancetestingjmeter 131210111657-phpapp02
Performancetestingjmeter 131210111657-phpapp02Nitish Bhardwaj
 
Keynote VST2020 (Workshop on Validation, Analysis and Evolution of Software ...
Keynote VST2020 (Workshop on  Validation, Analysis and Evolution of Software ...Keynote VST2020 (Workshop on  Validation, Analysis and Evolution of Software ...
Keynote VST2020 (Workshop on Validation, Analysis and Evolution of Software ...University of Antwerp
 
How to make a Load Testing with Visual Studio 2012
How to make a Load Testing with Visual Studio 2012How to make a Load Testing with Visual Studio 2012
How to make a Load Testing with Visual Studio 2012Chen-Tien Tsai
 
It Does What You Say, Not What You Mean: Lessons From A Decade of Program Repair
It Does What You Say, Not What You Mean: Lessons From A Decade of Program RepairIt Does What You Say, Not What You Mean: Lessons From A Decade of Program Repair
It Does What You Say, Not What You Mean: Lessons From A Decade of Program RepairClaire Le Goues
 
Level Up Your Integration Testing With Testcontainers
Level Up Your Integration Testing With TestcontainersLevel Up Your Integration Testing With Testcontainers
Level Up Your Integration Testing With TestcontainersVMware Tanzu
 
Data Virtualization: Revolutionizing data cloning
Data Virtualization: Revolutionizing data cloningData Virtualization: Revolutionizing data cloning
Data Virtualization: Revolutionizing data cloning Kyle Hailey
 
Ensuring OpenStack Version up Compatibility for CloudOpen Japan 2013-05-31
Ensuring OpenStack Version up Compatibility for CloudOpen Japan 2013-05-31Ensuring OpenStack Version up Compatibility for CloudOpen Japan 2013-05-31
Ensuring OpenStack Version up Compatibility for CloudOpen Japan 2013-05-31Masayuki Igawa
 
Jvm Performance Tunning
Jvm Performance TunningJvm Performance Tunning
Jvm Performance Tunningguest1f2740
 
Jvm Performance Tunning
Jvm Performance TunningJvm Performance Tunning
Jvm Performance TunningTerry Cho
 
Connect, Test, Optimize: The Ultimate Kafka Connector Benchmarking Toolkit
Connect, Test, Optimize: The Ultimate Kafka Connector Benchmarking ToolkitConnect, Test, Optimize: The Ultimate Kafka Connector Benchmarking Toolkit
Connect, Test, Optimize: The Ultimate Kafka Connector Benchmarking ToolkitHostedbyConfluent
 
Webinar: Does Your Data Center Need NVMe?
Webinar: Does Your Data Center Need NVMe?Webinar: Does Your Data Center Need NVMe?
Webinar: Does Your Data Center Need NVMe?Storage Switzerland
 
Automatic Load Test Verification Using Control Charts
Automatic Load Test Verification Using Control ChartsAutomatic Load Test Verification Using Control Charts
Automatic Load Test Verification Using Control ChartsSAIL_QU
 
Clipper: A Low-Latency Online Prediction Serving System
Clipper: A Low-Latency Online Prediction Serving SystemClipper: A Low-Latency Online Prediction Serving System
Clipper: A Low-Latency Online Prediction Serving SystemDatabricks
 
STAR: Stack Trace based Automatic Crash Reproduction
STAR: Stack Trace based Automatic Crash ReproductionSTAR: Stack Trace based Automatic Crash Reproduction
STAR: Stack Trace based Automatic Crash ReproductionSung Kim
 
Alex Smola, Professor in the Machine Learning Department, Carnegie Mellon Uni...
Alex Smola, Professor in the Machine Learning Department, Carnegie Mellon Uni...Alex Smola, Professor in the Machine Learning Department, Carnegie Mellon Uni...
Alex Smola, Professor in the Machine Learning Department, Carnegie Mellon Uni...MLconf
 

Similar to Continuous Performance Testing for Microservices (20)

Issre2010 malik
Issre2010 malikIssre2010 malik
Issre2010 malik
 
Testing using load runner performance testing
Testing using load runner  performance testingTesting using load runner  performance testing
Testing using load runner performance testing
 
How To Combine Back-End 
 & Front-End Testing with BlazeMeter & Sauce Labs
How To Combine Back-End 
 & Front-End Testing with BlazeMeter & Sauce LabsHow To Combine Back-End 
 & Front-End Testing with BlazeMeter & Sauce Labs
How To Combine Back-End 
 & Front-End Testing with BlazeMeter & Sauce Labs
 
Gatling workshop lets test17
Gatling workshop lets test17Gatling workshop lets test17
Gatling workshop lets test17
 
Performancetestingjmeter 131210111657-phpapp02
Performancetestingjmeter 131210111657-phpapp02Performancetestingjmeter 131210111657-phpapp02
Performancetestingjmeter 131210111657-phpapp02
 
Keynote VST2020 (Workshop on Validation, Analysis and Evolution of Software ...
Keynote VST2020 (Workshop on  Validation, Analysis and Evolution of Software ...Keynote VST2020 (Workshop on  Validation, Analysis and Evolution of Software ...
Keynote VST2020 (Workshop on Validation, Analysis and Evolution of Software ...
 
How to make a Load Testing with Visual Studio 2012
How to make a Load Testing with Visual Studio 2012How to make a Load Testing with Visual Studio 2012
How to make a Load Testing with Visual Studio 2012
 
It Does What You Say, Not What You Mean: Lessons From A Decade of Program Repair
It Does What You Say, Not What You Mean: Lessons From A Decade of Program RepairIt Does What You Say, Not What You Mean: Lessons From A Decade of Program Repair
It Does What You Say, Not What You Mean: Lessons From A Decade of Program Repair
 
Level Up Your Integration Testing With Testcontainers
Level Up Your Integration Testing With TestcontainersLevel Up Your Integration Testing With Testcontainers
Level Up Your Integration Testing With Testcontainers
 
Data Virtualization: Revolutionizing data cloning
Data Virtualization: Revolutionizing data cloningData Virtualization: Revolutionizing data cloning
Data Virtualization: Revolutionizing data cloning
 
Ensuring OpenStack Version up Compatibility for CloudOpen Japan 2013-05-31
Ensuring OpenStack Version up Compatibility for CloudOpen Japan 2013-05-31Ensuring OpenStack Version up Compatibility for CloudOpen Japan 2013-05-31
Ensuring OpenStack Version up Compatibility for CloudOpen Japan 2013-05-31
 
Jvm Performance Tunning
Jvm Performance TunningJvm Performance Tunning
Jvm Performance Tunning
 
Jvm Performance Tunning
Jvm Performance TunningJvm Performance Tunning
Jvm Performance Tunning
 
Connect, Test, Optimize: The Ultimate Kafka Connector Benchmarking Toolkit
Connect, Test, Optimize: The Ultimate Kafka Connector Benchmarking ToolkitConnect, Test, Optimize: The Ultimate Kafka Connector Benchmarking Toolkit
Connect, Test, Optimize: The Ultimate Kafka Connector Benchmarking Toolkit
 
Load testing Java & Docker
Load testing Java & DockerLoad testing Java & Docker
Load testing Java & Docker
 
Webinar: Does Your Data Center Need NVMe?
Webinar: Does Your Data Center Need NVMe?Webinar: Does Your Data Center Need NVMe?
Webinar: Does Your Data Center Need NVMe?
 
Automatic Load Test Verification Using Control Charts
Automatic Load Test Verification Using Control ChartsAutomatic Load Test Verification Using Control Charts
Automatic Load Test Verification Using Control Charts
 
Clipper: A Low-Latency Online Prediction Serving System
Clipper: A Low-Latency Online Prediction Serving SystemClipper: A Low-Latency Online Prediction Serving System
Clipper: A Low-Latency Online Prediction Serving System
 
STAR: Stack Trace based Automatic Crash Reproduction
STAR: Stack Trace based Automatic Crash ReproductionSTAR: Stack Trace based Automatic Crash Reproduction
STAR: Stack Trace based Automatic Crash Reproduction
 
Alex Smola, Professor in the Machine Learning Department, Carnegie Mellon Uni...
Alex Smola, Professor in the Machine Learning Department, Carnegie Mellon Uni...Alex Smola, Professor in the Machine Learning Department, Carnegie Mellon Uni...
Alex Smola, Professor in the Machine Learning Department, Carnegie Mellon Uni...
 

More from Vincenzo Ferme

Estimating the Cost for Executing Business Processes in the Cloud
Estimating the Cost for Executing Business Processes in the CloudEstimating the Cost for Executing Business Processes in the Cloud
Estimating the Cost for Executing Business Processes in the CloudVincenzo Ferme
 
Workflow Engine Performance Benchmarking with BenchFlow
Workflow Engine Performance Benchmarking with BenchFlowWorkflow Engine Performance Benchmarking with BenchFlow
Workflow Engine Performance Benchmarking with BenchFlowVincenzo Ferme
 
Using Docker Containers to Improve Reproducibility in Software and Web Engine...
Using Docker Containers to Improve Reproducibility in Software and Web Engine...Using Docker Containers to Improve Reproducibility in Software and Web Engine...
Using Docker Containers to Improve Reproducibility in Software and Web Engine...Vincenzo Ferme
 
A Container-Centric Methodology for Benchmarking Workflow Management Systems
A Container-Centric Methodology for Benchmarking Workflow Management SystemsA Container-Centric Methodology for Benchmarking Workflow Management Systems
A Container-Centric Methodology for Benchmarking Workflow Management SystemsVincenzo Ferme
 
Towards a Benchmark for BPMN Engines
Towards a Benchmark for BPMN EnginesTowards a Benchmark for BPMN Engines
Towards a Benchmark for BPMN EnginesVincenzo Ferme
 
BenchFlow, a Framework for Benchmarking BPMN 2.0 Workflow Management Systems
BenchFlow, a Framework for Benchmarking BPMN 2.0 Workflow Management SystemsBenchFlow, a Framework for Benchmarking BPMN 2.0 Workflow Management Systems
BenchFlow, a Framework for Benchmarking BPMN 2.0 Workflow Management SystemsVincenzo Ferme
 
On the Road to Benchmarking BPMN 2.0 Workflow Engines
On the Road to Benchmarking BPMN 2.0 Workflow EnginesOn the Road to Benchmarking BPMN 2.0 Workflow Engines
On the Road to Benchmarking BPMN 2.0 Workflow EnginesVincenzo Ferme
 

More from Vincenzo Ferme (8)

Estimating the Cost for Executing Business Processes in the Cloud
Estimating the Cost for Executing Business Processes in the CloudEstimating the Cost for Executing Business Processes in the Cloud
Estimating the Cost for Executing Business Processes in the Cloud
 
Workflow Engine Performance Benchmarking with BenchFlow
Workflow Engine Performance Benchmarking with BenchFlowWorkflow Engine Performance Benchmarking with BenchFlow
Workflow Engine Performance Benchmarking with BenchFlow
 
Using Docker Containers to Improve Reproducibility in Software and Web Engine...
Using Docker Containers to Improve Reproducibility in Software and Web Engine...Using Docker Containers to Improve Reproducibility in Software and Web Engine...
Using Docker Containers to Improve Reproducibility in Software and Web Engine...
 
A Container-Centric Methodology for Benchmarking Workflow Management Systems
A Container-Centric Methodology for Benchmarking Workflow Management SystemsA Container-Centric Methodology for Benchmarking Workflow Management Systems
A Container-Centric Methodology for Benchmarking Workflow Management Systems
 
Towards a Benchmark for BPMN Engines
Towards a Benchmark for BPMN EnginesTowards a Benchmark for BPMN Engines
Towards a Benchmark for BPMN Engines
 
BenchFlow, a Framework for Benchmarking BPMN 2.0 Workflow Management Systems
BenchFlow, a Framework for Benchmarking BPMN 2.0 Workflow Management SystemsBenchFlow, a Framework for Benchmarking BPMN 2.0 Workflow Management Systems
BenchFlow, a Framework for Benchmarking BPMN 2.0 Workflow Management Systems
 
On the Road to Benchmarking BPMN 2.0 Workflow Engines
On the Road to Benchmarking BPMN 2.0 Workflow EnginesOn the Road to Benchmarking BPMN 2.0 Workflow Engines
On the Road to Benchmarking BPMN 2.0 Workflow Engines
 
Open Data
Open DataOpen Data
Open Data
 

Recently uploaded

The Fit for Passkeys for Employee and Consumer Sign-ins: FIDO Paris Seminar.pptx
The Fit for Passkeys for Employee and Consumer Sign-ins: FIDO Paris Seminar.pptxThe Fit for Passkeys for Employee and Consumer Sign-ins: FIDO Paris Seminar.pptx
The Fit for Passkeys for Employee and Consumer Sign-ins: FIDO Paris Seminar.pptxLoriGlavin3
 
Rise of the Machines: Known As Drones...
Rise of the Machines: Known As Drones...Rise of the Machines: Known As Drones...
Rise of the Machines: Known As Drones...Rick Flair
 
2024 April Patch Tuesday
2024 April Patch Tuesday2024 April Patch Tuesday
2024 April Patch TuesdayIvanti
 
Take control of your SAP testing with UiPath Test Suite
Take control of your SAP testing with UiPath Test SuiteTake control of your SAP testing with UiPath Test Suite
Take control of your SAP testing with UiPath Test SuiteDianaGray10
 
The Future Roadmap for the Composable Data Stack - Wes McKinney - Data Counci...
The Future Roadmap for the Composable Data Stack - Wes McKinney - Data Counci...The Future Roadmap for the Composable Data Stack - Wes McKinney - Data Counci...
The Future Roadmap for the Composable Data Stack - Wes McKinney - Data Counci...Wes McKinney
 
The Ultimate Guide to Choosing WordPress Pros and Cons
The Ultimate Guide to Choosing WordPress Pros and ConsThe Ultimate Guide to Choosing WordPress Pros and Cons
The Ultimate Guide to Choosing WordPress Pros and ConsPixlogix Infotech
 
Generative AI for Technical Writer or Information Developers
Generative AI for Technical Writer or Information DevelopersGenerative AI for Technical Writer or Information Developers
Generative AI for Technical Writer or Information DevelopersRaghuram Pandurangan
 
(How to Program) Paul Deitel, Harvey Deitel-Java How to Program, Early Object...
(How to Program) Paul Deitel, Harvey Deitel-Java How to Program, Early Object...(How to Program) Paul Deitel, Harvey Deitel-Java How to Program, Early Object...
(How to Program) Paul Deitel, Harvey Deitel-Java How to Program, Early Object...AliaaTarek5
 
[Webinar] SpiraTest - Setting New Standards in Quality Assurance
[Webinar] SpiraTest - Setting New Standards in Quality Assurance[Webinar] SpiraTest - Setting New Standards in Quality Assurance
[Webinar] SpiraTest - Setting New Standards in Quality AssuranceInflectra
 
So einfach geht modernes Roaming fuer Notes und Nomad.pdf
So einfach geht modernes Roaming fuer Notes und Nomad.pdfSo einfach geht modernes Roaming fuer Notes und Nomad.pdf
So einfach geht modernes Roaming fuer Notes und Nomad.pdfpanagenda
 
A Framework for Development in the AI Age
A Framework for Development in the AI AgeA Framework for Development in the AI Age
A Framework for Development in the AI AgeCprime
 
Testing tools and AI - ideas what to try with some tool examples
Testing tools and AI - ideas what to try with some tool examplesTesting tools and AI - ideas what to try with some tool examples
Testing tools and AI - ideas what to try with some tool examplesKari Kakkonen
 
Modern Roaming for Notes and Nomad – Cheaper Faster Better Stronger
Modern Roaming for Notes and Nomad – Cheaper Faster Better StrongerModern Roaming for Notes and Nomad – Cheaper Faster Better Stronger
Modern Roaming for Notes and Nomad – Cheaper Faster Better Strongerpanagenda
 
Sample pptx for embedding into website for demo
Sample pptx for embedding into website for demoSample pptx for embedding into website for demo
Sample pptx for embedding into website for demoHarshalMandlekar2
 
A Deep Dive on Passkeys: FIDO Paris Seminar.pptx
A Deep Dive on Passkeys: FIDO Paris Seminar.pptxA Deep Dive on Passkeys: FIDO Paris Seminar.pptx
A Deep Dive on Passkeys: FIDO Paris Seminar.pptxLoriGlavin3
 
Emixa Mendix Meetup 11 April 2024 about Mendix Native development
Emixa Mendix Meetup 11 April 2024 about Mendix Native developmentEmixa Mendix Meetup 11 April 2024 about Mendix Native development
Emixa Mendix Meetup 11 April 2024 about Mendix Native developmentPim van der Noll
 
Why device, WIFI, and ISP insights are crucial to supporting remote Microsoft...
Why device, WIFI, and ISP insights are crucial to supporting remote Microsoft...Why device, WIFI, and ISP insights are crucial to supporting remote Microsoft...
Why device, WIFI, and ISP insights are crucial to supporting remote Microsoft...panagenda
 
Merck Moving Beyond Passwords: FIDO Paris Seminar.pptx
Merck Moving Beyond Passwords: FIDO Paris Seminar.pptxMerck Moving Beyond Passwords: FIDO Paris Seminar.pptx
Merck Moving Beyond Passwords: FIDO Paris Seminar.pptxLoriGlavin3
 
Unleashing Real-time Insights with ClickHouse_ Navigating the Landscape in 20...
Unleashing Real-time Insights with ClickHouse_ Navigating the Landscape in 20...Unleashing Real-time Insights with ClickHouse_ Navigating the Landscape in 20...
Unleashing Real-time Insights with ClickHouse_ Navigating the Landscape in 20...Alkin Tezuysal
 
Genislab builds better products and faster go-to-market with Lean project man...
Genislab builds better products and faster go-to-market with Lean project man...Genislab builds better products and faster go-to-market with Lean project man...
Genislab builds better products and faster go-to-market with Lean project man...Farhan Tariq
 

Recently uploaded (20)

The Fit for Passkeys for Employee and Consumer Sign-ins: FIDO Paris Seminar.pptx
The Fit for Passkeys for Employee and Consumer Sign-ins: FIDO Paris Seminar.pptxThe Fit for Passkeys for Employee and Consumer Sign-ins: FIDO Paris Seminar.pptx
The Fit for Passkeys for Employee and Consumer Sign-ins: FIDO Paris Seminar.pptx
 
Rise of the Machines: Known As Drones...
Rise of the Machines: Known As Drones...Rise of the Machines: Known As Drones...
Rise of the Machines: Known As Drones...
 
2024 April Patch Tuesday
2024 April Patch Tuesday2024 April Patch Tuesday
2024 April Patch Tuesday
 
Take control of your SAP testing with UiPath Test Suite
Take control of your SAP testing with UiPath Test SuiteTake control of your SAP testing with UiPath Test Suite
Take control of your SAP testing with UiPath Test Suite
 
The Future Roadmap for the Composable Data Stack - Wes McKinney - Data Counci...
The Future Roadmap for the Composable Data Stack - Wes McKinney - Data Counci...The Future Roadmap for the Composable Data Stack - Wes McKinney - Data Counci...
The Future Roadmap for the Composable Data Stack - Wes McKinney - Data Counci...
 
The Ultimate Guide to Choosing WordPress Pros and Cons
The Ultimate Guide to Choosing WordPress Pros and ConsThe Ultimate Guide to Choosing WordPress Pros and Cons
The Ultimate Guide to Choosing WordPress Pros and Cons
 
Generative AI for Technical Writer or Information Developers
Generative AI for Technical Writer or Information DevelopersGenerative AI for Technical Writer or Information Developers
Generative AI for Technical Writer or Information Developers
 
(How to Program) Paul Deitel, Harvey Deitel-Java How to Program, Early Object...
(How to Program) Paul Deitel, Harvey Deitel-Java How to Program, Early Object...(How to Program) Paul Deitel, Harvey Deitel-Java How to Program, Early Object...
(How to Program) Paul Deitel, Harvey Deitel-Java How to Program, Early Object...
 
[Webinar] SpiraTest - Setting New Standards in Quality Assurance
[Webinar] SpiraTest - Setting New Standards in Quality Assurance[Webinar] SpiraTest - Setting New Standards in Quality Assurance
[Webinar] SpiraTest - Setting New Standards in Quality Assurance
 
So einfach geht modernes Roaming fuer Notes und Nomad.pdf
So einfach geht modernes Roaming fuer Notes und Nomad.pdfSo einfach geht modernes Roaming fuer Notes und Nomad.pdf
So einfach geht modernes Roaming fuer Notes und Nomad.pdf
 
A Framework for Development in the AI Age
A Framework for Development in the AI AgeA Framework for Development in the AI Age
A Framework for Development in the AI Age
 
Testing tools and AI - ideas what to try with some tool examples
Testing tools and AI - ideas what to try with some tool examplesTesting tools and AI - ideas what to try with some tool examples
Testing tools and AI - ideas what to try with some tool examples
 
Modern Roaming for Notes and Nomad – Cheaper Faster Better Stronger
Modern Roaming for Notes and Nomad – Cheaper Faster Better StrongerModern Roaming for Notes and Nomad – Cheaper Faster Better Stronger
Modern Roaming for Notes and Nomad – Cheaper Faster Better Stronger
 
Sample pptx for embedding into website for demo
Sample pptx for embedding into website for demoSample pptx for embedding into website for demo
Sample pptx for embedding into website for demo
 
A Deep Dive on Passkeys: FIDO Paris Seminar.pptx
A Deep Dive on Passkeys: FIDO Paris Seminar.pptxA Deep Dive on Passkeys: FIDO Paris Seminar.pptx
A Deep Dive on Passkeys: FIDO Paris Seminar.pptx
 
Emixa Mendix Meetup 11 April 2024 about Mendix Native development
Emixa Mendix Meetup 11 April 2024 about Mendix Native developmentEmixa Mendix Meetup 11 April 2024 about Mendix Native development
Emixa Mendix Meetup 11 April 2024 about Mendix Native development
 
Why device, WIFI, and ISP insights are crucial to supporting remote Microsoft...
Why device, WIFI, and ISP insights are crucial to supporting remote Microsoft...Why device, WIFI, and ISP insights are crucial to supporting remote Microsoft...
Why device, WIFI, and ISP insights are crucial to supporting remote Microsoft...
 
Merck Moving Beyond Passwords: FIDO Paris Seminar.pptx
Merck Moving Beyond Passwords: FIDO Paris Seminar.pptxMerck Moving Beyond Passwords: FIDO Paris Seminar.pptx
Merck Moving Beyond Passwords: FIDO Paris Seminar.pptx
 
Unleashing Real-time Insights with ClickHouse_ Navigating the Landscape in 20...
Unleashing Real-time Insights with ClickHouse_ Navigating the Landscape in 20...Unleashing Real-time Insights with ClickHouse_ Navigating the Landscape in 20...
Unleashing Real-time Insights with ClickHouse_ Navigating the Landscape in 20...
 
Genislab builds better products and faster go-to-market with Lean project man...
Genislab builds better products and faster go-to-market with Lean project man...Genislab builds better products and faster go-to-market with Lean project man...
Genislab builds better products and faster go-to-market with Lean project man...
 

Continuous Performance Testing for Microservices

  • 1. Continuous Performance Testing for Microservices Vincenzo Ferme Reliable Software Systems Institute for Software Technologies University of Stuttgart, Germany HPI Future SOC Lab Day info@vincenzoferme.it
  • 2. 2 1. Continuous Software Development Environments
 Outline
  • 3. 2 1. Continuous Software Development Environments
 2. Continuous Performance Testing of Microservices: a. Challenges b. Our Approach c. Preliminary Results d. BenchFlow Automation Framework
 Outline
  • 4. 2 1. Continuous Software Development Environments
 2. Continuous Performance Testing of Microservices: a. Challenges b. Our Approach c. Preliminary Results d. BenchFlow Automation Framework
 3. Approach Extensions: a. ContinuITy Framework
 Outline
  • 5. 2 1. Continuous Software Development Environments
 2. Continuous Performance Testing of Microservices: a. Challenges b. Our Approach c. Preliminary Results d. BenchFlow Automation Framework
 3. Approach Extensions: a. ContinuITy Framework
 4. Future Work Outline
  • 6. 3 Continuous Software Development Environments Repo Developers, Testers, Architects
  • 7. 3 Continuous Software Development Environments CI ServerRepo Developers, Testers, Architects
  • 8. 3 Continuous Software Development Environments CI ServerRepo Developers, Testers, Architects CD Server
  • 9. 3 Continuous Software Development Environments CI ServerRepo Developers, Testers, Architects Production CD Server
  • 10. 4 The system changes frequently:
 - Need to keep the test definition updated (manual is not an option)
 Challenges
  • 11. 4 The system changes frequently:
 - Need to keep the test definition updated (manual is not an option)
 Too many tests to be executed:
 - Need to reduce the number of executed tests
 Challenges
  • 12. 4 The system changes frequently:
 - Need to keep the test definition updated (manual is not an option)
 Too many tests to be executed:
 - Need to reduce the number of executed tests
 Need to integrate tests in delivery pipelines:
 - Need to have complete automation for test execution
 - Need to have clear KPIs to be evaluated in the pipeline Challenges
  • 14. 5 Our Approach Observed load situations Time LoadLevel Production
  • 15. 5 Our Approach Observed load situations Time LoadLevel Production Empirical Distribution of Load situations 50 200 350 500 Load Level Rel.Freq. 1
  • 16. 5 Our Approach Observed load situations Time LoadLevel Production Empirical Distribution of Load situations 50 200 350 500 Load Level Rel.Freq. 1 Empirical Distribution of Load situations 100 300 500 Sampled Load Level Aggr.Rel.Freq. 2
  • 17. 5 Our Approach Observed load situations Time LoadLevel Production Sampled Load Tests Empirical Distribution of Load situations 50 200 350 500 Load Level Rel.Freq. 1 Empirical Distribution of Load situations 100 300 500 Sampled Load Level Aggr.Rel.Freq. 2
  • 18. 5 Our Approach Observed load situations Time LoadLevel Production Sampled Load TestsDeployment Conf. Empirical Distribution of Load situations 50 200 350 500 Load Level Rel.Freq. 1 Empirical Distribution of Load situations 100 300 500 Sampled Load Level Aggr.Rel.Freq. 2
  • 19. 5 Our Approach Observed load situations Time LoadLevel Production Baseline Test Sampled Load TestsDeployment Conf. Empirical Distribution of Load situations 50 200 350 500 Load Level Rel.Freq. 1 Empirical Distribution of Load situations 100 300 500 Sampled Load Level Aggr.Rel.Freq. 2
  • 20. 5 Our Approach Observed load situations Time LoadLevel Production Baseline Test Sampled Load Tests Scalability Criteria Deployment Conf. Empirical Distribution of Load situations 50 200 350 500 Load Level Rel.Freq. 1 Empirical Distribution of Load situations 100 300 500 Sampled Load Level Aggr.Rel.Freq. 2
  • 21. 5 Our Approach Observed load situations Time LoadLevel Production Baseline Test Sampled Load Tests Scalability Criteria Deployment Conf. Empirical Distribution of Load situations 50 200 350 500 Load Level Rel.Freq. 1 Empirical Distribution of Load situations 100 300 500 Sampled Load Level Aggr.Rel.Freq. 2 0.12 0.14 0.20 0.16 0.11 Test Results 3
  • 22. 5 Our Approach Observed load situations Time LoadLevel Production Baseline Test Sampled Load Tests Scalability Criteria Deployment Conf. Empirical Distribution of Load situations 50 200 350 500 Load Level Rel.Freq. 1 Empirical Distribution of Load situations 100 300 500 Sampled Load Level Aggr.Rel.Freq. 2 0.12 0.14 0.20 0.16 0.11 Test Results 3 Domain Metric 0.73 4
 • 24-27. Preliminary Experiments: setup
Production system: 12 microservices, exercised with a custom operation mix.
Sampled load tests (steps 1, 2): empirical distribution of load situations sampled at 6 load levels (50, 100, 150, 200, 250, 300 users).
Deployment configurations (step 3): 10 configurations, varying replicas, CPU, and RAM.
Stability bound for the scalability check: Stab = avg + 3σ (one possible PASS/FAIL check is sketched below).
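The slide gives the stability bound Stab = avg + 3σ without further detail; a plausible reading, sketched below under that assumption, is that an API call passes its scalability criterion at a given load level when its response times stay within the bound computed from the baseline test (function names and the exact comparison are illustrative):

```python
# Minimal sketch, assuming the scalability criterion compares response times
# under a sampled load level against a stability bound Stab = avg + 3*sigma
# derived from the baseline test. Names are illustrative, not from the paper.
import statistics

def stability_bound(baseline_response_times):
    avg = statistics.mean(baseline_response_times)
    sigma = statistics.pstdev(baseline_response_times)
    return avg + 3 * sigma

def passes_scalability_criterion(load_response_times, baseline_response_times):
    # One possible interpretation: the API call passes if its mean response
    # time under load does not exceed the baseline stability bound.
    return statistics.mean(load_response_times) <= stability_bound(baseline_response_times)

baseline = [120, 118, 125, 122, 119]    # ms, measured in the baseline test
under_load = [130, 128, 135, 140, 133]  # ms, measured at a sampled load level
print(passes_scalability_criterion(under_load, baseline))
```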
 • 29. Experiments infrastructure: load drivers (virtual users) on a 32 GB RAM, 24-core machine; system under test on an 896 GB RAM, 80-core machine.
 • 30-35. Preliminary Experiments Results (deployment configuration: 1 GB RAM, 0.25 CPU, 1 replica)
Custom operation mix, scalability criterion per API call:
  GET /       PASS
  GET /cart   PASS
  POST /item  FAIL
Aggregated relative frequency per load level:
  Users   Aggr. Rel. Freq.
  50      0.10582
  100     0.18519
  150     0.22222
  200     0.22222
  250     0.20370
  300     0.06085
Contribution to the Domain Metric for the 250-user level: maximum possible 0.20370 (its aggregated relative frequency), actual 0.13580, reduced because POST /item fails its scalability criterion (2 of 3 API calls pass, and 0.20370 x 2/3 = 0.13580).
 • 36-39. Preliminary Experiments Results (deployment configuration: 1 GB RAM, 0.25 CPU, 1 replica)
Contribution to the Domain Metric per load level:
  Users   Contribution
  50      0.10582
  100     0.18519
  150     0.22222
  200     0.07999
  250     0.13580
  300     0.04729
Domain Metric (step 4): 0.77631 out of a maximum of 1, i.e. the sum of the per-level contributions (reproduced in the sketch below).
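The figures above are consistent with the following reading (an assumption, not the paper's formal definition): each load level can contribute at most its aggregated relative frequency, the contribution shrinks when API calls fail their scalability criteria at that level, and the Domain Metric is the sum of the per-level contributions. The sketch below reproduces the reported 0.77631; the passed-weight values for 200 and 300 users are back-computed from the contributions above because the operation-mix weights are not shown.

```python
# Minimal sketch (assumption, not the paper's formal definition): the Domain
# Metric sums, over the sampled load levels, each level's aggregated relative
# frequency scaled by the share of API calls meeting their scalability criteria.
aggr_rel_freq = {50: 0.10582, 100: 0.18519, 150: 0.22222,
                 200: 0.22222, 250: 0.20370, 300: 0.06085}

# Weighted share of passing API calls per level. 50-150: everything passes.
# 250: POST /item fails, leaving 2/3 of equally weighted calls
# (0.20370 * 2/3 = 0.13580, matching the Max/Actual figures above).
# 200 and 300 are back-computed from the listed contributions, since the
# operation-mix weights are not shown on the slides.
passed_weight = {50: 1.0, 100: 1.0, 150: 1.0,
                 200: 0.07999 / 0.22222, 250: 2 / 3, 300: 0.04729 / 0.06085}

contributions = {users: aggr_rel_freq[users] * passed_weight[users]
                 for users in aggr_rel_freq}
domain_metric = sum(contributions.values())
print(round(domain_metric, 5))  # 0.77631, the reported Domain Metric
```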
 • 47-55. BenchFlow Automation Framework
The BenchFlow DSL (BF DSL) covers:
- Workload Modeling
- SUT Configuration
- SUT Deployment
- BenchFlow Adapters
Test execution is driven by Faban, data collection by Minio, followed by data analysis.
https://github.com/benchflow
 • 56-63. BenchFlow Automation Framework: test lifecycle [Ferme et al.]
Generation: Test Generation produces a Test Bundle, from which Experiment Generation derives an Experiment Bundle.
Execution: Experiment Execution runs the experiments, reporting failures and handing successful runs on.
Analysis: Result Analysis computes the Metrics.
[Ferme et al.] Vincenzo Ferme and Cesare Pautasso. A Declarative Approach for Performance Tests Execution in Continuous Software Development Environments. In Proc. of ICPE 2018. 261-272.
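Purely as an illustration of the flow above, a schematic, runnable sketch of the generation, execution, and analysis phases (all names and data shapes are invented for this example and do not reflect BenchFlow's actual API):

```python
# Schematic sketch of the lifecycle described above (generation -> execution
# -> analysis). All names are illustrative; this is not BenchFlow's API.
from dataclasses import dataclass

@dataclass
class ExperimentOutcome:
    load_level: int
    failed: bool
    mean_response_time_ms: float = 0.0

def generate_experiments(test_definition):
    # Generation: expand a declarative test definition into one experiment
    # per sampled load level (the "experiment bundle").
    return [{"load_level": users} for users in test_definition["load_levels"]]

def execute_experiment(experiment):
    # Execution: run the experiment against the SUT. Here the outcome is
    # faked; in practice this drives the load and collects raw data.
    users = experiment["load_level"]
    return ExperimentOutcome(load_level=users, failed=users > 250,
                             mean_response_time_ms=100 + users * 0.1)

def analyse_results(outcomes):
    # Analysis: compute metrics from the successful runs only.
    return {o.load_level: o.mean_response_time_ms
            for o in outcomes if not o.failed}

test_definition = {"load_levels": [50, 100, 150, 200, 250, 300]}
outcomes = [execute_experiment(e) for e in generate_experiments(test_definition)]
print(analyse_results(outcomes))
```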
 • 65-69, 73-74. Approach: Summary
Production load situations feed the sampled load tests via the empirical distribution of load situations (aggregated relative frequency per sampled load level; steps 1, 2). BenchFlow executes the baseline test and the sampled load tests against the deployment configuration, evaluating the scalability criteria (step 3), and the results are aggregated into the Domain Metric, e.g. 0.73 (step 4).
Examples of use of the Domain Metric KPI:
1. Decide if the system is ready for deployment (sketched in the example below)
2. Assess different deployment configurations
3. Configure scaling plans and autoscalers
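A minimal sketch of use case 1: gating a delivery pipeline on the Domain Metric. The threshold value and the exit-code convention are illustrative choices, not part of the approach:

```python
# Minimal sketch of use case 1: gate a CI/CD pipeline on the Domain Metric.
# The 0.75 threshold and the exit-code convention are illustrative only.
import sys

def deployment_gate(domain_metric, threshold=0.75):
    if domain_metric >= threshold:
        print(f"PASS: Domain Metric {domain_metric:.5f} >= {threshold}")
        return 0
    print(f"FAIL: Domain Metric {domain_metric:.5f} < {threshold}")
    return 1

if __name__ == "__main__":
    # e.g. 0.77631 for the 1 GB RAM / 0.25 CPU / 1 replica configuration above
    sys.exit(deployment_gate(0.77631))
```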
 • 75-77. Approach: Extensions (the end-to-end diagram again, highlighting the parts to be extended: observed load situations, empirical distribution, sampled load tests, test results, Domain Metric, baseline test, scalability criteria, deployment configuration)
 • 78-82. ContinuITy Framework [Schulz et al.]
Production monitoring data is turned into the sampled load tests (load levels, API calls, operation mix) by combining a Workload Model, an APIs Specification, and an Annotations Model.
[Schulz et al.] Henning Schulz, Tobias Angerstein, and André van Hoorn. Towards Automating Representative Load Testing in Continuous Software Engineering. In Proc. of ICPE 2018 Companion. 123-126.
 • 83-86. Future Work
Automated update of test definitions:
- Automatically update load levels, API calls, and the operation mix
Declarative definitions for load testing:
- Make performance engineering activities more accessible
Declarative performance test regression:
- Extend the proposed approach to continuously assess performance regressions and PASS/FAIL CI/CD pipelines
Prioritisation and selection of load tests:
- Exploit performance models to further reduce the number of tests
 • 87-90. Highlights (recap)
Challenges: the system changes frequently (test definitions must be kept up to date); too many tests to execute (their number must be reduced); tests must be integrated into delivery pipelines (complete automation of test execution, clear KPIs evaluated in the pipeline).
Our Approach: observed production load situations, empirical distribution of load levels, sampled load tests run against a deployment configuration with a baseline test and scalability criteria, test results aggregated into the Domain Metric.
Preliminary Experiments: 12 microservices, custom operation mix, 6 load levels (50 to 300 users), 10 deployment configurations (replicas, CPU, RAM), stability bound Stab = avg + 3σ.
Extensions and Future Work: automated updates of test definitions (ContinuITy), declarative load testing, performance regression checks in CI/CD, and prioritisation/selection of load tests.