ON THE ROAD TO BENCHMARKING BPMN 2.0 WORKFLOW ENGINES

Uploaded by vincenzo-ferme on 18 July 2015 (category: Technology)

TRANSCRIPT

Architecture, Design and Web Information Systems Engineering Group

Vincenzo Ferme, Cesare Pautasso
Faculty of Informatics, University of Lugano (USI), Switzerland

Marigianna Skouradaki, Dieter H. Roller, Frank Leymann
Institute of Architecture and Application Systems, University of Stuttgart, Germany

ON THE ROAD TO BENCHMARKING BPMN 2.0 WORKFLOW ENGINES

What is a Workflow Engine?

Outline: Context » BPMN 2.0 Adoption » Why a Benchmark? » Challenges » BenchFlow » Next Steps » Conclusions

[Architecture figure: the Workflow Engine is deployed on an Application Server with a DBMS. Its components: Core Engine, Process Navigator, Job Executor, Task Dispatcher (towards the Users), Service Invoker (towards the Web Services), Transaction Manager, and Persistence Manager backed by the Instance Database; an example process model with tasks A, B, C, D.]
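The navigator's job can be pictured with a minimal, hypothetical sketch (all names invented for illustration; a real engine adds transactions, persistence, and asynchronous job execution): it walks the control flow of a deployed process model and hands each task to a dispatcher.

```python
# Minimal process-navigator sketch (hypothetical, for illustration only):
# the navigator follows the sequence flow of a deployed process model and
# hands each task to a dispatcher, recording state after every step.

def navigate(process, instance_state, dispatch):
    """Run one process instance to completion, returning the executed trace."""
    trace = []
    task = process["start"]
    while task is not None:
        dispatch(task, instance_state)           # Task Dispatcher / Service Invoker
        instance_state["last_completed"] = task  # what the Persistence Manager would store
        trace.append(task)
        task = process["flow"].get(task)         # follow the sequence flow
    return trace

# A toy sequential process A -> B -> C -> D, mirroring the slide's figure.
process = {"start": "A", "flow": {"A": "B", "B": "C", "C": "D", "D": None}}
state = {}
executed = navigate(process, state, dispatch=lambda t, s: None)
print(executed)  # ['A', 'B', 'C', 'D']
```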

Many Business Process Modeling/Execution Languages

[Timeline figure, 1992–2008: EPC, PNML, XPDL, BPEL, YAWL, BPMN]

BPMN 2.0: A Widely Adopted Standard

- Jan 2011: BPMN 2.0
- Jan 2014: BPMN 2.0.2
- ISO/IEC 19510
- List of engines: https://en.wikipedia.org/wiki/List_of_BPMN_2.0_engines


Why do we need a benchmark? (companies, developers)

1. How to choose the best engine according to the company requirements?
2. How to choose the best engine according to the company business process models?
3. How to evaluate performance improvements during the engine development?
4. How to find out the engine bottlenecks?


Main Challenges in Benchmarking BPMN 2.0 Workflow Engines

WORKLOAD CHARACTERIZATION
1. Define the Workload Mix (e.g., 80% of one process model, 20% of another)
2. Define the Load Functions

BENCHMARK EXECUTION
3. Deal with engine-specific interfaces and BPMN 2.0 customizations (Client → Engine, Engine ↔ Users, Engine ↔ Web Services)
4. Asynchronous execution of business processes (Instance Database)
5. Define meaningful and reliable KPIs
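A workload mix like the 80%/20% split in the slide can be represented as weights over process models. This is an illustrative sketch only, not BenchFlow's actual format or API:

```python
import random

# Sketch of a workload mix: each deployed process model gets a share of
# the started instances (the slide's example: 80% of one model, 20% of another).
workload_mix = {"model_A": 0.8, "model_D": 0.2}

def choose_model(mix, rng):
    """Pick which process model to instantiate next, honoring the mix weights."""
    models, weights = zip(*mix.items())
    return rng.choices(models, weights=weights, k=1)[0]

rng = random.Random(42)
sample = [choose_model(workload_mix, rng) for _ in range(1000)]
print(sample.count("model_A") / len(sample))  # close to 0.8
```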


1. Define the Workload Mix

[Figure: BPMN 2.0 language dimensions to cover in the mix: Control Flow, Data Flow, Events, Activities, Task Types, Execution Behavior]

1. Define the Workload Mix

[Chart: for each BPMN 2.0 feature, the number of engines supporting it (2–10) plotted against the number of real-world models using it (0–950)]

2. Define the Load Functions

[Figure: load reaches the Workflow Engine from three sources: Users, Web Services, and Events, each able to start process instances; the engine runs on an Application Server with a DBMS holding the Instance Database]
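A load function specifies how the arrival rate of new process instances evolves over time. The following sketch (hypothetical shape and parameter names) shows a common pattern: linear ramp-up followed by steady state.

```python
# Sketch of a load function (illustrative): the number of new process
# instances to start per second, as a function of elapsed benchmark time.

def ramp_then_steady(t, ramp_seconds=60, peak_rate=50.0):
    """Linear ramp-up to peak_rate over ramp_seconds, then constant load."""
    if t < 0:
        return 0.0
    if t < ramp_seconds:
        return peak_rate * t / ramp_seconds
    return peak_rate

# At t=30s we are halfway through the ramp; from t=60s on, full load.
print(ramp_then_steady(30))   # 25.0
print(ramp_then_steady(120))  # 50.0
```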

3. Deal with engine-specific interfaces and BPMN 2.0 customizations

[Figure: a Loading Driver takes the place of the Users in driving the Workflow Engine (Application Server, DBMS, Instance Database, Web Service)]


4. Asynchronous execution of processes

[Figure: the Loading Driver observes only the Start of each process instance; the End happens asynchronously inside the engine and is recorded in the Instance Database]
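Because the engine acknowledges only the start of an instance, completion must be detected out of band, for example by polling the instance database. A hypothetical sketch, with an in-memory dictionary standing in for that database:

```python
import time

# Sketch of completion detection for asynchronously executed processes:
# poll the instance database until the instance reaches the 'ENDED' state.
# `fake_instance_db` is a stand-in for the engine's real instance database.

def wait_for_completion(instance_db, instance_id, timeout_s=5.0, poll_s=0.01):
    """Poll until the instance ends; return False if the timeout expires."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        if instance_db.get(instance_id) == "ENDED":
            return True
        time.sleep(poll_s)
    return False

fake_instance_db = {"i-1": "RUNNING"}
fake_instance_db["i-1"] = "ENDED"  # the engine finishes asynchronously
print(wait_for_completion(fake_instance_db, "i-1"))  # True
```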

The BenchFlow Project

Design the first benchmark to assess and compare the performance of Workflow Engines that are compliant with the Business Process Model and Notation 2.0 (BPMN 2.0) standard.


1. Define the Workload Mix

What we need: even more (anonymized) real-world BPMN 2.0 process models.

[Figure, pipeline: REAL-WORLD PROCESSES → graph matching (Skouradaki et al. [SOSE2015]) → REOCCURRING STRUCTURES (a1–a6) → Selection Criteria and Composition Criteria → WORKLOAD MIX (e.g., 50% / 50% over the synthesized models)]
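The selection step in this pipeline can be pictured as filtering reoccurring structures by how widely they occur in the real-world collection. This is an illustrative sketch only; the actual extraction relies on sub-graph isomorphism as described in [SOSE2015], and the criteria here are invented:

```python
from collections import Counter

# Illustrative selection criterion (hypothetical): keep only reoccurring
# structures found in at least `min_models` distinct real-world models.

def select_structures(occurrences, min_models=2):
    """occurrences maps a structure id to the list of models it was found in."""
    counts = Counter({s: len(set(models)) for s, models in occurrences.items()})
    return sorted(s for s, n in counts.items() if n >= min_models)

occurrences = {
    "a1": ["m1", "m2", "m3"],  # found in three models
    "a2": ["m1"],              # found in one model only
    "a5": ["m2", "m4"],
}
print(select_structures(occurrences))  # ['a1', 'a5']
```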

Enabling the Benchmark Execution

[Figure: the benchmark environment: a Loading Driver simulating the Users, the Workflow Engine deployed on an Application Server with a DBMS and Instance Database, and a Web Service]

Enabling the Benchmark Execution

[Figure: the Faban harness runs Faban Drivers that apply the Loading Functions and the workload mix (process models A–D) against the Workflow Engine (Application Server, DBMS, Instance Database) and the Web Service, simulating the Users]

Enabling the Benchmark Execution

1. Flexible deployment
2. Flexible HW resources
3. Frozen initial condition

[Figure: the Faban harness, the Faban Drivers with the Loading Functions, the Workflow Engine, the DBMS, and the Web Service run as Docker containers distributed over multiple servers]
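Container-based deployment is what makes all three properties cheap: components can be placed on any server, and starting each run from fresh images restores the frozen initial condition. A hypothetical sketch (image names, server names, and the helper are all made up; not BenchFlow's tooling):

```python
# Hypothetical sketch of "flexible deployment": map each benchmark
# component to a Docker container on a chosen server. Re-running from
# fresh images gives every experiment the same frozen initial condition.

def docker_run_command(component, image, server):
    """Build the remote `docker run` invocation for one benchmark component."""
    return f"ssh {server} docker run -d --name {component} {image}"

deployment = {
    "engine": ("example/bpmn-engine", "server-1"),       # workflow engine under test
    "dbms": ("example/mysql", "server-2"),               # instance database
    "faban-driver": ("example/faban-driver", "server-3"),# load generation
}
commands = [docker_run_command(c, img, srv) for c, (img, srv) in deployment.items()]
print(commands[0])  # ssh server-1 docker run -d --name engine example/bpmn-engine
```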


Enabling the Benchmark Execution

1. Automatically deploy and start the benchmark environment;
2. Automatically deploy the workload mix;
3. Determine when the benchmark ends (MONITOR);
4. Collect the execution and process logs (COLLECTORS).
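The four steps above can be sketched as a driver loop. Every function name here is a hypothetical stand-in, wired to in-memory fakes so the control flow is visible:

```python
# Hypothetical sketch of the four automation steps (all callables are
# stand-ins passed in by the caller, not a real harness API).

def run_benchmark(deploy_env, deploy_mix, monitor_done, collect_logs):
    deploy_env()                 # 1. deploy and start the benchmark environment
    deploy_mix()                 # 2. deploy the workload mix on the engine
    while not monitor_done():    # 3. the monitor decides when the run ends
        pass
    return collect_logs()        # 4. gather execution and process logs

events = []
ticks = iter([False, False, True])  # monitor reports "done" on the third poll
logs = run_benchmark(
    deploy_env=lambda: events.append("env-up"),
    deploy_mix=lambda: events.append("mix-deployed"),
    monitor_done=lambda: next(ticks),
    collect_logs=lambda: events + ["logs-collected"],
)
print(logs)  # ['env-up', 'mix-deployed', 'logs-collected']
```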

The BenchFlow Project: Next Steps

• Release the first prototype of the benchmark environment
  » Yes: Abstract the Interaction with the Engines; Automatic Deploy and Undeploy of the S.U.T.; Execution and Process Log Gathering
  » No: Automatic Generation of Drivers; Users, Web Services and External Catching Business Events
• Release the first prototype of the Workload Mix synthesizer
• First Experiments with KPIs Definition and Computation
• Collect More Process Models and Process Execution Logs

BenchFlow Project: http://design.inf.usi.ch/research/projects/benchflow

BACKUP SLIDES

• Cited Works
• Related Works

Cited Works

[SOSE2015] Skouradaki, Marigianna; Goerlach, Katharina; Hahn, Michael; Leymann, Frank. Application of Sub-Graph Isomorphism to Extract Reoccurring Structures from BPMN 2.0 Process Models. In Proceedings of the 9th International IEEE Symposium on Service-Oriented System Engineering (SOSE 2015), San Francisco Bay, USA, March 30 - April 3, 2015. (to appear)

Related Works

Active Endpoints Inc. Assessing ActiveVOS performance, 2011. http://www.activevos.com/content/developers/technical_notes/assessing_activevos_performance.pdf.

D. Bianculli, W. Binder, and M. L. Drago. SOABench: Performance evaluation of service-oriented middleware made easy. In Proc. of ICSE’10 - Volume 2, pages 301–302, 2010.

J. Cardoso. Business process control-flow complexity: Metric, evaluation, and validation. International Journal of Web Services Research, 5(2):49–76, 2008.

G. Din, K.-P. Eckert, and I. Schieferdecker. A workload model for benchmarking BPEL engines. In Proc. of ICSTW’08, pages 356–360, 2008.

M. Dumas, L. García-Bañuelos, and R. M. Dijkman. Similarity search of business process models. IEEE Data Eng. Bull., 32(3):23–28, 2009.

J. Gray. The Benchmark Handbook for Database and Transaction Systems. Morgan Kaufmann, 2nd edition, 1992.

G. Hackmann, M. Haitjema, C. Gill, and G.-C. Roman. Sliver: A BPEL workflow process execution engine for mobile devices. In Proc. of ICSOC’06, pages 503–508. Springer, 2006.

S. Harrer, J. Lenhard, and G. Wirtz. BPEL conformance in open source engines. In Proc. of SOCA’12, pages 1–8, 2012.


K. Huppler. The art of building a good benchmark. In Performance Evaluation and Benchmarking, pages 18–30. Springer, 2009.

Intel and Cape Clear. BPEL scalability and performance testing. White paper, 2007.

F. Leymann. Managing business processes via workflow technology. In Proc. of VLDB 2001, pages 729–, 2001.

A. Liu, Q. Li, L. Huang, and M. Xiao. Facts: A framework for fault-tolerant composition of transactional web services. IEEE Trans. on Services Computing, 3(1):46–59, 2010.

J. Mendling. Metrics for Process Models: Empirical Foundations of Verification, Error Prediction, and Guidelines for Correctness. Springer, 2008.

I. Molyneaux. The Art of Application Performance Testing: Help for Programmers and Quality Assurance. O’Reilly, 2009.

M. Z. Muehlen and J. Recker. How much language is enough? theoretical and practical use of the business process modeling notation. In Proc. of CAiSE’08, pages 465–479, 2008.

C. Röck and S. Harrer. Literature survey of performance benchmarking approaches of BPEL engines. Technical report, Otto-Friedrich University of Bamberg, 2014.

D. H. Roller. Throughput Improvements for BPEL Engines: Implementation Techniques and Measurements applied in SWoM. PhD thesis, University of Stuttgart, 2013.


N. Russell, W. M. van der Aalst, and A. Hofstede. All that glitters is not gold: Selecting the right tool for your BPM needs. Cutter IT Journal, 20(11):31–38, 2007.

D. Schumm, D. Karastoyanova, O. Kopp, F. Leymann, M. Sonntag, and S. Strauch. Process fragment libraries for easier and faster development of process-based applications. CSSI, 2(1):39–55, 2011.

M. Skouradaki, D. Roller, C. Pautasso, and F. Leymann. BPELanon: Anonymizing BPEL processes. In Proc. of ZEUS’14, pages 9–15, 2014.

Sun Microsystems. Benchmarking BPEL service engine, 2007. http://wiki.open-esb.java.net/Wiki.jsp?page=BpelPerformance.html.

B. Wetzstein, P. Leitner, F. Rosenberg, I. Brandic, S. Dustdar, and F. Leymann. Monitoring and analyzing influential factors of business process performance. In Proc. of EDOC’09, pages 141–150, 2009.