Performance Parameters - PUG Challenge Americas

Performance testing of Progress AppServers and a plug-in for JMeter
Syed Irfan Pasha
Progress Software
Functionality vs. Performance?
(Functionality: existing features, new features, user interface)
Agenda
• Performance testing of AppServers
• OpenEdge plug-in for JMeter
• Demo
Process followed for OpenEdge AppServer performance testing
• Process – the process followed by PSC
• Scenarios – different types of tests and their significance
• Measurement – statistics we get out of our performance tests
• Beyond performance – siblings of performance such as load and endurance testing
Emphasizing the process more than the numbers
Components involved and factors measured
• Clients – OpenEdge ABL client, JMeter for SOAP and REST, Java OpenClient
• AppServers – Classic and PAS for OE
• Operating modes – session-managed and session-free
• Transports – Direct/NameServer, AIA/APSV, REST & SOAP
• Performance parameters – client execution time, agent execution time, throughput, resource utilization, network I/O
Types of tests followed
(Diagram: OpenEdge clients – including the OpenEdge ABL client – call the OpenEdge AppServer, which performs CRUD operations against the OpenEdge DB)
• ATM tests – not the real one
• CRUD operation – database transactions with datasets
• Dataset transfer to and fro
Types of tests – ATM
• Adds, deletes and modifies accounts.
• Measures the number of transactions completed and the average seconds for each transaction.
• Transaction times are measured at the AppServer agent, not at the client.
• Measure: transactions per second (TPS)
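As a rough illustration of how this agent-side measure is derived, here is a minimal sketch with made-up numbers (they are placeholders, not figures from these tests):

public class TpsMeasure {
    public static void main(String[] args) {
        long completedTransactions = 12_000;   // counted over the whole run
        double elapsedSeconds = 300.0;         // timed at the AppServer agent, not the client
        double tps = completedTransactions / elapsedSeconds;
        // Average seconds per transaction for a single serial stream; concurrency changes this
        double avgSecondsPerTransaction = elapsedSeconds / completedTransactions;
        System.out.printf("TPS = %.1f, average seconds per transaction = %.4f%n",
                tps, avgSecondsPerTransaction);
    }
}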
Types of tests – CRUD operation
• Huge datasets performing CRUD operations
• Runs for all transports (Direct/NameServer, AIA/APSV, REST and SOAP)
• Measures execution time at the client and resource utilization of all servers
• Measure: average client execution time in milliseconds
Types of tests – Dataset transfer
• Huge payloads of datasets transferred between client and AppServer, to and fro
• Measures client execution time and resource utilization
• Measure: average client execution time in milliseconds
Types of tests – Summary
• All tests run for a single client and for multiple concurrent clients
• Each test has its own significance
• Combined, they give us different performance statistics and let us perform various kinds of performance testing
More than performance tests
• Profiling – web server & agent
• Compression – with various payloads/networks
• Load testing – load by payload/clients
• Endurance testing – running for more than a week
• Garbage collection – running with different collectors
Performance Results - Good
CRUD – PAS for OE response times (% change), PASOE-APSV vs Classic-Direct:
• Create – initial runs: 52% slower than Classic; 11.5.1: 21% faster
• Update – initial runs: 30% slower; 11.5.1: 10% faster
• Read – initial runs: 25% slower; 11.5.1: 48% faster
• Delete – initial runs: 22% slower; 11.5.1: 28% faster
ATM tests: the multi-session agent is 8x faster than the Classic agent in response time (this does not apply to the client). PAS for OE runs one agent with n sessions, where Classic runs n agents.
Performance Results – Bad & Ugly
• Dataset transfer to & fro – response time in milliseconds: APSV 1281, AIA 1263, Classic Direct 1029 (roughly a 22% gap vs Classic)
• CRUD over the SOAP transport (Create, Update, Read, Delete) – 2x PASOE = Classic; PASOE-SOAP in 11.5.1 is 10% faster than 11.5
• Working on the HTTP transport to boost performance
OpenEdge AppServer performance tool
 Expectations from an OpenEdge AppServer performance tool
• A performance tool/utility that makes it seamlessly easy to run ABL business logic over an OE transport
• Should be very easy to parse/validate the request or response
• Parameterization and concurrent-client execution capabilities
Above all: accurate calculations, integration, platform independence, concurrency and a robust framework.
Considering an open-source tool and enhancing it
Options:
• Develop everything from scratch
• Define a standard
• Take an open-source tool and extend it, making it stable
Goals: a robust framework, accurate stats, concurrent clients, integration, platform independence, and establishing confidence that it can be used for performance testing.
Enhancements to be added to use JMeter: an OE adapter in JMeter, OE I/O variable parsing, JMeter-OE serialization, parameterization for OE, a GUI for OE, and an OE response parser.
JMeter at a glance
 Apache JMeter
• An Apache (originally Jakarta) project for performance/load testing, initially of Java applications
• Later added support for various services: JDBC, JMS, FTP, LDAP, TCP, HTTP, SOAP & REST
• Advanced support for monitoring resources and distributing client load remotely
• Gives all kinds of statistics and reports
• Many functions & scripting options
• Features like assertions, post-processors and pre-processors
A short tour of JMeter
Considering JMeter for the framework
Reliable
• Proven performance-testing tool
• Mostly used for API testing
OpenClient in JMeter
• Calls go through the Java OpenClient to the AppServer
• Sampler designed for OpenEdge needs
Easy to use
• Tool runs without an OE install
• Works on all platforms
(Diagram: Apache JMeter + OpenEdge plug-in → OpenEdge Application Server)
Implementing a customized basic Java Sampler
A custom Java Sampler developed by extending JMeter's AbstractJavaSamplerClient class.
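Below is a minimal sketch of such a sampler, assuming a hypothetical OpenEdgeClient wrapper in place of the real Java OpenClient proxy call; the parameter names, URL and procedure name are illustrative, not the plug-in's actual API:

import org.apache.jmeter.config.Arguments;
import org.apache.jmeter.protocol.java.sampler.AbstractJavaSamplerClient;
import org.apache.jmeter.protocol.java.sampler.JavaSamplerContext;
import org.apache.jmeter.samplers.SampleResult;

public class OpenEdgeJavaSampler extends AbstractJavaSamplerClient {

    // Parameters that show up in the JMeter GUI for this sampler
    @Override
    public Arguments getDefaultParameters() {
        Arguments args = new Arguments();
        args.addArgument("appserverUrl", "http://localhost:8810/apsv"); // assumed URL
        args.addArgument("procedure", "createCustomers.p");             // assumed name
        args.addArgument("requestFile", "input/create-customers.json"); // whole input in one file
        return args;
    }

    @Override
    public SampleResult runTest(JavaSamplerContext ctx) {
        SampleResult result = new SampleResult();
        result.setSampleLabel("OpenEdge " + ctx.getParameter("procedure"));
        result.sampleStart();                          // start the response-time clock
        try {
            String response = OpenEdgeClient.runProcedure(
                    ctx.getParameter("appserverUrl"),
                    ctx.getParameter("procedure"),
                    ctx.getParameter("requestFile"));
            result.setResponseData(response, "UTF-8");
            result.setSuccessful(true);
        } catch (Exception e) {
            result.setSuccessful(false);
            result.setResponseMessage(e.toString());
        } finally {
            result.sampleEnd();                        // stop the clock even on failure
        }
        return result;
    }

    /** Hypothetical stand-in for the Java OpenClient proxy call to the AppServer. */
    static class OpenEdgeClient {
        static String runProcedure(String url, String proc, String requestFile) {
            return "{\"ds-1\":{\"tt-1\":[]}}";         // real code would call the AppServer here
        }
    }
}

In JMeter this class is executed through a Java Request sampler; the deck's actual plug-in adds its own GUI, input-file parsing and response handling on top of this basic pattern.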
Customized Java Sampler and the use case
 We were expecting more from this Sampler:
1. A new GUI interface with different functionality
2. A dynamic class reader to parse the input/output variables
3. A dynamic input-request file parser for complex datatypes
4. Storing the entire input in one file and passing it to the sampler in the same format
Altogether, we wanted a new Sampler.
The framework in a bigger picture
(Diagram) JMeter samplers (HTTP, SOAP) and the OE-specific pieces – GUI interface for OE, sampler processor, OE client, OE client response builder, OpenEdge response parser with a customized JSON parser – all driving the OpenEdge AppServer.
Testing an OpenEdge procedure
• Create customers & update customers
• Array of temp-tables – "n" elements
Testing an OpenEdge procedure – scenarios
Executing an OpenEdge procedure using the "OpenEdge Plug-in for Jmeter":
• Execute the dataset procedure
• Validate the output/response
• Stop if the validation fails
• Pass the output of the create call to the update operation (see the sketch after this list)
• Parameterize tt-1 attributes for "n" elements, running with "m" concurrent users for "iter" iterations
• Parse/get a specific element's attribute value
• Measure the performance of every AppServer call, with all statistics
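A minimal sketch of the validate-and-chain step, assuming the AppServer's JSON response carries dataset ds-1 with temp-table tt-1; the exact serialization layout is an assumption, not the plug-in's actual format:

import org.json.JSONArray;
import org.json.JSONObject;

public class CreateToUpdateChain {

    /** Validates the create response and pulls out the custnum of element i. */
    static int extractCustnum(String createResponse, int i) {
        JSONObject ds = new JSONObject(createResponse).getJSONObject("ds-1");
        JSONArray tt = ds.getJSONArray("tt-1");
        if (tt.length() == 0) {
            // "Stop if the validation fails" – abort the scenario here
            throw new IllegalStateException("create returned an empty tt-1");
        }
        return tt.getJSONObject(i).getInt("custnum");
    }

    public static void main(String[] args) {
        String createResponse =
            "{\"ds-1\":{\"tt-1\":[{\"name\":\"Acme\",\"custnum\":1001}]}}";
        // The extracted key becomes an input parameter of the update call
        System.out.println("update target custnum = " + extractCustnum(createResponse, 0));
    }
}

Inside JMeter, the same chaining can be done by the plug-in's response parser or a post-processor that stores the value in a JMeter variable for the next sampler.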
Testing an OpenEdge procedure – scenarios (contd.)
• Parameterize tt-1 attributes for "n" elements, running with "m" concurrent users for "iter" iterations
Request payload: dataset ds-1 containing temp-table tt-1[0] … tt-1[n], each element carrying name and custnum attributes.
Users User-1 … User-m execute concurrently; each user repeats the call for "iter" iterations:

DO WHILE cnt <= iter:
    /* concurrent users execution */
    ...
    ASSIGN cnt = cnt + 1.
END.

A sketch of building the parameterized payload follows.
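This is a minimal sketch of generating that parameterized payload, assuming a JSON serialization of dataset ds-1; in JMeter the "m" concurrent users and "iter" iterations come from the Thread Group settings, and the value scheme below is only an example:

import org.json.JSONArray;
import org.json.JSONObject;

public class PayloadBuilder {

    /** Builds ds-1 with n tt-1 elements, varied per user and iteration. */
    static String buildRequest(int n, int user, int iteration) {
        JSONArray tt1 = new JSONArray();
        for (int i = 0; i < n; i++) {
            JSONObject row = new JSONObject();
            row.put("name", "cust-u" + user + "-i" + iteration + "-" + i);
            row.put("custnum", user * 1_000_000 + iteration * 1_000 + i); // unique per user/iteration
            tt1.put(row);
        }
        return new JSONObject().put("ds-1", new JSONObject().put("tt-1", tt1)).toString();
    }

    public static void main(String[] args) {
        System.out.println(buildRequest(3, 2, 5)); // n = 3 elements, user 2, iteration 5
    }
}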
Time for a Demo
Code, Time & Effort
Lines of code: about 4500 in total – roughly 2000 for the OpenEdge client and 2500 for the JMeter samplers.
Effort: 2½ engineers for 5 days and 1½ engineers for 30 days, across:
• Developing the base framework
• Making the framework resilient
• Adding more features as things were getting complex
Just ask! I am here to answer now and even later
[email protected]