Summary

This article describes the methodology and results of performance testing of the Ed-Fi ODS / API v5.2.

In brief, performance testing did not uncover any significant performance concerns relative to previous Suite 3 versions.


Test Methodology

Volume Tests

Volume testing of v5.2 occurred in July 2021, using the Locust-based Suite 3 performance testing framework (an Ed-Fi Exchange contribution available on GitHub). The volume test covers the resources and HTTP verbs described in the Ed-Fi SIS vendor certification process. It runs for 30 minutes, spawning 30 clients that operate simultaneously to perform tens of thousands of operations.
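The actual framework is Locust-based; purely as an illustration of the pattern it uses (a fixed number of concurrent clients issuing requests against an endpoint for a fixed duration while recording response times), here is a self-contained Python sketch. It runs against a local stub server rather than a real ODS / API instance, and the client count and duration are scaled far down from the real run's 30 clients over 30 minutes.

```python
import statistics
import threading
import time
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer
from urllib.request import urlopen

# Stand-in for the ODS / API so the sketch is self-contained; the real
# suite drives actual API endpoints such as /data/v3/ed-fi/students.
class StubHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"ok")

    def log_message(self, *args):  # keep the console quiet
        pass

server = ThreadingHTTPServer(("127.0.0.1", 0), StubHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

NUM_CLIENTS = 5   # the real run spawned 30 clients
RUN_SECONDS = 1   # the real run lasted 30 minutes

response_ms = []
lock = threading.Lock()

def client():
    """Issue requests in a loop until the deadline, recording response times."""
    deadline = time.monotonic() + RUN_SECONDS
    while time.monotonic() < deadline:
        start = time.monotonic()
        urlopen(f"http://127.0.0.1:{port}/data/v3/ed-fi/students").read()
        with lock:
            response_ms.append((time.monotonic() - start) * 1000)

clients = [threading.Thread(target=client) for _ in range(NUM_CLIENTS)]
for c in clients:
    c.start()
for c in clients:
    c.join()
server.shutdown()

print(f"{len(response_ms)} requests, "
      f"mean {statistics.mean(response_ms):.1f} ms, "
      f"max {max(response_ms):.1f} ms")
```

The mean and max printed at the end correspond to the summary statistics reported in the results tables below.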

Data Set

The Populated Template data set, which contains approximately 1,000 students.

Azure Test Lab Virtual Machines

The test lab environment used a three-server setup: one each for the database, web applications, and the Locust performance testing. VM "Sizes" listed here, such as "DS11_v2", are Microsoft-defined names for the specs of an Azure VM. Key specs are listed beside these size names. These sizes were chosen as their specs are comparable to those of the Vendor Certification VMs but have SSD disks to more closely match a production environment.

Database VM

Image: Free SQL Server License: SQL Server 2017 Developer on Windows Server 2016
Size: DS11_v2 (Standard, 2 vCPUs, 14 GB RAM, 6400 max IOPS, 127 GB local SSD)

Web VM

Image: Windows Server 2016 Datacenter
Size: B2ms (Standard, 2 vCPUs, 8 GB RAM, 4800 max IOPS, 127 GB local SSD)

Test Runner VM

Image: Windows Server 2016 Datacenter
Size: B2ms (Standard, 2 vCPUs, 8 GB RAM, 4800 max IOPS, 127 GB local SSD)

Test Results

Experimentation Summary

These tests used the out-of-the-box installations of ODS / API v3.3.0, v5.0.1, and v5.2.

| Version | Execution Date | # of Requests | Mean Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- |
| 3.3.0 |  | 64,687 | 58 | 9797 |
| 5.0.1 |  | 63,943 | 65 | 9848 |
| 5.2 (Change Queries Enabled) |  | 63,814 | 73 | 9885 |

Key Take-Aways

  • No two executions of the same code and configuration will produce exactly the same mean response time; there is a degree of randomness in the Locust-based clients. Thus the difference between 58 ms, 65 ms, and 73 ms is not significant.
  • A more thorough analysis with the larger Northridge data set could reveal more detail.
  • A higher max response time was recorded during this test run on all versions of the API tested. Further analysis of the volume distribution shows that 99% of requests had response times similar to previous runs, and behavior is consistent across releases. This suggests the max response time outliers are likely due to environmental changes over time.
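The outlier point above can be illustrated numerically: a single slow request inflates the maximum dramatically while leaving the 99th percentile untouched and moving the mean only modestly. The sample below is hypothetical, loosely shaped like the distributions in this report.

```python
import statistics

# Hypothetical sample: 999 typical responses plus a single slow outlier.
typical = [60] * 900 + [130] * 90 + [230] * 9
with_outlier = typical + [9800]

print("mean, typical only :", round(statistics.mean(typical), 1))       # 67.8
print("mean, with outlier :", round(statistics.mean(with_outlier), 1))  # 77.6
print("99th pct (nearest rank):", sorted(with_outlier)[989])            # 130
print("max, with outlier  :", max(with_outlier))                        # 9800
```

This is why the report relies on the percentile distribution rather than the maximum alone when comparing releases.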

The table below shows the volume distribution for selected resources across API versions. The 50% column represents the time (in ms) under which 50% of the requests were completed.

| Version | Method | # requests | 50% | 66% | 75% | 80% | 90% | 95% | 98% | 99% | 100% |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| 3.3.0 | POST /data/v3/ed-fi/parents | 535 | 62 | 78 | 78 | 93 | 110 | 130 | 170 | 230 | 9700 |
| 3.3.0 | POST /data/v3/ed-fi/staffs | 2051 | 78 | 93 | 93 | 110 | 130 | 160 | 220 | 550 | 9000 |
| 3.3.0 | POST /data/v3/ed-fi/students | 4729 | 46 | 62 | 62 | 78 | 93 | 110 | 170 | 1800 | 9700 |
| 5.0.1 | POST /data/v3/ed-fi/parents | 547 | 62 | 78 | 78 | 93 | 110 | 130 | 190 | 230 | 9000 |
| 5.0.1 | POST /data/v3/ed-fi/staffs | 2078 | 78 | 93 | 93 | 110 | 130 | 160 | 200 | 270 | 9700 |
| 5.0.1 | POST /data/v3/ed-fi/students | 4809 | 46 | 62 | 62 | 63 | 93 | 110 | 160 | 230 | 9800 |
| 5.2 | POST /data/v3/ed-fi/parents | 557 | 62 | 62 | 78 | 78 | 110 | 130 | 190 | 230 | 8300 |
| 5.2 | POST /data/v3/ed-fi/staffs | 2061 | 62 | 78 | 78 | 93 | 120 | 160 | 270 | 530 | 9300 |
| 5.2 | POST /data/v3/ed-fi/students | 5599 | 46 | 62 | 62 | 63 | 78 | 94 | 130 | 1060 | 9800 |
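Each percentile column above can be read as "the response time under which that share of requests completed." A minimal nearest-rank sketch of how such a column is computed follows; the sample data is hypothetical, and Locust's internal aggregation may differ in detail.

```python
import math

def percentile(samples, p):
    """Nearest-rank percentile: smallest sample value with at least
    p% of all values at or below it."""
    ordered = sorted(samples)
    rank = max(1, math.ceil(p / 100 * len(ordered)))
    return ordered[rank - 1]

# Hypothetical response times (ms) for one resource; real runs
# aggregate hundreds to thousands of samples per resource.
times = [46, 50, 55, 60, 62, 63, 70, 78, 94, 9800]
for p in (50, 90, 95, 99, 100):
    print(f"{p:>3}% -> {percentile(times, p)} ms")
```

Note how, with only ten samples, the single 9800 ms outlier dominates every percentile from 95% upward, which mirrors the max-response-time observation in the take-aways above.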

Server Stats

Overall, the web and database server statistics do not show any serious concerns.

Web Server Stats

[Figures: web server statistics]

Database Server Stats

[Figures: database server statistics]