Load testing with thousands of concurrent requests #7332

This discussion has been migrated from our GitHub Discussion #7332


yogesh-dalvi 125d ago

Load testing with thousands of concurrent requests

When I load test the application with JMeter using 300-400 concurrent requests, every request is served, but when the number of requests is increased to >500, the application rejects most of the requests.

Steps to reproduce the behavior

  1. Download JMeter.
  2. Create a new test plan and a new Thread Group.
  3. Increase the number of threads to more than 500 in the Thread Group.
  4. Set the ramp-up period to 0.
  5. Set the loop count to 1.
  6. Test a Strapi POST request by adding Sampler > HTTP Request.
  7. Add a Listener > View Results in Table as a child of the HTTP Request.
  8. Run the test plan. Most of the requests get rejected.

For a plain RESTful Node.js application with a POST endpoint, the requests are not rejected.
Is there any way to handle the above scenario?


Responses to the discussion on GitHub


derrickmehaffy 124d ago

Collaborator

We need more information here.

Strapi version, host OS, database type, database location (local/remote), type of deployment (i.e. where it is deployed).

In this case there are too many unknown variables to say for sure.


yogesh-dalvi 124d ago

Author

Strapi version: 3.1.1
Node version: v12.18.0
npm version: 6.14.4
Host OS: Windows 10
Database type: Postgres
Database location: local
Deployment: local system
JMeter version: 5.3
Java version: "1.8.0_261" (64-bit)

I am attaching some screenshots of the JMeter setup.

PS: I made a sample RESTful Node.js application with a POST method, just to test whether the issue is specific to Strapi or affects all Node.js applications. When I call this POST API from JMeter, the scenario mentioned above (500+ concurrent requests) works fine without any errors; it breaks only with Strapi.
Below is the code for the sample Node.js application used for testing:

const express = require("express");
const app = express();
app.use(express.json());

app.post("/test", (req, res) => {
  res.send("received");
});

const port = process.env.PORT || 8080;
app.listen(port, () => console.log(`Listening on port ${port}...`));

derrickmehaffy 124d ago

Collaborator

@yogesh-dalvi Thank you for the information, I want to do a bit of testing myself in variable environments but I will poke @lauriejim and @alexandrebodin to see if they have any suggestions.

My initial thoughts are the tuning of the PG database (the database tends to be the largest bottleneck) and vertical scaling (PM2 clustering, for example) plus horizontal scaling (more deployments load-balanced by something like Nginx). In a localhost scenario you can only really test vertical scaling, but as your application grows, horizontal scaling will be much easier and provide more benefit.
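As a sketch of the vertical-scaling option mentioned above: PM2's cluster mode needs a plain Node entry file, so Strapi v3 deployments typically add a minimal `server.js` that boots Strapi, and point a PM2 ecosystem file at it. The file names and instance counts below are illustrative, not a verified production recipe.

```javascript
// server.js -- minimal Strapi v3 bootstrap so PM2 has a Node entry file:
//   const strapi = require("strapi");
//   strapi().start();

// ecosystem.config.js -- illustrative PM2 cluster configuration.
module.exports = {
  apps: [
    {
      name: "strapi",
      script: "server.js",
      exec_mode: "cluster", // fork one worker per instance, load-balanced by PM2
      instances: "max",     // one worker per CPU core
      env: {
        NODE_ENV: "production",
      },
    },
  ],
};
```

Started with `pm2 start ecosystem.config.js`, this runs one Strapi worker per core behind PM2's built-in round-robin, which is the "vertical" half of the scaling described above.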

But that doesn’t answer the application question you had, and that’s what I want to test over the next few days.


yogesh-dalvi 124d ago

Author

@derrickmehaffy Thank you so much for your response 🙂


derrickmehaffy 124d ago

Collaborator

One key difference between your testing on Strapi and your basic Node.js sample is the ORM. Strapi has to format, construct, and send the database query to save the data, and at the same time it has to wait for the response from the database transaction. If that transaction fails (timeout, error, whatever), it has to retry until it hits the failure point.

Naturally, since Strapi is Node.js based, it is natively single-threaded. In your example that isn't a problem because it sends the response and starts processing the next request with no holdback, but if you tie that sample application to a database and start storing the POSTed data, you will see that number drop sharply as the database becomes your bottleneck.

You can scale Strapi out horizontally/vertically as much as possible, but that won't remove the bottleneck of the database, which needs to be able to handle the load or scale as well. (This kind of comparison isn't entirely black and white, and there are many metrics to look at other than just requests per second.)

In your case my guess is that the limit on simultaneous database connections was hit, and while Strapi was waiting for responses from the database, your pending POST requests were rejected because they couldn't be handled. This is certainly something Strapi can and will be improving on (see: Database layer (v4) - Roadmap | Product Roadmap), but in the short term you can handle this by "beefing up" the database itself and scaling Strapi out horizontally.
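To make the connection-limit guess above concrete: in Strapi v3 the bookshelf connector forwards an `options.pool` object in `./config/database.js` down to knex, so the connection-pool ceiling can be raised there. The pool sizes below are illustrative only, and Postgres's own `max_connections` setting must be at least as large as the pool maximum.

```javascript
// ./config/database.js (Strapi v3 shape) -- pool sizes are illustrative.
module.exports = ({ env }) => ({
  defaultConnection: "default",
  connections: {
    default: {
      connector: "bookshelf",
      settings: {
        client: "postgres",
        host: env("DATABASE_HOST", "127.0.0.1"),
        port: env.int("DATABASE_PORT", 5432),
        database: env("DATABASE_NAME", "strapi"),
        username: env("DATABASE_USERNAME", "strapi"),
        password: env("DATABASE_PASSWORD", ""),
      },
      options: {
        // Forwarded to knex's pool; keep max below Postgres max_connections.
        pool: { min: 2, max: 50 },
      },
    },
  },
});
```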


yogesh-dalvi 124d ago

Author

Thanks @derrickmehaffy for such a quick response.
The POST API's controller function currently looks something like this:

async test(ctx) { return "success"; },

No database calls are being made in the controller as of now, since I wanted to test it first before proceeding.
Will the database scenario you mentioned still apply to my use case?


derrickmehaffy 124d ago

Collaborator

There are still database queries happening. If you edit the ./config/database.js file, set debug: true in the options, and start making requests, you will see at least one query, probably two, from the users-permissions plugin, which runs on every single request to validate that the user (or the public, non-authenticated user) has permission to access the requested controller.
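For reference, the debug flag described above lives under `options` in the same file (Strapi v3 shape; the settings block is abbreviated here for brevity):

```javascript
// ./config/database.js -- enable query logging for the default connection.
module.exports = ({ env }) => ({
  defaultConnection: "default",
  connections: {
    default: {
      connector: "bookshelf",
      settings: { /* client, host, credentials, ... as before */ },
      options: {
        debug: true, // knex logs every SQL query Strapi issues
      },
    },
  },
});
```

With this enabled, even the bare `return "success"` controller will show the permissions queries on every request.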


yogesh-dalvi 124d ago

Author

Oh, I had completely forgotten about that. Thanks for the quick replies @derrickmehaffy


derrickmehaffy 124d ago

Collaborator

No problem 🙂