I would like to know how you handle large image uploads. I have a client who wants to upload around 20 images (each around 1.5 MB to 2 MB), and unfortunately my app crashes on Heroku every time he uploads these large images… I use the @strapi/provider-upload-cloudinary provider and the app is hosted on Heroku.
Everything worked perfectly when I resized and compressed the images before uploading… but my client doesn't want to spend time on that kind of manual handling. Do you have any suggestions to help me figure out this issue? I tried upgrading to Standard 2X to get 1 GB of RAM just for testing, but got the same problem.
- **Strapi Version:** 4.1.12
- **Database:** Postgres ^8.7.3
- **Node Version:** >=12.x.x <=16.x.x
I have the same issue as well. Which image provider are you using? I'm using ImageKit: strapi-provider-upload-imagekit - npm
I'm wondering if it's related to the warning I'm given:
Warning: The upload provider "strapi-provider-upload-imagekit" doesn't implement the uploadStream function. Strapi will fallback on the upload method. Some performance issues may occur.
Thank you for your reply, Shaun.
As mentioned in my first post, I am using the @strapi/provider-upload-cloudinary provider. I also tested @strapi/provider-upload-aws-s3, without any success when uploading around 20 large images.
Did you test with another provider?
It's so sad; I think there is something wrong with the Strapi uploader when handling bulk image uploads…
Yeah, it sucks. In my case I'm only uploading a maximum of 4 images at a time and the same thing is happening. I'm using ImageKit and having the same issue: strapi-provider-upload-imagekit - npm. I also tried forking the provider library and adding the uploadStream function, which removed the warning above, but unfortunately it didn't help with the memory leak issue.
I just spent the whole day trying to solve this memory leak issue, and the only thing that worked for me in the end was following these steps: sharp - High performance Node.js image processing
So far it looks good; I was finally able to upload multiple larger images without running out of memory!
Hope this helps!
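For anyone wondering what those sharp-related steps amount to: the sharp documentation attributes this kind of unbounded memory growth on Linux to fragmentation in glibc's default allocator, and suggests either preloading jemalloc or reducing sharp's internal caching and concurrency. A minimal sketch of the latter, placed in a Strapi v4 `src/index.js` bootstrap (this is my reading of the fix, not an official Strapi recipe — sharp is already a dependency of Strapi's upload plugin, so requiring it directly should resolve):

```javascript
// src/index.js — hedged sketch: limit sharp's memory appetite globally.
// Strapi's upload plugin runs every uploaded image through sharp, so
// these settings apply to Media Library uploads too.
const sharp = require('sharp');

module.exports = {
  register() {},
  bootstrap() {
    sharp.cache(false);   // disable sharp's internal operation/file cache
    sharp.concurrency(1); // process one image at a time instead of one per CPU
  },
};
```

On glibc-based hosts (Heroku, Render, most Docker base images) the sharp docs present switching the allocator to jemalloc (e.g. via `LD_PRELOAD`) as the more thorough fix; the settings above just trade throughput for a lower, more stable memory ceiling.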
I’ve also encountered an apparent memory leak when uploading images.
In my case I'm deploying on Render using SQLite with uploads persisted to a disk. In the Strapi Media Library settings, I have "Responsive friendly upload" and "Size optimization" set to false.
I can upload 10 small images totaling about 324KB with no problem. But if I try uploading a batch of images that total about 25MB, the Strapi instance crashes.
(My project is on the starter plan with a 512MB memory limit.)
After crashing, Strapi will auto-restart. When it does, it uses about 146 MB of memory. If I then upload just one image of about 5 MB, memory usage jumps to about 290 MB. Even checking its metrics an hour later, usage was still holding at 290 MB.
I looked at the link bncngy provided. There wasn't anything specific to a Render deploy. I did try adding a webpack.config.js file configured to exclude sharp, but it didn't resolve the memory leak for my deployment. (I think bncngy was doing a Heroku deploy, which has some additional options for resolving this.)
This sounds related to issue #14417.
Interested in any solutions or ideas anyone has for this issue.
I have the same issue. I’m deploying Strapi to Render on the free plan (512MB memory) with a Postgres database, and the @strapi/provider-upload-cloudinary package (^4.4.7).
I can’t see anything useful in the logs. Does anyone know if there’s a way to turn on more verbose logging in production? (Would that even be wise?)
I'm experiencing the same issue and I can't find a solution in Strapi. How were you able to implement sharp? I'm currently using the DigitalOcean app installer; however, I can't get it configured correctly.