Exporting all rows of a 300,000-row collection crashes the server

System Information
  • Strapi Version: v3.5.2
  • Operating System: Windows
  • Database: MySQL
  • Node Version: 12.9.0
  • NPM Version: 6.14.8
  • Yarn Version: 1.22.5

I need to create a CSV file of all the rows in a collection. This is my current code for fetching all the data, which I then use to write the CSV file. However, one collection has over 300,000 rows, and fetching it all at once crashes the server.

 const data = await strapi.query(model).find({ _limit: -1 });

Response logs

  Error: write ECONNRESET
      at afterWriteDispatched (internal/stream_base_commons.js:156:25)
      at writevGeneric (internal/stream_base_commons.js:139:3)
      at Socket._writeGeneric (net.js:786:11)
      at Socket._writev (net.js:795:8)
      at doWrite (_stream_writable.js:401:12)
      at clearBuffer (_stream_writable.js:519:5)
      at Socket.Writable.uncork (_stream_writable.js:338:7)
      at ServerResponse.OutgoingMessage.uncork (_http_outgoing.js:248:17)
      at connectionCorkNT (_http_outgoing.js:698:8)
      at processTicksAndRejections (internal/process/task_queues.js:84:21)

Let me know the best way to handle this situation. Thanks in advance

@dhruv Export them in smaller chunks.

Something like:

const rowsToExport = await strapi.query(model).count();

let fetchedItems = 0;

while (fetchedItems < rowsToExport) {
  // _start skips the rows that were already fetched.
  const data = await strapi.query(model).find({ _limit: 1000, _start: fetchedItems });

  /*
   * write data to the CSV file here
   */
  fetchedItems += data.length;
  strapi.log.info(`Fetched: ${fetchedItems} / ${rowsToExport}.`);
}
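To fill in the "write data to csv" step without holding everything in memory, each fetched page can be written to a file stream as it arrives. Here is a minimal sketch: `exportToCsv` and `fetchPage` are hypothetical names (not Strapi APIs), and the `fetchPage` callback is a placeholder for the real `strapi.query(model).find({ _start, _limit })` call.

```javascript
/*
 * Fetch rows page by page and stream each page into a writable, so the full
 * result set never sits in memory at once.
 *
 * out       — any writable with write()/end(), e.g. fs.createWriteStream(...)
 * fetchPage — async (start, limit) => rows; wrap your Strapi query here
 * total     — row count, e.g. from strapi.query(model).count()
 */
async function exportToCsv(out, fetchPage, total, chunkSize = 1000) {
  let fetched = 0;
  while (fetched < total) {
    const rows = await fetchPage(fetched, chunkSize);
    if (rows.length === 0) break; // stop early if the table shrank mid-export
    for (const row of rows) {
      // Naive serialization with no quoting/escaping; use a CSV library
      // (e.g. csv-stringify) if values can contain commas or newlines.
      out.write(Object.values(row).join(',') + '\n');
    }
    fetched += rows.length;
  }
  out.end();
}

// Usage sketch (names assumed):
// const out = require('fs').createWriteStream('export.csv');
// await exportToCsv(out, (start, limit) =>
//   strapi.query(model).find({ _start: start, _limit: limit }), rowsToExport);
```

Streaming to the file inside the loop keeps peak memory at roughly one chunk of rows, which is what avoids the crash seen with `_limit: -1`.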