Uploading a huge batch of data

I have a very large JSON file containing over a million lines of English/Chinese translations that I want to upload to the Strapi backend.

I created a content type called ‘Dictionary’ and, in ‘src/api/dictionary/services/dictionary.js’, wrote a custom service (following the advice from @sunnyson in Can import data by JSON or CSV?):

module.exports = createCoreService('api::dictionary.dictionary', ({ strapi }) => {
    console.log('Service called...')
    const { translations } = require('all_cedict.json')
    for (const translation in translations) {
        strapi.services.dictionary.create({
            simplified: translations[translation].simplified,
            traditional: translations[translation].traditional,
            pinyin: translations[translation].pinyin,
            definitions: translations[translation].definitions
        })
    }
});

‘Service called’ isn’t being logged to the console, and the data doesn’t seem to have been created.

What am I missing here?
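For context, what I’m ultimately trying to do is a one-off import at startup. My understanding from the Strapi v4 docs is that it would look roughly like the sketch below, using the bootstrap hook in ‘src/index.js’ and strapi.entityService (the JSON path and file layout are just placeholders, not my actual setup):

// src/index.js (Strapi v4) - rough sketch, not verbatim from my project
'use strict';

module.exports = {
  register() {},

  async bootstrap({ strapi }) {
    // Placeholder path: adjust to wherever the JSON file actually lives.
    const { translations } = require('../data/all_cedict.json');

    for (const key in translations) {
      const entry = translations[key];
      // entityService.create expects the fields wrapped in a `data` object.
      await strapi.entityService.create('api::dictionary.dictionary', {
        data: {
          simplified: entry.simplified,
          traditional: entry.traditional,
          pinyin: entry.pinyin,
          definitions: entry.definitions,
        },
      });
    }
  },
};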

I’m having the same problem. The only difference is that I’m calling strapi.service('api::publisher.publisher').create({})

If I extend create, I can see it being called, but only if I leave super.create commented out:
'use strict';
const { create } = require('lodash');
const { createCoreService } = require('@strapi/strapi').factories;

module.exports = createCoreService('api::publisher.publisher', ({ strapi }) => ({
  async create(params) {
    console.log("service create publisher")
    console.log(params)
    // await super.create(params)
  }
}));
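In case it helps, my reading of the Strapi v4 docs is that the extended create has to forward to super.create and return its result, and that the caller passes the fields wrapped in a data object. A rough sketch of that shape (not tested against my project):

'use strict';
const { createCoreService } = require('@strapi/strapi').factories;

module.exports = createCoreService('api::publisher.publisher', ({ strapi }) => ({
  async create(params) {
    console.log('service create publisher', params);
    // Delegate to the core service so the entry is actually persisted,
    // and hand its result back to the caller.
    return await super.create(params);
  },
}));

So the call site would look like strapi.service('api::publisher.publisher').create({ data: { /* fields */ } }); with super.create left commented out and nothing returned, the method can only log.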