Import production PostgreSQL from Heroku into a new local setup using Docker

Hi,

I have a Strapi application running in production on Heroku with PostgreSQL. I need to generate a dump file from it and configure a new local Strapi instance using Docker and this dump file.

Is there a tutorial for doing that? I can only find tutorials for installing Strapi with Docker.

Thanks!

Hi @LucasBassetti, I would say that what you can do for this is to follow the docs here

This should give you the ability to dump it to your local machine. I'm currently on a Mac, so I personally use something like TablePlus to dump and import again on the local machine. If that's not your flavour, there are plenty of free tools like pgAdmin.
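For reference, the Heroku side can look something like this. A minimal sketch, assuming the Heroku CLI is installed and logged in; my-strapi-app is a placeholder for your app name:

heroku pg:backups:capture --app my-strapi-app    # create a fresh backup on Heroku
heroku pg:backups:download --app my-strapi-app   # downloads it locally as latest.dump

The resulting latest.dump is what you'll restore into the local database further down.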

Here is how I set up and run my server locally using Yarn and PostgreSQL.

Dockerfile

FROM node:14-alpine
# Build-time switch: defaults to production, override for development builds
ARG NODE_ENV=production
ENV NODE_ENV=${NODE_ENV}
WORKDIR /opt/
COPY ./package.json ./
COPY ./yarn.lock ./
ENV PATH /opt/node_modules/.bin:$PATH
# A longer network timeout avoids install failures on slow connections
RUN yarn config set network-timeout 600000 -g
RUN yarn install
WORKDIR /opt/app
COPY ./ .
RUN yarn build
EXPOSE 1337
CMD ["yarn", "start"]

docker-compose.yml

version: '3'
services:
  strapi:
    container_name: strapi
    build:
      context: .
      args:
        NODE_ENV: ${NODE_ENV}
      dockerfile: Dockerfile
    image: XXXstrapi:latest
    restart: unless-stopped
    env_file: .env
    environment:
      DATABASE_CLIENT: ${DATABASE_CLIENT}
      DATABASE_NAME: ${DATABASE_NAME}
      DATABASE_HOST: strapiDB
      DATABASE_PORT: ${DATABASE_PORT}
      DATABASE_USERNAME: ${DATABASE_USERNAME}
      DATABASE_PASSWORD: ${DATABASE_PASSWORD}
      JWT_SECRET: ${JWT_SECRET}
      NODE_ENV: ${NODE_ENV}
    volumes:
      # mount the project into the image's working directory (see WORKDIR /opt/app in the Dockerfile)
      - ./:/opt/app/
      - ./uploads:/opt/app/public/uploads
    ports:
      - '1337:1337'
    networks:
      - strapi-app-network
    depends_on:
      - strapiDB

  strapiDB:
    image: postgres:12.0-alpine
    container_name: strapiDB
    restart: unless-stopped
    env_file: .env
    environment:
      POSTGRES_USER: ${DATABASE_USERNAME}
      POSTGRES_PASSWORD: ${DATABASE_PASSWORD}
    volumes:
      - ./data:/var/lib/postgresql/data
    ports:
      - '5432:5432'
    networks:
      - strapi-app-network

networks:
  strapi-app-network:
    name: Strapi
    driver: bridge

.env

DATABASE_CLIENT=postgres
DATABASE_NAME=strapi
# localhost is for running yarn develop on the host; the strapi container overrides this with strapiDB
DATABASE_HOST=localhost
PORT=1337
DATABASE_PORT=5432
DATABASE_USERNAME=strapi
DATABASE_PASSWORD=INSERT DB PASSWORD
JWT_SECRET=INSERT JWT SECRET
NODE_ENV=development

To use it I do the following:

  1. Install Strapi globally, using npm or Yarn (I personally use Yarn); see the sketch after this list.
  2. In your project, create the .env, Dockerfile, and docker-compose.yml files using the above.
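For step 1, a quick sketch (my-project is just an example name); choosing a custom installation lets you select PostgreSQL when prompted for the database:

yarn create strapi-app my-project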

You can then use docker-compose up -d strapiDB to run PostgreSQL in the background.
If you want to run Strapi in production mode you can also do docker-compose up -d --build; this will start both services and build the Strapi image. Note that you need to set NODE_ENV accordingly here, but if it is omitted on a server it will fall back to production.
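Once strapiDB is up, you can restore the dump you downloaded from Heroku into it. A sketch, assuming latest.dump from the earlier step and the credentials from .env; pg_restore ships inside the postgres image, so nothing extra is needed on the host:

docker cp latest.dump strapiDB:/tmp/latest.dump     # copy the dump into the container
docker exec -it strapiDB pg_restore --clean --no-owner --no-acl \
  -U strapi -d strapi /tmp/latest.dump              # restore into the strapi database

--no-owner and --no-acl skip the Heroku-specific roles and permissions that don't exist locally.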

When running in development, I just do yarn develop in the project to start the Strapi development server.

Now, following the guides, you should be able to connect to your locally running PostgreSQL server and import the data. If the data was not created by Strapi then I don't think it will be part of the REST responses etc., so it might be worth exporting it as CSV and then using import tools to map it to the correct data values.
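If you do go the CSV route, psql's \copy can load a file into an existing table from the host, since port 5432 is published. A hypothetical example (the articles table and its columns are made up; adjust to your schema, and you'll be prompted for the password from .env):

psql -h localhost -p 5432 -U strapi -d strapi \
  -c "\copy articles(title, body) FROM 'articles.csv' WITH (FORMAT csv, HEADER true)"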

Hope it helps :+1: