How do I import a large database dump into a managed PostgreSQL database?
I have a 25GB file with 200 million rows of data. It's a data-only dump from a local database, and I need to import it into a Linode managed PostgreSQL database that has 160GB of storage and 8GB of RAM. It runs out of memory when I try a simple `psql < massive_file.sql`, and importing in chunks takes a very long time (I'm trying only 20 million rows, it's taken two hours so far, and I don't know whether it will complete or run out of memory).
Is there a more efficient way to do the initial database import?
1 Reply
This post on Stack Exchange, titled postgresql - Import large .sql file to Postgres, identifies that error as being related to reading the dump from stdin. You'll find instructions there for running commands on the origin DB as well as the target DB that may resolve your issue.
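If stdin is the culprit, a minimal sketch of the usual workaround looks like the following (the `$TARGET_DSN` connection string and the `source_db` name are placeholders, not values from your setup): pass the file to psql with `-f` instead of shell redirection, or re-export from the origin DB in pg_dump's custom format so pg_restore can load it with parallel workers.

```sh
# Feed the existing .sql file with -f instead of shell redirection, and wrap
# the load in one transaction to cut per-statement commit overhead.
psql --dbname="$TARGET_DSN" --single-transaction -f massive_file.sql

# Alternatively, if you can re-export from the origin DB, use the custom
# format, which pg_restore can load with several parallel jobs.
pg_dump --format=custom --data-only --file=massive.dump source_db
pg_restore --dbname="$TARGET_DSN" --data-only --jobs=4 massive.dump
```

Note that `--single-transaction` makes the whole load atomic: a failure rolls everything back instead of leaving a partial import, at the cost of one very large transaction.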
You may also want to read over these articles about Bulk Data Loading for PostgreSQL Databases.
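The common thread in that advice, as a hedged sketch rather than a prescription (the table, index, and file names below are illustrative, and the biggest win assumes your dump loads data via COPY rather than row-by-row INSERTs):

```sql
-- Session-level settings commonly suggested for bulk loads; they don't
-- require superuser access, so they work on a managed instance.
SET maintenance_work_mem = '512MB';  -- faster index builds after the load
SET synchronous_commit = off;        -- fewer WAL flushes during the load

-- Drop secondary indexes first so each row doesn't pay index maintenance.
DROP INDEX IF EXISTS big_table_value_idx;   -- illustrative index name

-- \copy streams data through the client far faster than individual INSERTs.
\copy big_table FROM 'rows.csv' WITH (FORMAT csv)

-- Rebuild indexes in one pass, then refresh planner statistics.
CREATE INDEX big_table_value_idx ON big_table (value);
ANALYZE big_table;
```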