If you are developing a Django project with a SQLite development database and want to migrate that data to your production database (either MySQL or PostgreSQL), there are many opportunities for things to go horribly wrong. Hopefully these steps will make the process a little easier.
-
First, dump the contents of the SQLite database using the dumpdata command. SQLite, PostgreSQL, and MySQL all use different flavors of SQL to create and populate tables, which makes it difficult to simply dump to a .sql file and load the data from there. To get around this, we'll use dumpdata to dump both the structure and the contents of the database to a JSON file first.
python manage.py dumpdata --natural-foreign --exclude auth.permission --exclude contenttypes --indent 4 > dump.json
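The resulting dump.json is a flat list of objects, each recording a model label, primary key, and field values. Before moving on, it can be worth sanity-checking the dump by counting records per model; a minimal sketch (the sample records here are hypothetical, and a real run would read dump.json from disk):

```python
import json
from collections import Counter

# A small sample in the format dumpdata produces; a real dump.json
# contains one object per database row.
sample = """
[
  {"model": "myapp.note", "pk": 1, "fields": {"notes": "first note"}},
  {"model": "myapp.note", "pk": 2, "fields": {"notes": "second note"}},
  {"model": "auth.user", "pk": 1, "fields": {"username": "alice"}}
]
"""

records = json.loads(sample)  # for a real dump: json.load(open("dump.json"))
counts = Counter(r["model"] for r in records)
for model, n in sorted(counts.items()):
    print(f"{model}: {n}")
```

If a model you expect is missing here, it was probably filtered out by one of the --exclude flags above.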
-
Next, change the settings.py file to connect to the Postgres database and comment out the SQLite connection.
-
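In settings.py, the change looks roughly like this. The database name, user, and password below are placeholders, not values from this project:

```python
# settings.py
DATABASES = {
    # Old SQLite connection, kept commented out for reference:
    # "default": {
    #     "ENGINE": "django.db.backends.sqlite3",
    #     "NAME": BASE_DIR / "db.sqlite3",
    # },
    "default": {
        "ENGINE": "django.db.backends.postgresql",
        "NAME": "mydatabase",      # placeholder
        "USER": "myuser",          # placeholder
        "PASSWORD": "mypassword",  # placeholder
        "HOST": "localhost",
        "PORT": "5432",
    },
}
```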
If the Postgres database is new, it likely has no structure yet. Apply the structure from the models.py file by running either python manage.py syncdb (on older Django versions) or python manage.py migrate
-
Finally, it's time to import the data from the JSON dump. We'll use Django's loaddata command for that.
python manage.py loaddata dump.json
There's a very good chance that some of the values in your SQLite database are longer than the max_length defined in models.py. For example, you may have a notes field defined like this in one of your models:
notes = models.CharField("Notes", max_length=100)
You may need to increase the max_length value, make and apply migrations, and try again. Fine-tuning field lengths will be a trial-and-error process, but look closely at the error messages generated by loaddata to identify the models and fields that need adjusting.
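You can also cut down on the trial and error by scanning the dump up front for values that exceed a field's limit. A minimal sketch: the check_lengths helper and the limits mapping are illustrative, not part of Django, and you'd fill limits in from your own models:

```python
import json

def check_lengths(records, limits):
    """Return (model, pk, field, length) for every value longer than its limit.

    limits maps "app.model.field" to the max_length declared in models.py.
    """
    problems = []
    for r in records:
        for field, value in r["fields"].items():
            key = f"{r['model']}.{field}"
            if key in limits and isinstance(value, str) and len(value) > limits[key]:
                problems.append((r["model"], r["pk"], field, len(value)))
    return problems

# Hypothetical example; in practice: records = json.load(open("dump.json"))
records = [
    {"model": "myapp.task", "pk": 7, "fields": {"notes": "x" * 150}},
    {"model": "myapp.task", "pk": 8, "fields": {"notes": "short"}},
]
limits = {"myapp.task.notes": 100}  # from models.CharField(max_length=100)

for model, pk, field, length in check_lengths(records, limits):
    limit = limits[f"{model}.{field}"]
    print(f"{model} pk={pk}: {field} is {length} chars (limit {limit})")
```

Each line it prints corresponds to a row that would make loaddata fail, so you can fix every offending field in one round of migrations.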