I've been playing with Django lately. Things have been going well enough for me to decide I want to deploy one of the applications. After reading a horror story or two about self-hosting, I decide to go with Heroku. But Heroku uses PostgreSQL, and I'd been following the path of least resistance through tutorials and ended up with SQLite.
But that's OK; it looks like migrating from one RDBMS to another is easy with Django's dumpdata and loaddata management commands. And there are tutorials that make it as easy as 1-2-3-4!
Then there are the hiccups.
1. Getting psycopg2 to work
The Heroku toolbelt depends on psycopg2, a Python-PostgreSQL adapter. Fair enough. But following the recommended install procedure resulted in lots of... nothing. Pip hung forever on the psycopg2 install: no errors, no output, nothing. It turned out it was silently failing to find the python3-dev package (the stuff you need to compile C/C++ extensions on Ubuntu). The kicker? There was no way to abort it other than killing the process, which left the virtual environment in an unusable state.
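For what it's worth, the eventual fix boils down to installing the build dependencies first. This is a sketch for Ubuntu: python3-dev comes straight from the diagnosis above, while libpq-dev is my addition, since the psycopg2 build also wants the Postgres client headers.

# headers needed to compile psycopg2 on Ubuntu/Debian
sudo apt-get install python3-dev libpq-dev
# inside a fresh virtualenv (the hung install trashed the old one),
# this should now finish instead of hanging
pip install psycopg2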
2. Applying migrations
I'm using Django 1.7, which has built-in migration support. Those migrations don't port over. I haven't attempted to figure out whether that's because the syntax used in the migration definitions is deliberately DB-specific, or because the process is simply buggy. Only the first few migrations applied; the failures might be down to the difference in how table alterations are handled between SQLite and Postgres. Regardless, I ended up starting the migration sequence all over again.
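For reference, "starting over" amounted to throwing away the generated migration files and regenerating them against the new backend. Roughly this, with myapp standing in for the actual app name:

# delete the generated migrations, keeping the package marker
find myapp/migrations -name "*.py" -not -name "__init__.py" -delete
# regenerate an initial migration from the current models
python manage.py makemigrations myapp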
3. Moving over the data
Theoretically, this should be possible to do by:
- Exporting the data:
python manage.py dumpdata > database.json
- Creating a new Postgres database, amazing_pgsql_db
- Updating settings.py from something like:

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': 'amazing_sqlite_db',
    }
}

to:

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': 'amazing_pgsql_db',
        'USER': 'amazing_pgsql_user',
        'PASSWORD': 'amazing_pgsql_password',
        'HOST': '127.0.0.1',
        'PORT': '5432',
    }
}
- Syncing the new database with the models (or maybe applying migrations, but that didn't work for me, see above). I ended up with:
python manage.py syncdb
- Flushing the new db after syncing:
python manage.py flush
- Loading the data back in:
python manage.py loaddata database.json
And ta-da! Everything should work.
Well, no. Hiccups.
The first: syncdb asks me to create a superuser, since I'm using Django's authentication functionality. I shouldn't, because the existing one will be ported over from the old DB.
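In hindsight, the prompt is avoidable: syncdb takes a --noinput flag that suppresses interactive prompts, the superuser one included:

python manage.py syncdb --noinput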
The second, and more annoying one: flushing apparently doesn't quite cover it. The solution? Manual deletion. There were more tables I needed to empty than the one in the linked SO answer, but that was the general idea.
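For the record, the table in question is almost certainly django_content_type, which syncdb pre-populates and which then collides with the incoming fixture. A minimal sketch of the deletion from the Django shell, assuming content types are the offender (my list of tables to empty was longer):

python manage.py shell
>>> # wipe the auto-generated content types so loaddata can insert its own
>>> from django.contrib.contenttypes.models import ContentType
>>> ContentType.objects.all().delete()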
Moral of the story? If you're going to end up using Postgres, just use Postgres from the start.