Development to Deployment in Django
One of the most important things I've done to improve my experience working with Django is to develop a consistent pipeline between development and deployment. Here I'll quickly outline some of the things I've done to make the experience easier, but I'm curious to hear about the techniques that you use as well.
local_settings.py++
One of the best tricks for working with Django deployment is to end your settings.py file with these four lines:

try:
    from local_settings import *
except ImportError:
    pass
Doing that lets you override settings.py with the contents of local_settings.py. For a public, shareable project you can't take this much further than providing a local_settings.py template which users can customize, but for a private project where you control both the production environment and the development environment, you can take this approach a bit further.
I create a production_settings.py file with settings for the production servers, and a devel_settings.py for local development. I add both of them to version control, and then create a local_settings.py for my local checkout that looks like this:

from devel_settings import *

while for my production checkout local_settings.py looks like:

from production_settings import *
Finally, I make sure that local_settings.py is ignored by version control.
I like this setup because, like the standard local_settings.py trick, it makes it easy to modify shared settings, but it also makes it easy to modify the production settings without SSHing into the production server, and it is simpler than maintaining separate production and development branches in the repository (which requires a lot of pointless merging).
devel_settings.py and production_settings.py
My devel_settings.py file almost always looks exactly the same regardless of the project it is part of. Yours may look a bit different, but I find that it saves a few precious moments of thinking to just keep a standardized project template somewhere (instead of using django-admin.py startproject to create new projects). Mine looks like this:
# Settings for devel server
import os
ROOT_PATH = os.path.dirname(__file__)
DEBUG = True
TEMPLATE_DEBUG = DEBUG
COMPRESS = False # django-compress setting
CACHE_BACKEND = "locmem:///"
DATABASE_ENGINE = 'sqlite3'
DATABASE_NAME = os.path.join(ROOT_PATH, 'devel.sqlite')
DATABASE_USER = ''
DATABASE_PASSWORD = ''
DATABASE_HOST = ''
DATABASE_PORT = ''
MEDIA_ROOT = os.path.join(ROOT_PATH, 'media')
MEDIA_URL = 'http://127.0.0.1:8000/media/'
ADMIN_MEDIA_PREFIX = '/media/admin/'
The production_settings.py file varies a bit more, since it depends on the Postgres and Memcached setup, but very roughly it looks something like this:
# Django settings for codernode.com project
COMPRESS = True # django-compress setting
COMPRESS_VERSION = True # django-compress setting
DEBUG = False
TEMPLATE_DEBUG = DEBUG
ADMINS = (('Mr Admin','admin@example.com'),)
MANAGERS = ADMINS
CACHE_BACKEND = "memcached://127.0.0.1:11211"
EMAIL_SUBJECT_PREFIX = "[My Project]"
SERVER_EMAIL = "django@example.com"
DEFAULT_FROM_EMAIL = 'info@example.com'
DATABASE_ENGINE = 'postgresql_psycopg2'
DATABASE_NAME = 'some_database'
DATABASE_USER = 'some_user'
DATABASE_PASSWORD = 'some_password'
DATABASE_HOST = ''
DATABASE_PORT = ''
TIME_ZONE = 'America/Chicago'
LANGUAGE_CODE = 'en-us'
SITE_ID = 1
USE_I18N = True
MEDIA_ROOT = '/path/to/media/root/'
MEDIA_URL = 'http://example.com/media/'
ADMIN_MEDIA_PREFIX = '/media/admin/'
TEMPLATE_DIRS = ()
Yours will likely look quite different.
A Tale of Three Repositories
The foundation of my development-deployment pipeline is three repositories (it's a bit simpler using distributed version control, but you could accomplish more or less the same with Subversion or CVS).
1. A master repository hosted in a non-local and non-production location. I use my Slicehost slice that hosts my blog for all my private git repositories (it has daily and weekly backups, and if it does lag or crash, I can fix it myself), but you could use a paid GitHub account or one of the other similar services.
2. A local repository for each developer, which they push out to the master server at regular intervals.
3. A production repository on each production machine serving the project.
Developers do most of their work on their local machines, and push it to an appropriate branch on the master repository. They also pull from the master server occasionally to keep the repositories synchronized.
The production repositories are kept up to date in one of three ways:
1. Using Fabric to command them to pull the newest changes. This is the best option for large, medium, and small deployments. That is, this is always the best option. Don't bother reading the others.
2. SSHing into the production servers and manually pulling the change set. Alternatively, you could write a batch script that does this. This approach is more than sufficient for a one-machine deployment, but for anything more complex you'll want a real deployment solution.
3. Setting up a cronjob that pulls the changes periodically. This is the simplest and laziest of the approaches, but it is a bad idea in most circumstances.
Using this setup, pushing a change from development out to the production servers is a lovable two steps:

git push
fab deploy
I highly recommend adding a fab revert hash-to-revert-to command to the mix as well, for those awkward moments when your deployment doesn't go quite as well as you hoped.
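For the curious, a fabfile along those lines can stay very small. This is only a sketch under some assumptions of mine (Fabric's run/cd/sudo helpers, a production checkout living at /home/django/project, and Apache being reloaded after each pull); your hosts, paths, and reload command will differ:

# fabfile.py -- deploy/revert tasks (a sketch; hosts and paths are hypothetical)
from fabric.api import cd, env, run, sudo

env.hosts = ['example.com']
env.user = 'your-username'

PROJECT_DIR = '/home/django/project'  # hypothetical location of the production checkout

def deploy():
    """Pull the newest changes onto the production checkout and reload Apache."""
    with cd(PROJECT_DIR):
        run('git pull')
    sudo('/etc/init.d/apache2 reload')

def revert(commit):
    """Roll production back to a known-good commit, e.g. fab revert:abc123"""
    with cd(PROJECT_DIR):
        run('git reset --hard %s' % commit)
    sudo('/etc/init.d/apache2 reload')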
Production Server Setup
I keep my directory layouts very simple and uniform. All production servers have a django user whose password has been disabled and who doesn't have SSH access. The developers have accounts on the production machines and are members of the django group, and thus can modify all of the django user's files (having SSH set up to work without passwords will save days of your life).
I organize libraries into folders based on the version control system they use, so I'll typically have a git folder and an svn folder, but I might have an hg or bzr folder as well depending on the libraries being used. (Okay, this is a lie, I can't remember ever having a bzr folder, but I wanted to be inclusive.) Then I symlink them all into the /usr/lib/python2.?/site-packages/ folder (I've been using Python 2.5 pretty exclusively of late, but I imagine in the next year or two I may move up to 2.6 if there are any compelling performance gains).
I prefer linking directly from the checkouts (as opposed to using setup.py to install the libraries), since it makes it easier to update the libraries across all production servers at a later point if necessary.
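If you want to script that linking step rather than typing ln -s by hand, a tiny helper along these lines does the job; the paths and package names here are made up, so adjust them to your own layout:

# link_checkouts.py -- symlink library checkouts into site-packages (hypothetical paths)
import os

SITE_PACKAGES = '/usr/lib/python2.5/site-packages'

# Map each importable package name to the package directory inside its checkout.
CHECKOUTS = {
    'django': '/home/django/git/django/django',
    'compress': '/home/django/git/django-compress/compress',
}

for name, source in CHECKOUTS.items():
    target = os.path.join(SITE_PACKAGES, name)
    if not os.path.exists(target):
        os.symlink(source, target)
        print 'linked %s -> %s' % (target, source)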
For serving media I symlink the necessary folders from the project/application repositories into /var/www/example.com/media/ and let Nginx handle serving it. (I'm still quite happy with my Nginx/Apache2/mod_python stack, even if it has fallen out of favor with the coolest kids for WSGI.)
Testing...
There are a lot of creative things you can do by integrating testing into the mix (having the deploy script push changes to production only if all the tests pass, or using a post-commit hook on the master repository to run the test battery and email developers if any tests fail, etc.), but I'm still at a stage where I run tests manually. This will probably be the next area where I start improving my setup.
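As a small first step in that direction, the fabfile sketched earlier could refuse to deploy unless the test suite passes locally. Again, this is only a sketch, assuming Fabric's local() helper and a manage.py in the project root:

# fabfile.py additions -- gate deployment on a passing test suite
from fabric.api import local

def test():
    """Run the test suite on the developer's machine."""
    local('python manage.py test')  # aborts the fab run if any test fails

def safe_deploy():
    """fab safe_deploy -- push and deploy only when the tests pass."""
    test()
    local('git push')
    deploy()  # the deploy task from the earlier sketch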
Do you use other tricks that I haven't mentioned here? Or perhaps have a better overall design? I'm curious to see how others approach this universal problem.