The super stupid idiot's guide to getting started with Django, Pipeline, and S3

When it comes to dev ops I'm an idiot. In fact, I'm a super stupid idiot. After spending the majority of yesterday tinkering with the Django configuration for Backstep, I came to two conclusions:

a. Bobby need to be smarter-er when come to dev ops
b. There is no super stupid idiot's guide to getting started with Django, django-pipeline, and S3.

So in the spirit of sounding like I'm better at dev ops than I am, I've decided to dedicate this post to getting the aforementioned stack set up.


Pipeline

Of course, step 1 is having Django set up and configured. This guide assumes that you have - at the very least - a complete Django installation. Now we're going to go ahead and install django-pipeline. Pipeline is pretty great because it lets you specify your static files in settings.py and then just works. With only a little additional configuration, you can set up Pipeline to compile your LESS files, compress your JavaScript, and make you a sandwich! Who doesn't want a sandwich?!

Food aside, after installing Pipeline, add pipeline to your INSTALLED_APPS and set it as a storage backend with the configuration variable STATICFILES_STORAGE = 'pipeline.storage.PipelineStorage'. The only thing to watch out for is Pipeline's PIPELINE_DISABLE_WRAPPER config variable. By default it's False, which means each of your JavaScript files will automatically be wrapped in an IIFE (immediately-invoked function expression), hiding any variables you declared in the global scope. The idea is noble - everyone knows that global variables = bad - but in practice some design patterns rely on global variables. For instance, if you're using Backbone.js like me, it's common practice to define models in the global scope. Therefore it might be beneficial to set PIPELINE_DISABLE_WRAPPER = True. Your call.
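In settings.py, that all ends up looking something like the sketch below. The group name ('base'), the file paths, and the LESS compiler line are made-up examples to show the shape of the config; the LESS bit also assumes you have the lessc binary installed.

```python
# settings.py -- example Pipeline configuration.
# Group names and file paths here are hypothetical; swap in your own.

INSTALLED_APPS = (
    # ... your other apps ...
    'pipeline',
)

STATICFILES_STORAGE = 'pipeline.storage.PipelineStorage'

# Keep variables declared in the global scope global
# (e.g. for Backbone models).
PIPELINE_DISABLE_WRAPPER = True

PIPELINE_CSS = {
    'base': {
        'source_filenames': (
            'less/base.less',
        ),
        'output_filename': 'css/base.min.css',
    },
}

PIPELINE_JS = {
    'base': {
        'source_filenames': (
            'js/models.js',
            'js/views.js',
        ),
        'output_filename': 'js/base.min.js',
    },
}

# Needed for the LESS example above; requires the lessc binary on PATH.
PIPELINE_COMPILERS = (
    'pipeline.compilers.less.LessCompiler',
)
```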


Static Files

Now that Pipeline is set up, let's make sure that static files are served correctly. To do so, set the following variables in settings.py:

STATIC_ROOT = 'staticfiles/'
STATIC_URL = '/static/'
MEDIA_ROOT = 'uploads/'
MEDIA_URL = '/media/'

Your static files will now be served from the staticfiles directory at the URL /static/. Next, we're going to make sure that urls.py knows how to serve the static files. Go ahead and add the following to your urlpatterns:

urlpatterns += patterns('',
    (r'^static/(?P<path>.*)$', 'django.views.static.serve', {'document_root': settings.STATIC_ROOT}),
)

Phew! Now, all my fellow stupid idiots, it's around this point that I would typically start getting lost in tutorials, so let me recap what we have so far. At this point, you should have django-pipeline installed and configured, and your static files should be available at the URL /static/<file>. If you want to go ahead and test this, place a file called, oh I don't know, how about bobbys_great_tutorial, in the root of your staticfiles directory and try to access it at the URL /static/bobbys_great_tutorial.


AWS

Ok, on to the fun part: setting up AWS. Disclaimer: the way I did this may be sub-optimal, but I can vouch for the fact that it gets the job done. First, you're gonna need an AWS account, so go ahead and sign up for that. Next, we want to set up an IAM user so that we're not using the keys associated with the primary billing account for API calls and the like. To do so, just expand the Services dropdown in the upper left corner and select IAM. Once inside the IAM portal, select the Users tab from the left sidebar and then click the Create New Users button. Follow the prompts to create a new IAM user.

Now we want to set up what S3 calls a bucket (basically just a top-level container for your files). Expand the Services dropdown again and this time select S3. Click the Create Bucket button and follow the prompts to create a new bucket. Finally, we're all set, right? Right?! No, unfortunately. Now we get to deal with permissions. By default, S3 blocks all access to bucket contents, so we have to change the suffocating access control.

To do so, first click the Properties button in the top right corner. You should then be presented with a side panel containing - duh - properties for the bucket.
[screenshot: bucket properties panel]
Now go ahead and expand the Permissions tab. Click the Edit bucket policy button and open up the policy generator from the resulting popup.
[screenshot: policy generator]
From the policy screen, we're going to choose the IAM policy from the policy dropdown and then choose the appropriate selections for the rest of the form (it should be pretty straightforward, even to us super stupid idiots). By the end, your form should look something like this
[screenshot: policy form]
where <bucket-name> is the name of the bucket you created earlier. Now click the Generate Policy button and copy the resulting text; we'll paste it into the bucket policy editor. Go back to your S3 screen and - in the still-open bucket policy editor - paste the text. If you click Save now, you'll get an error about a missing "Principal" element. The principal basically determines who is allowed access. So, for the purposes of brevity, just grant access to all actions for all users by adding the following Principal to the Statement object (you should change this in the future, since you don't want just anybody to have access to all actions in your bucket; refer to my bucket policy screenshot above for a better policy description):

"Principal": {
	"AWS": "*"
},
...
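For reference, the assembled policy ends up looking something like the example below. The Sid is arbitrary, <bucket-name> is still your bucket's name, and the wildcard Principal and Action are the wide-open "just make it work" version we just discussed - tighten them later.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowEverythingForNow",
      "Effect": "Allow",
      "Principal": {
        "AWS": "*"
      },
      "Action": "s3:*",
      "Resource": "arn:aws:s3:::<bucket-name>/*"
    }
  ]
}
```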

Adding S3 As A Storage Backend

Hang in there, buddy, we're getting close. The next step is setting up S3 as a storage backend so Pipeline knows about it. To do so we'll need both the django-storages package and the boto library. After adding both to our requirements.txt and pip installing them, add storages to your INSTALLED_APPS, which should now include at least

INSTALLED_APPS = (
    ...
    'pipeline',
    'storages',
    ...
)

Now it's just a matter of setting some configuration variables in settings.py.

AWS_ACCESS_KEY_ID = '<AWS ACCESS KEY FOR IAM USER>'
AWS_SECRET_ACCESS_KEY = '<AWS SECRET ACCESS KEY FOR IAM USER>'
AWS_STORAGE_BUCKET_NAME = '<AWS BUCKET NAME>'
DEFAULT_FILE_STORAGE = 'storages.backends.s3boto.S3BotoStorage'
STATICFILES_STORAGE = 'leftbehind.apps.matchmaker.utils.S3PipelineStorage'
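That last STATICFILES_STORAGE line points at a custom storage class that glues Pipeline's packaging onto S3 storage; the dotted path is just where the class lives in my project, and yours will differ. A minimal sketch of such a class, assuming django-pipeline's PipelineMixin and django-storages' S3BotoStorage backend:

```python
# e.g. myproject/utils.py -- a hypothetical location; point
# STATICFILES_STORAGE at wherever you actually define this class.
from pipeline.storage import PipelineMixin
from storages.backends.s3boto import S3BotoStorage


class S3PipelineStorage(PipelineMixin, S3BotoStorage):
    """Run Pipeline's compile/compress steps, then store the results on S3."""
    pass
```

With that class in place, collectstatic will run Pipeline's processing and push the output up to your bucket.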

Wrapping Up

Oh. My. God. Make it stop. I know that's what you're thinking, my fellow stupid idiot, and we're almost there. The only thing left to do is point Django at the URL where your bravely-created S3 bucket serves its files. To do so, simply set

STATIC_URL = 'https://<bucket domain>/{aws_bucket}/'.format(aws_bucket=AWS_STORAGE_BUCKET_NAME)
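To make the formatting concrete, here's how that line resolves with a hypothetical bucket name, assuming the generic s3.amazonaws.com endpoint (your bucket's domain may differ depending on its region):

```python
# Hypothetical bucket name, just to show how STATIC_URL gets built.
AWS_STORAGE_BUCKET_NAME = 'bobbys-great-bucket'

STATIC_URL = 'https://s3.amazonaws.com/{aws_bucket}/'.format(
    aws_bucket=AWS_STORAGE_BUCKET_NAME)

print(STATIC_URL)  # https://s3.amazonaws.com/bobbys-great-bucket/
```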

Ok! That's it! If you made it this far and didn't just do this after the first paragraph
[gif: walking away]
much respect. As a dev ops super stupid idiot comrade, your dedication inspires me. Now get back out there and create something great with your shiny static files architecture in place.


Notes

  • Make sure you add settings.py to your .gitignore file to keep it out of source control. We don't want Joe Schmoe (Shmoe? Shmo?) getting hold of your personal AWS keys.