This article continues the series on building a continuous deployment environment using Python and Django.
- Starting Your First Django Project
- Testing and Django
- Mock and Coverage
- Using Fabric for Painless Scripting
- Using Celery to handle Asynchronous Processes
- Deployment/Monitoring Strategies
If you have been following along, you now have the tools to set up a Python/Django project and fully test it. Today we will be discussing the Fabric package, which is a Python (2.5 or higher) library and command-line tool for streamlining the use of SSH for application deployment or systems administration tasks.
I found Fabric so easy to use, and it streamlined my deployment process so much, that I now use it to deploy all my projects (even legacy projects that aren't written in Python).
Getting ready
To install Fabric, enter your virtualenv and call:
pip install fabric
This will install Fabric and set up the CLI command fab, which looks for and executes the fabfile.py in the current working directory.
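You can confirm the install worked from inside your virtualenv (Fabric 1.x ships a standard --version flag):
fab --version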
How to do it…
Fabric works by executing code defined by functions in the fabfile.py. For example, if you type:
fab deploy
Fabric will look for the function deploy in the fabfile.py of the current working directory.
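If you want to try this before writing a real deployment script, a minimal fabfile.py might look like the following sketch (the echo is just a placeholder task body):
from fabric.api import local

def deploy():
    # placeholder task: run a shell command on the local machine
    local('echo "deploying..."')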
Fabric can also be used to call multiple functions, such as a function that sets up the environment before calling deploy (functions are executed left to right):
fab production deploy
Fabric simplifies running CLI commands, both for local and remote machines. The Fabric API uses an internal env variable to manage information for connecting to remote servers. I recommend setting up a function in the fabfile.py for each different type of remote server. For example, if you have two production servers and a single Redis server, then you might define the following:
from fabric.api import env

def prod():
    env.user = 'myUserNameToLoginToServers'
    env.hosts = [
        'server1.myserver.com',
        'server2.myserver.com',
    ]
    env.otherUsefulInfo = 'whateverYouMightNeedDefinedForYourScripts'

def redis():
    env.user = 'myUserNameToLoginToServers'
    env.hosts = [
        'redis1.myserver.com',
    ]
    env.REDIS_PORT = 1234
The user is the user name used to log in remotely to the servers, and hosts is a list of hosts to connect to. Fabric will use these values to connect remotely for use with the run and sudo commands. It will prompt you for any passwords needed to execute commands or connect to machines as this user. Any additional variables defined in the host function will be available in subsequent functions that leverage that host, such as deploy in our example.
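For instance, a later task can read back any extra values stashed on env. A small sketch, assuming the redis function above is called first (fab redis check_redis) and that redis-cli is installed on the remote host:
from fabric.api import env, run

def check_redis():
    # env.REDIS_PORT was set by the redis() host function
    run('redis-cli -p %s ping' % env.REDIS_PORT)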
Our deploy function commits code locally, then runs several scripts on each server to update them (fab prod deploy):
from fabric.api import local, lcd, cd, run, sudo
from fabric.decorators import runs_once

@runs_once
def commit_code():
    with lcd('path_to_local_directory'):  # lcd, not cd: only lcd affects local()
        local('git push origin master')  # push local to repository

def update_remote():
    with cd('path_to_remote_directory'):
        run('git pull origin master')  # pull from repository to remote

def restart():
    sudo('/etc/init.d/apache2 restart')  # restart server, such as apache

def deploy(push_code=False):
    if push_code:
        commit_code()
    update_remote()
    restart()
How it works…
After installation, you can use Fabric by defining a fabfile.py and running fab func1 func2 … from the CLI. Generally, the first function sets up the environment and the additional function(s) do something against that environment. Fabric uses fabric.api.env to handle properties for connecting remotely. While you can hard-code these values at the top level of a fabfile.py, so that all functions use the same environment, it is more versatile to set up functions that initialize environments, such as prod and redis.
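For completeness, the hard-coded alternative looks like this sketch; every task in the fabfile.py would then share the same environment:
from fabric.api import env

# module-level assignments apply to every task in this fabfile.py
env.user = 'myUserNameToLoginToServers'
env.hosts = ['server1.myserver.com', 'server2.myserver.com']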
The prod function sets up the environment for our production servers, and then the deploy function is executed against each host defined by prod. I usually break the logic of a conceptually complex function like deploy into discrete functions for each task. Then deploy will execute each of the discrete tasks, but each discrete task can also be run independently as necessary (e.g. fab prod restart to only restart the server).
Using the @runs_once decorator on a function will ensure that no matter how many hosts you have defined (that is, how many times deploy is called), the function will only execute one time. Thus you don't need to write any special logic for tasks that should only execute the first time, such as pushing code to your repository.
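As a quick illustration (with hypothetical hosts), the tagging task below runs a single time per fab invocation, even though Fabric normally calls a task once per host:
from fabric.api import env, local
from fabric.decorators import runs_once

env.hosts = ['server1.myserver.com', 'server2.myserver.com']

@runs_once
def tag_release():
    # executes once per invocation, not once per host
    local('git tag -f release')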
To set the working directory that a command should run against, use with and the cd function, passing the directory as the only argument to cd (note that cd affects only remote commands; use the companion lcd function for local paths). Inside the with statement, use the local command to execute statements locally, the run command to execute statements remotely, and the sudo command to execute remote statements as the superuser.
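A short sketch contrasting the two context managers (the directory names are placeholders):
from fabric.api import lcd, cd, local, run

def sync_static():
    with lcd('path_to_local_directory'):  # affects local() only
        local('tar czf static.tar.gz static/')
    with cd('path_to_remote_directory'):  # affects run() and sudo() only
        run('ls -la')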
Lastly, you may pass arguments into Fabric functions by using a colon after the function name, with each argument separated by a comma: fab prod deploy:arg1,arg2,arg3,… There should be no spaces between arguments. By default, the deploy function in this example will not push the local code, but that can be changed by calling:
fab prod deploy:1
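Keep in mind that CLI arguments arrive as strings, so even fab prod deploy:0 would be truthy inside the function; Fabric 1.x also accepts keyword syntax (fab prod deploy:push_code=1). If that matters, a variation of deploy could coerce explicitly:
def deploy(push_code=False):
    # CLI arguments arrive as strings; treat '0', 'false', and '' as False
    if str(push_code).lower() not in ('false', '0', ''):
        commit_code()
    update_remote()
    restart()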
There's more…
This article is meant only to introduce you to Fabric and provide easy steps to get started. For the full details, see the Fabric 1.2 documentation.