Deployment Framework Technical Overview
This page explains some of the more technical aspects of how the deployment framework is structured, and how it is expected to be used. See also the General Overview.
The `fabfile.py` Script
Before describing any of the Rattail deployment specifics, it is worth noting that this system relies 100% upon Fabric. As such, everything starts with the fabfile.py script. The good news is that (with minor exceptions outlined below) the fabfile.py script should be a "normal" Fabric script. So we won't go over how Fabric scripts work here; you're expected to know something of that already.
In general these docs always assume certain tasks will be present within a Fabric script:
- `bootstrap_all` - Install/upgrade/configure all software such that the server is fully operational.
- `production_sync` - Perform a full data sync (as necessary) from the "current" (live) production server, onto the target server.
In practice you'd usually have more tasks than that in a given script; it depends on your needs. I typically split the bootstrap process into multiple components, e.g. a bootstrap_system and bootstrap_rattail and maybe bootstrap_samba etc. Then the bootstrap_all task would just call all the other ones.
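That delegation pattern can be sketched in plain Python (task names follow the examples above; the bodies are placeholders, and in a real fabfile each function would be decorated with Fabric's @task):

```python
# Sketch: bootstrap_all simply runs each component bootstrap task in
# order.  Task names follow the examples above; bodies are placeholders.
def bootstrap_system():
    return 'system'

def bootstrap_rattail():
    return 'rattail'

def bootstrap_samba():
    return 'samba'

def bootstrap_all():
    # In a real fabfile these would each be Fabric @task functions.
    return [bootstrap_system(), bootstrap_rattail(), bootstrap_samba()]
```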
It should be noted that the production_sync task in particular is not yet "finished" in its design. It is mentioned here because the goal is to have a single task to accomplish the full data sync, i.e. backup/restore databases and rsync of relevant file paths. But the details are not yet as polished as the author would like; experimentation is underway as of this writing.
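Since the design is still in flux, here is only a rough sketch of the kind of commands such a task might ultimately run. This is a guess at the shape of the thing, not the actual implementation; the host, database, and path names are made up for illustration:

```python
# Hypothetical sketch: build the commands a production_sync task might
# run, i.e. a database dump/restore plus rsync of relevant file paths.
# All names here are made up for illustration.
def production_sync_commands(production_host, dbname, paths):
    commands = [
        # dump the database on the live production server, restore locally
        'ssh {0} pg_dump {1} | psql {1}'.format(production_host, dbname),
    ]
    for path in paths:
        # pull relevant file paths down from production
        commands.append('rsync -az {0}:{1} {1}'.format(production_host, path))
    return commands

for command in production_sync_commands('prod.example.com', 'rattail', ['/srv/data/']):
    print(command)
```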
Fabric API Functions
Perhaps the most "basic" piece of the Rattail deployment framework is the rattail.fablib subpackage, which contains various functions designed for use with Fabric scripts. Some modules are dedicated to certain software systems (e.g. rattail.fablib.postgresql) while others may be more generic.
The Fabric API itself is quite good for basic server maintenance; in general the Rattail API only adds yet more convenience but doesn't do anything very revolutionary. For a simple example, consider this Fabric code:
```python
from fabric.api import task, sudo

@task
def install_pg():
    sudo('apt-get install postgresql')
```
Since `apt-get install` is something you'll probably be doing a lot of, there is a shortcut:
```python
from fabric.api import task
from rattail.fablib import apt

@task
def install_pg():
    apt.install('postgresql')
```
The motivation behind these shortcuts is two-fold. On the one hand a shortcut is (or should be) simpler to read, but it also allows "best practices" to be consolidated and shared. For instance the apt.install() function above takes extra precautions to ensure that it runs in a fully-automated way, and does not stop to prompt the user for anything.
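For example, the underlying command might look something like the following. This is only a sketch of the idea, not the actual rattail.fablib implementation:

```python
# Sketch of the sort of command a "fully-automated" apt shortcut might
# run under the hood: non-interactive frontend, no confirmation prompt.
def apt_install_command(*packages):
    return ('DEBIAN_FRONTEND=noninteractive '
            'apt-get --assume-yes install ' + ' '.join(packages))

print(apt_install_command('postgresql'))
```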
Currently the best way to see what's available from the rattail.fablib API is to look through the source code. There really isn't much there, in fact. However one function in particular is among the most useful and deserves some explanation, next.
The `deploy()` Function
This is not a "normal" function per se; in fact there is not actually a deploy() function within the rattail.fablib package. However, a typical server bundle will make frequent use of a deploy() function. So, what's that about? First, the why:
A common need in server maintenance is to push a config file to the server, overwriting any which may have previously existed there. In many cases this file may be considered "static" and is committed to the source repo in the same state as it will exist on the server (i.e. same content). However you may need certain config values within the file to be determined dynamically instead. In this case you'll need to commit only a "template" to the repo, and during deployment this template will be merged with dynamic values, and the result should be pushed to your target server.
There also is the question of how to (most easily) indicate within the server bundle code, where exactly your local copy of the file resides. As a general rule, it's probably most convenient for the static/dynamic config files to reside within the bundle itself, although there are exceptions (e.g. a common .gitconfig that you wish to install to all servers).
These two reasons are why there is a special deploy() function. It provides a single, flexible way to push files to the server whether they are "static" or dynamic. And, it provides a way to essentially keep local file paths "relative" when specified in code. So let's get on to how to use it:
Consider the following basic server bundle file structure:
```
|-- fabfile.py
`-- deploy
    `-- rattail
        |-- rattail.conf.template
        `-- filemon.conf
```
This structure is not complete; see the Server Bundle section below for more about what the structure should actually look like. Here we're only really interested in what's in the 'deploy' folder, which happens to be two config files under a 'rattail' subfolder. You'll notice that one of the file names ends with '.template' - that is a special suffix which will cause the file to be merged with the running Fabric environment before being pushed to the server, instead of being pushed as-is (i.e. as a "static" file).
Within the fabfile.py script then, might be the following code:
```python
from fabric.api import task
from rattail.fablib import make_deploy, mkdir

deploy = make_deploy(__file__)

@task
def configure_rattail():
    mkdir('/etc/rattail')
    deploy('rattail/rattail.conf.template', '/etc/rattail/rattail.conf')
    deploy('rattail/filemon.conf', '/etc/rattail/filemon.conf')
```
Here you can see where the deploy() function actually comes from: It is created dynamically with the fabfile.py path as its "context". This is how we can then specify relative file paths when calling deploy() within the task. In other words, the deploy() function itself considers its "working directory" to be the 'deploy' folder which exists alongside the 'fabfile.py' script. It is important to note that this much is effectively hard-coded, meaning if you changed the 'deploy' folder name to say, 'server-files', then you would break the magic.
The other piece of magic here involves the static vs. dynamic behavior. In the case of filemon.conf it will upload the local file as-is to the server, but for rattail.conf it will first generate a "final version" of the file which it then pushes to the server. This is merely a convenience wrapper of sorts; under the hood it calls either the Fabric put() or upload_template() function.
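The path-resolution part of that magic can be sketched in plain Python. This is a simplified illustration only; the real make_deploy() in rattail.fablib wraps Fabric's upload functions rather than just reporting paths:

```python
import os

# Simplified sketch of make_deploy(): resolve local paths relative to
# the 'deploy' folder next to the fabfile, and note static vs. template.
def make_deploy(fabfile_path):
    deploy_dir = os.path.join(
        os.path.dirname(os.path.abspath(fabfile_path)), 'deploy')

    def deploy(local_path, remote_path):
        full_path = os.path.join(deploy_dir, local_path)
        # A '.template' suffix means "merge with env, then upload";
        # anything else is uploaded as-is.  (Here we just report which.)
        return full_path, local_path.endswith('.template')

    return deploy

deploy = make_deploy('/srv/bundle/fabfile.py')
print(deploy('rattail/rattail.conf.template', '/etc/rattail/rattail.conf'))
# → ('/srv/bundle/deploy/rattail/rattail.conf.template', True)
```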
The `fabenv.py` Script
Another distinguishing feature of a Rattail-style server bundle is the presence of a fabenv.py script. This is a simple Python module whose purpose is only to set local/custom attributes in the Fabric environment, for use in generating dynamic files to be pushed to the server, and in some cases conditional logic etc. Such a file might look like:
```python
from fabric.api import env

env.password_postgresql_rattail = 'somepassword'
env.password_mssql_locsms = 'anotherpassword'
```
This file must be imported by the fabfile.py script explicitly; there is no magic to avoid that. To further flesh out the 'deploy' example from earlier:
```python
from fabric.api import task
from rattail.fablib import make_deploy, mkdir

# Note: This line is the only difference.
import fabenv

deploy = make_deploy(__file__)

@task
def configure_rattail():
    mkdir('/etc/rattail')
    deploy('rattail/rattail.conf.template', '/etc/rattail/rattail.conf')
    deploy('rattail/filemon.conf', '/etc/rattail/filemon.conf')
```
As is implied with this syntax, the fabenv.py script must exist alongside the fabfile.py script. This also means that you must run the Fabric script from within its own directory.
Now that the fabenv.py script is being imported, the rattail.conf.template file is free to leverage any of its values by name, e.g.:
```ini
[rattail.db]
default.url = postgresql://rattail:%(password_postgresql_rattail)s@localhost/rattail
```
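The merge step itself amounts to Python %-style string interpolation against the environment values (Fabric's upload_template() uses %-interpolation by default). With the example password from fabenv.py above, the rendering works like:

```python
# Sketch of the template merge: %-style interpolation against the
# Fabric env values (password value is the example from fabenv.py).
template = ("default.url = postgresql://rattail:"
            "%(password_postgresql_rattail)s@localhost/rattail")
env = {'password_postgresql_rattail': 'somepassword'}
print(template % env)
# → default.url = postgresql://rattail:somepassword@localhost/rattail
```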
Given the nature of the contents of the fabenv.py script, it is important that it never be committed to the source repo. In fact the typical repo will have a VCS exclusion rule so that all files with this name are permanently ignored. However it also is good practice to maintain a "template" for the fabenv.py script, so that it's at least easier to quickly see which passwords etc. you do need to set when preparing the bundle for real use. This template should then be committed to the repo; it is usually named fabenv.py.dist by convention. Its contents are the same as fabenv.py, except the actual passwords are blanked out etc.
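Such a fabenv.py.dist file might look like the following, mirroring the fabenv.py example above with the values blanked out:

```python
from fabric.api import env

# Copy this file to fabenv.py and fill in real values.
env.password_postgresql_rattail = ''
env.password_mssql_locsms = ''
```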
The Server Bundle
The term "bundle" just refers to the collection of logic and data required to install and configure a server. In practice it is expected that this collection resides within a single folder (albeit with at least one subfolder, 'deploy'). One sort of exception to this rule is that some logic which is used by the server bundle's code, may reside in other Python packages (e.g. rattail). But in all cases the bundle itself is the "final authority" for the configuration of its associated server.
Note then that a key concept here is the one-to-one association between server and bundle. There is one bundle per server, and one server per bundle, always.
The "bundle" is also just a single term which "puts it all together" as far as the various concepts described above. Meaning, within a bundle you will have a fabfile.py and a fabenv.py, and you will call various Fabric API functions, and have a deploy subfolder with various config files, and you'll call the deploy() function to push them to the server, etc.
Some of the file structure was outlined above, but here is a complete (minimal) example:
```
/home/lance/src/myfab/servers/host/
|-- deploy
|   |-- ssh
|   |   `-- id_rsa
|   `-- rattail
|       |-- rattail.conf.template
|       `-- filemon.conf
|-- fabenv.py
|-- fabenv.py.dist
`-- fabfile.py
```
That's it! Not much more than you've already seen, really. Again, the fabenv.py (if present) is a local-only file and never committed to the repo. The id_rsa file above is meant to show an example of a private/secure file which you also would not wish to commit to the repo. However you can see that as long as the file exists there locally, you would be able to call deploy() on it just like any other file. In practice then, such a file should also be explicitly ignored by VCS to avoid accidental commits.