
HOW-TO: Capistrano supporting tasks

All Capistrano supporting tasks require at least the deploy <STAGE_NAME> as a parameter, either 'production' or 'staging' (e.g. cap production status).

Note:

Do not use the usual cap <STAGE_NAME> deploy task, or any other cap task except doctor and the ones listed below, even though the others are disabled and mostly harmless. (The automated build pipeline setup makes them pointless.)

log

Tail the specified log file in the shared path, or production.log if none is given.

(Press CTRL-C to stop following the file.)

$> cap <STAGE_NAME> log:tail[kind]

Parameters

  • kind:
    • "api" => tail the API log for the specified <STAGE_NAME>
    • "api_audit" => tail the API audit log for the specified <STAGE_NAME>
    • nil => tail the app log for the specified <STAGE_NAME>
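
As a rough idea of what happens behind the scenes, here is a minimal sketch of how such a log:tail task could be written with the standard Capistrano DSL; the log file names and the shared log path used below are assumptions, not taken from this page:

namespace :log do
  desc 'Tail a log file in the shared path (production.log when no kind is given)'
  task :tail, [:kind] do |_task, args|
    on roles(:app) do
      # Map the optional 'kind' argument to a file name (these names are assumed):
      file = case args[:kind]
             when 'api'       then 'api_production.log'
             when 'api_audit' then 'api_audit.log'
             else                  'production.log'
             end
      # 'tail -f' keeps following the file until CTRL-C is pressed:
      execute :tail, '-f', "#{shared_path}/log/#{file}"
    end
  end
end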

status

Output a status report for the remote server.

# Report the overall remote server status, including all steps below:
$> cap <STAGE_NAME> status

# Check remote Docker status:
$> cap <STAGE_NAME> status:docker

# Check status of the remote mail queue:
$> cap <STAGE_NAME> status:mailq

# Check remote memory & disk status:
$> cap <STAGE_NAME> status:mem

# Check remote Monit report:
$> cap <STAGE_NAME> status:monit
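
Under the hood, each check presumably boils down to running a plain shell command on the server and printing its output on the local console. The commands below (docker ps, mailq, free/df, monit status) are assumptions, shown only as a minimal sketch of how such a composite status task could be structured:

namespace :status do
  desc 'Check remote Docker status'
  task :docker do
    on roles(:app) { puts capture(:docker, :ps) }
  end

  desc 'Check status of the remote mail queue'
  task :mailq do
    on roles(:app) { puts capture(:mailq) }
  end

  desc 'Check remote memory & disk status'
  task :mem do
    on roles(:app) do
      puts capture(:free, '-h')   # memory usage
      puts capture(:df, '-h')     # disk usage
    end
  end

  desc 'Check remote Monit report'
  task :monit do
    # Monit usually requires elevated rights to query its status:
    on roles(:app) { puts capture(:sudo, :monit, :status) }
  end
end

desc 'Report the overall remote server status, running all checks above'
task status: %w[status:docker status:mailq status:mem status:monit]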

maintenance

For more info about the logic & static Maintenance modes, check out the Wiki page about Maintenance.

# Toggle on logic maintenance:
$> cap <STAGE_NAME> maintenance:on

# Toggle off logic maintenance:
$> cap <STAGE_NAME> maintenance:off

# Toggle on static maintenance site:
$> cap <STAGE_NAME> maintenance:site

# Toggle off static maintenance site:
$> cap <STAGE_NAME> maintenance:site[off]
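
The actual toggle mechanisms are described on the Maintenance Wiki page. Purely as an illustration, a file-based static maintenance toggle could look like the sketch below; the maintenance.html path, the local source file and the "serve the page whenever it exists" convention are assumptions, not taken from this page:

namespace :maintenance do
  desc 'Toggle the static maintenance site on, or off with [off]'
  task :site, [:mode] do |_task, args|
    on roles(:web) do
      # Assumption: the web server is configured to serve this page whenever it exists.
      target = "#{shared_path}/public/system/maintenance.html"
      if args[:mode] == 'off'
        execute :rm, '-f', target                  # remove the page => back online
      else
        upload! 'public/maintenance.html', target  # publish the static page (local file assumed)
      end
    end
  end
end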

db

DB-related tasks.

db:dump[action]

Runs the 'db:dump' task remotely for the specified deploy stage and optionally downloads or uploads the dump file.

Three possible actions:

  1. [nil] => Run the remote task, creating backups/<STAGE_NAME>.sql.bz2:

    $> cap <STAGE_NAME> db:dump

  2. [get] => Download the remote dump file into db/dump/<STAGE_NAME>.sql.bz2, or db/dump/<FILENAME_OVERRIDE> if the <FILENAME_OVERRIDE> is set:

    $> cap <STAGE_NAME> db:dump[get]

    # Or (with filename override):
    $> cap <STAGE_NAME> db:dump[get=<FILENAME_OVERRIDE>]

  3. [put] => Upload the local dump file as backups/<STAGE_NAME>.sql.bz2, or backups/<FILENAME_OVERRIDE> if the <FILENAME_OVERRIDE> is set:

    $> cap <STAGE_NAME> db:dump[put]

    # Or (with filename override):
    $> cap <STAGE_NAME> db:dump[put=<FILENAME_OVERRIDE>]
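
A minimal sketch of how a db:dump task along these lines could be implemented with the Capistrano DSL; the database name, the credential handling and the exact backups location are assumptions, not taken from this page:

namespace :db do
  desc "Dump the remote DB; use [get]/[put] to download/upload, '=NAME' to override the file name"
  task :dump, [:action] do |_task, args|
    action, override = args[:action].to_s.split('=')
    dump_name   = override || "#{fetch(:stage)}.sql.bz2"
    remote_file = "#{deploy_path}/backups/#{dump_name}"   # assumed backups location
    local_file  = "db/dump/#{dump_name}"

    on roles(:db) do
      case action
      when 'get' then download! remote_file, local_file   # fetch the existing remote dump
      when 'put' then upload!   local_file, remote_file   # push a local dump to the server
      else
        # Default: create the compressed dump on the server
        # (credentials assumed to come from the remote ~/.my.cnf):
        execute "mysqldump #{fetch(:stage)} | bzip2 -c > #{remote_file}"
      end
    end
  end
end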

db:exec[source_folder]

Loops over all the .sql files found in the specified directory, uploading and executing them on the remote host one by one, in alphabetical order, using the corresponding database.

When run successfully, each uploaded .sql file is consumed (deleted). Errors and warnings are intercepted and shown on the console; in case of errors, the loop is halted.

The source folder files are always left untouched (never moved nor consumed).

A source folder override can be specified as an additional argument to the task. The mysql client must be installed on the host.

Defaults:

  • source folder => '<RAILS_ROOT>/tmp' (on localhost)
  • dest. folder => '/tmp' (on the remote host; erased afterwards on success)

Usage:

$> cap <STAGE_NAME> db:exec

This will upload all the local tmp/*.sql files to the remote host and execute them one by one on the <STAGE_NAME> database. Each destination file is then deleted after execution, unless an error occurs.

To upload all the tmp/diff/*.sql files instead, use the source folder override like this:

$> cap <STAGE_NAME> db:exec[tmp/diff]
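
For reference, the loop described above could be sketched roughly like this with the Capistrano DSL; the exact mysql invocation and the database name are assumptions, not taken from this page:

namespace :db do
  desc 'Upload and execute every .sql file from a local folder (default: tmp)'
  task :exec, [:source_folder] do |_task, args|
    source = args[:source_folder] || 'tmp'

    on roles(:db) do
      Dir.glob("#{source}/*.sql").sort.each do |local_sql|
        remote_sql = "/tmp/#{File.basename(local_sql)}"
        upload! local_sql, remote_sql                      # the local copy stays untouched
        # Any error raised here halts the whole loop:
        execute "mysql #{fetch(:stage)} < #{remote_sql}"   # DB name assumed = stage name
        execute :rm, '-f', remote_sql                      # consume the uploaded copy
      end
    end
  end
end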