Moving rundeck from one server to another

The Rundeck people have just released a new version, 1.5. Upgrading to it is not as simple as usual (yum update or apt-get upgrade) because the database schema has changed, which is why they recommend following the backup/recovery procedure to upgrade.

I had been meaning to move our rundeck service to another server with more resources for a while, but never found the time. This upgrade was the perfect excuse, so I finally did it, and this post explains the steps for moving rundeck.

First we locate the five parts we want to move:
– Rundeck configuration
– SSH keys (the projects’ keys and the rundeck user’s known_hosts)
– Project definitions
– Job definitions
– Execution logs

The Rundeck configuration lives in /etc/rundeck/. To find the project definitions we look in /etc/rundeck/project.properties for the project.dir value (the default is /var/rundeck/projects/). The path to each project’s ssh key is in the etc/project.properties file inside each project directory, in the project.ssh-keypath value. The job definitions live in the database, and the path to the execution logs is in /etc/rundeck/framework.properties, in the framework.logs.dir value (usually /var/lib/rundeck/logs).
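
A quick way to check all these values on the old server is something along these lines (property names as described above; adjust the paths if your install uses different defaults):

grep project.dir /etc/rundeck/project.properties
grep project.ssh-keypath /var/rundeck/projects/*/etc/project.properties
grep framework.logs.dir /etc/rundeck/framework.properties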

Once we’ve located everything, we can build “the package” we will move from one server to the other. We start with the plain files (rundeck configuration, project definitions and execution logs):


mkdir rundeck-backup
cp -a /etc/rundeck/ rundeck-backup/
cp -a /var/rundeck/projects/ rundeck-backup
cp -a /var/lib/rundeck/logs/ rundeck-backup
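
At this point the backup directory should hold three subdirectories mirroring the paths above; a quick look doesn’t hurt:

ls rundeck-backup    # should list: logs  projects  rundeck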

To copy the projects’ ssh keys we check each project directory’s project.properties for the key path and copy the key it points to. The projects may or may not share a key, and the keys may or may not have the same filename. That’s why we save each key inside its own project directory:


for project in rundeck-backup/projects/*; do cp $(grep project.ssh-keypath $project/etc/project.properties | cut -d"=" -f 2) $project; done

To extract the job definitions, we call rd-jobs list for each project, which exports them as xml:


for project in rundeck-backup/projects/*; do rd-jobs list -f rundeck-backup/$(basename $project).xml -p $(basename $project); done
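
As a quick sanity check, there should now be one xml file per project in the backup directory:

ls rundeck-backup/*.xml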

It’s also worth keeping the known_hosts file of the rundeck user:

cp $(getent passwd rundeck | cut -d":" -f6)/.ssh/known_hosts rundeck-backup

We now have a package with a full backup of our installation, so we send the rundeck-backup directory to the new server (I know, it’s obvious, but there you go :P):

scp -r rundeck-backup user@newserver:.

Now we ssh into the new server. We assume rundeck is already installed there (if not, that was covered in an older post), so we just need to put the files where they belong. First the keys:


for project in rundeck-backup/projects/*; do filename=$(grep project.ssh-keypath $project/etc/project.properties | cut -d"=" -f 2); cp $project/$(basename $filename) $filename; done
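
A quick way to confirm every key ended up where its project expects it:

for project in rundeck-backup/projects/*; do ls -l $(grep project.ssh-keypath $project/etc/project.properties | cut -d"=" -f 2); done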

Then the rest of the files:

cp -a rundeck-backup/rundeck/ /etc/
cp -a rundeck-backup/projects/ /var/rundeck/
cp -a rundeck-backup/logs/ /var/lib/rundeck/
cp rundeck-backup/known_hosts $(getent passwd rundeck | cut -d":" -f6)/.ssh/known_hosts

Now we have the rundeck configuration and the project definitions, but the jobs are still missing. We should keep in mind that the old server is still running, and we don’t want our jobs executed twice at the same time. We also don’t want to disable the old server until we’re sure the new one runs fine, because we don’t want to miss any execution. To achieve both, we make the new server fake the executions without really running anything, by changing the service.NodeExecutor.default.provider value in /var/rundeck/projects/$PROJECT/etc/project.properties from jsch-ssh to stub. In a single line:

sed -i -e 's/jsch-ssh/stub/g' /var/rundeck/projects/*/etc/project.properties
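
Before going on, it doesn’t hurt to confirm that every project really ended up with the stub executor:

grep service.NodeExecutor.default.provider /var/rundeck/projects/*/etc/project.properties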

Now we are sure no job will run until we say so, and we can import the jobs without risk:

for project in rundeck-backup/projects/*; do rd-jobs load -f rundeck-backup/$(basename $project).xml -p $(basename $project); done
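
To confirm the import, listing the jobs of each project on the new server should show the same jobs we exported from the old one:

for project in /var/rundeck/projects/*; do rd-jobs list -p $(basename $project); done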

With the jobs loaded we have everything we need. Now we can log into the web interface and check that everything is fine: users can access their projects, jobs are correctly configured, and so on. When we are sure, we can move one project at a time (or all of them at once, as you wish) just by changing the same value again (service.NodeExecutor.default.provider): on the old server we change “jsch-ssh” to “stub”, and on the new server we go the other way around, from “stub” to “jsch-ssh”. By playing with these values we know that if we find a problem with a project, we can move it (or all of them, just to be sure) back to the old server while we fix it.
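
For a single project the switch is just the same sed in both directions; for example, for a hypothetical project called myproject:

sed -i -e 's/stub/jsch-ssh/g' /var/rundeck/projects/myproject/etc/project.properties   # on the new server
sed -i -e 's/jsch-ssh/stub/g' /var/rundeck/projects/myproject/etc/project.properties   # on the old server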

And that’s it! Now we could change the DNS to keep the old rundeck URL, but that’s your choice.