(tl;dr: if you’re familiar with WP CLI and shell scripts, feel free to jump to the final code.)

Frontend web development can be complicated enough, so I don’t like to add any complexity to the process by having to learn system administration: setting up and maintaining a development server, or anything like that. Because I work from home, in the office, and sometimes remotely in the UK, I prefer to develop projects on my own computer or laptop. This ensures that I can work just as efficiently if I’m using an unstable or slow internet connection.

Using LocalWP as a development environment

In order to make the development process as easy as possible, I use the downloadable app from localwp.com. Once installed, this app allows me to create new WordPress websites on my own computer in less than a minute, at the click of a button.

The app provides a local server in which I can run multiple websites at the same time, with features like local SSH access to the web server, local https, Adminer-based database management, Instant Reload during development, live links so that people elsewhere can connect when I need them to, and so on. (A full list of features is on the LocalWP website.)

The app also allows you to mirror websites to and from live servers, as long as the hosting provider (currently Flywheel and WP Engine) is supported.

Making a local copy of the live site

Since our sites are mainly hosted on Cyon and Hostpoint servers, I can’t use the mirroring abilities of LocalWP and have to get the content and files from the live site in a different way.

There are several WordPress plugins which allow me to clone a remote site to my local machine. I’d log into the live server, create a backup copy of the live site, download it, log into the local site, import it and unpack it. This is fine, but can take a while to complete.

For larger sites, creating a ZIP or GZIP file of the entire site can potentially overload the server: both in terms of memory and hard drive space.

Shell scripts to the rescue

The alternative method, which I use to get the current version of the live website, is a “shell script”. This is a piece of code saved to a file on my own computer which I can run using the command line.

This may be new territory for many developers, but by the time you reach the end of this post, I hope that you’ll have two short scripts which you can (pretty much) copy and tweak for each project you work on. Once you have the scripts set up, updating your local development environment can be achieved by running a single command.

As an example of a shell script, the following command connects to the live server and tells it to update the theme using the current version of our theme from Github. I can run this from the command line.

ssh USER@SERVER -A "cd ~/WEBROOT/wp-content/themes/sht-theme-name && git pull"

The part of the command between the double quotes is executed on the live server. The && is an ‘and’ (or ‘then’) concatenator, which allows you to run multiple commands sequentially.
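
As a quick illustration of that behaviour (a toy example, not part of the project scripts): the command after && only runs if the one before it succeeds.

mkdir -p demo && cd demo                 # cd only runs if mkdir succeeded
cd /does-not-exist && echo "skipped"     # echo never runs, because cd fails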

I have this command saved as the file updatelivetheme.sh in the webroot of my local copy of the site. I use it by right-clicking on the project in LocalWP, choosing Open Site Shell (which opens a terminal session for the local site), then calling sh updatelivetheme.sh in the webroot folder.

#!/bin/sh
ssh USER@SERVER -A "cd ~/WEBROOT/wp-content/themes/sht-theme-name && git pull"

The first line of the file is the “shebang”, which tells the operating system which interpreter should be used to run the file. In practice, it hasn’t proved essential when I’ve run scripts like this on my Mac.

I don’t need to enter any passwords, because our hosting is set up with SSH keys which allow a passwordless connection to the server, and which also allow the server to connect to Github without a password. If you haven’t set up SSH keys, you will be prompted for passwords at the appropriate moment.
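
If you haven’t set them up yet, the standard tools for this are ssh-keygen and ssh-copy-id (a minimal sketch, assuming a Mac or Linux machine; replace USER and SERVER with your own values as before):

ssh-keygen -t ed25519 -C "you@example.org"   # generate a key pair, if you don't already have one
ssh-copy-id USER@SERVER                      # copy the public key to the live server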

Using WP CLI

The script above uses regular commands available on most computers. However, we also have the WordPress command line interface (WP CLI) installed on our server. For example, I can create a backup of the database on the live server by running a shell script on my own computer.

ssh USER@SERVER -A "cd ~/WEBROOT/ && mkdir -p ./wp-content/backups/ && ~/bin/wp db export - | gzip > ./wp-content/backups/backup-$(date +%Y-%m-%d--%H-%M-%S).sql.gz"

If I want to save that backup on my own computer and not on the live server, then the script gets a little easier.

ssh USER@SERVER -A "cd ~/WEBROOT/ && wp db export - | gzip" > ./backup-$(date +%Y-%m-%d--%H-%M-%S).sql.gz

Here, the code in double quotes is run on the remote server, and its output (the gzipped database) is streamed back through ssh to my computer. The > character then redirects that output into a file on my own machine: it is saved in the current directory, with the current date and time appended to the file name.
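
As a minimal illustration of that split (a toy example, not specific to WordPress): the quoted command runs on the server, while the redirect is handled by my own shell.

ssh USER@SERVER "hostname" > remote-hostname.txt   # hostname runs remotely; the file is written locally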

Export the live database and import it to the local database

Now that I have a script which gets me the live database, I want to import the contents of the file into the local development site. This means that the local database will be completely replaced – including site settings, users and their passwords – with the database from the live server.
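
If you’d like a safety net before the local database is overwritten, one option (my own suggestion, not part of the original workflow) is to export the existing local database first from the Open Site Shell terminal:

wp db export local-safety-backup-$(date +%Y-%m-%d--%H-%M-%S).sql   # snapshot of the local database before it's replaced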

Save this code as fromlive.sh in the webroot of your project on your local computer. Then call it using sh fromlive.sh in the terminal you opened (Open Site Shell) from LocalWP. (You will need to replace the placeholders USER, SERVER and WEBROOT using the real values for your own server.)

#!/bin/sh
DATE_FORMAT_STRING=$(date +%Y-%m-%d--%H-%M-%S)

ssh USER@SERVER -A "cd ~/WEBROOT/ && wp db export - | gzip" > backup-$DATE_FORMAT_STRING.sql.gz && gunzip backup-$DATE_FORMAT_STRING.sql.gz && wp db import backup-$DATE_FORMAT_STRING.sql && wp search-replace example.org example.local && rm backup-$DATE_FORMAT_STRING.sql

That’s some heavy concatenation, so I’ll explain each bit.

  1. Create a variable containing the current date and time, which will be re-used throughout the script.
DATE_FORMAT_STRING=$(date +%Y-%m-%d--%H-%M-%S)
  2. Connect to the remote server, export the database to a GZIPped file and download it to my computer.
ssh USER@SERVER -A "cd ~/WEBROOT/ && wp db export - | gzip" > backup-$DATE_FORMAT_STRING.sql.gz
  3. Unzip the file. (This will create the raw SQL file and delete the sql.gz file.)
gunzip backup-$DATE_FORMAT_STRING.sql.gz
  4. Import it to the local database using WP CLI. (This is installed by localwp.com.)
wp db import backup-$DATE_FORMAT_STRING.sql
  5. Search the entire database and replace all instances of the live domain name with the local domain name.
wp search-replace example.org example.local
  6. Finally, remove the sql file (to keep your webroot tidy). This isn’t essential.
rm backup-$DATE_FORMAT_STRING.sql
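
As an aside, if the long && chain becomes hard to read, the same stop-on-first-error behaviour can be approximated with set -e and one command per line. This is just a sketch of an alternative layout, not the original script:

#!/bin/sh
set -e   # abort the script as soon as any command fails

DATE_FORMAT_STRING=$(date +%Y-%m-%d--%H-%M-%S)

ssh USER@SERVER -A "cd ~/WEBROOT/ && wp db export - | gzip" > backup-$DATE_FORMAT_STRING.sql.gz
gunzip backup-$DATE_FORMAT_STRING.sql.gz
wp db import backup-$DATE_FORMAT_STRING.sql
wp search-replace example.org example.local
rm backup-$DATE_FORMAT_STRING.sql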

Getting changed files

Now you have copied the database from the live website to your local website, you’ll need to get all of the files too.

By copying down all of the files from the webroot, you’ll end up overwriting the configuration of your local site. (That’ll break it.) So only copy the contents of the wp-content folder. In a normal installation, the files outside wp-content are all provided by WordPress core, which you already have.

If you’ve already been working on the site locally, you won’t want to copy down everything, just the files which have changed. To do this, you can either use an FTP program (I recommend the sync function in Transmit) or the command line.

The easiest method is to stick to a single solution for this process and add the file synchronisation to your shell script. You can achieve this with the rsync command, which compares the timestamps and sizes of the files on the server with the files you already have, then only copies down the files which have changed.

Creating a shell script for file synchronisation

Save the following script as getfiles.sh in the webroot of your local website, replace the placeholders USER, SERVER and WEBROOT using the real values for your own server, then run sh getfiles.sh on the command line.

Note that the trailing slash on each source path (and its omission on the destination path) is deliberate and essential: with the trailing slash, rsync copies the contents of the remote folder into the local folder; without it, you would end up with a nested folder such as wp-content/uploads/uploads.

#!/bin/sh
rsync -azP -e "ssh" USER@SERVER:WEBROOT/wp-content/uploads/ ./wp-content/uploads
rsync -azP -e "ssh" USER@SERVER:WEBROOT/wp-content/languages/ ./wp-content/languages
rsync -azP -e "ssh" USER@SERVER:WEBROOT/wp-content/plugins/ ./wp-content/plugins

The options -azP passed to the rsync command are -a (archive mode, which recurses into directories and preserves permissions, timestamps and symlinks), -z (compress file data during transfer) and -P (capital P, which combines --partial (keep partially transferred files, so that large transfers can be resumed) and --progress (show progress while the transfer is running)). The -e "ssh" option tells rsync to connect to the remote server over SSH.
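
If you want to preview what would be transferred before copying anything, rsync’s standard -n (--dry-run) flag can be added; here it’s shown with the uploads folder as an example.

rsync -azPn -e "ssh" USER@SERVER:WEBROOT/wp-content/uploads/ ./wp-content/uploads   # lists what would change, copies nothing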

Combining everything

You now have the scripts fromlive.sh and getfiles.sh in the webroot of your local site. Because you can concatenate commands (as we’ve seen in the examples above), you can run sh fromlive.sh && sh getfiles.sh on the command line. Or you can combine the two into one script: get_live.sh.

#!/bin/sh

DATE_FORMAT_STRING=$(date +%Y-%m-%d--%H-%M-%S)

# Get database
ssh USER@SERVER -A "cd ~/WEBROOT/ && wp db export - | gzip" > backup-$DATE_FORMAT_STRING.sql.gz && gunzip backup-$DATE_FORMAT_STRING.sql.gz && wp db import backup-$DATE_FORMAT_STRING.sql && wp search-replace example.org example.local && rm backup-$DATE_FORMAT_STRING.sql

# Get Uploads
rsync -azP -e "ssh" USER@SERVER:WEBROOT/wp-content/uploads/ ./wp-content/uploads

# Get Language files
rsync -azP -e "ssh" USER@SERVER:WEBROOT/wp-content/languages/ ./wp-content/languages

# Get plugins
rsync -azP -e "ssh" USER@SERVER:WEBROOT/wp-content/plugins/ ./wp-content/plugins

# Get themes - ONLY IF YOU'RE NOT USING e.g. GIT VERSION CONTROL
rsync -azP -e "ssh" USER@SERVER:WEBROOT/wp-content/themes/ ./wp-content/themes
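
If you reuse this on several projects, one possible refinement (my own sketch, not part of the original scripts) is to collect the placeholders as variables at the top of get_live.sh, so that only the first few lines need changing per project:

#!/bin/sh
# Sketch only: the variable names are my own suggestion. Adjust the values per project.
REMOTE="USER@SERVER"
WEBROOT="WEBROOT"
LIVE_DOMAIN="example.org"
LOCAL_DOMAIN="example.local"
DATE_FORMAT_STRING=$(date +%Y-%m-%d--%H-%M-%S)

# Get database
ssh $REMOTE -A "cd ~/$WEBROOT/ && wp db export - | gzip" > backup-$DATE_FORMAT_STRING.sql.gz
gunzip backup-$DATE_FORMAT_STRING.sql.gz
wp db import backup-$DATE_FORMAT_STRING.sql
wp search-replace $LIVE_DOMAIN $LOCAL_DOMAIN
rm backup-$DATE_FORMAT_STRING.sql

# Get files
rsync -azP -e "ssh" $REMOTE:$WEBROOT/wp-content/uploads/ ./wp-content/uploads
rsync -azP -e "ssh" $REMOTE:$WEBROOT/wp-content/languages/ ./wp-content/languages
rsync -azP -e "ssh" $REMOTE:$WEBROOT/wp-content/plugins/ ./wp-content/plugins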
