This is Part 2 of a multi-part series of posts discussing how I test new major releases of WordPress before upgrading my “live” sites. The first step got us rolling with a XAMPP install. This step will tackle copying the database from our live site to our local test site. Subsequent articles will cover copying WordPress itself, plugins, and themes, and then upgrading.
Now that we have our environment prepared, our next step is to get a copy of our live blog running in our little test environment. I figure it makes the most sense to get the database data copied over first; we’ll tackle the actual site code in a subsequent step. That site code won’t show us our site without the data, right?
There are many different ways to copy data from one database to another. Traditionally I do this via phpMyAdmin, but I’ve recently realized not all hosts offer it, and some people just aren’t really comfortable with it. As a result, I’ve opted to tackle this from a slightly different angle. Hopefully people will find it a bit simpler.
(but feel free to grab your live site’s backup via a phpMyAdmin export, or via mysqldump if you have shell access, if you’re more comfortable going that route)
Live Database Backup
First we need a copy of the data in your live blog’s database.
Get a current copy of the WordPress Database Backup plugin. Follow the directions and install it. Visit your live blog’s Plugins page and activate it.
Now visit Manage -> Backup. At the top of the page you’ll see two columns. The left column lists all the “core” tables to be backed up. You definitely want these; they’re the tables that come with WordPress and make it all work.
The right column lists additional tables found in your database. In most cases, these tables come from plugins you’re running (or have run in the past), or perhaps you have other applications or blogs sharing that same database.
How do you decide what to grab from that right column? Well, in most cases you can start by ignoring any tables that don’t share the table prefix you see in the left column. For instance, all my left-column tables start with “wp_”, so I’m not going to grab anything from the right list that doesn’t start the same way.
In other cases you can infer the plugin name from the table name. For instance, wp_bad_behavior is definitely from the Bad Behavior plugin I run. If you realize a table belongs to a plugin you no longer use, don’t bother with it. But when in doubt, check the box; it won’t hurt to copy more than you actually need. (Granted, copying “log” tables just wastes space, but I’m trying to keep this simple! They won’t hurt anything.)
So, check all the tables on the right that you believe are in use. Here’s a sample of mine:
Next, choose the option to email the backup and provide a valid email address.
Click “Backup” and go grab a coffee. Depending on how big your database is, this step may take a while. Just let it spin, don’t click away — it’ll tell you when the backup is complete.
Unzipping the Backup
You’ll notice the backup you receive via email has a .gz extension. That means the file is compressed (with GNU Zip) and you’ll need to uncompress it before you can work with it.
You may need to install additional software to accomplish that. I, personally, suggest 7-Zip. The price is right (free!) and I’ve been using it for quite some time. I’m sure other compression programs can handle the .gz format as well though. If you have a favorite, drop a comment on this article.
Using 7-Zip, I right-click on the gz file and extract it to a new sub-directory.
Regardless of how you get it uncompressed, don’t lose track of where the resulting .sql file ends up; we’ll need it shortly.
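If you’d rather not install anything, most systems with a Unix-style command line can unpack a .gz file directly with the standard gzip tools. A quick sketch (the filename here is just a placeholder for whatever the plugin emailed you):

```shell
# Decompress the emailed backup; "backup.sql.gz" is a placeholder name.
# gunzip removes the .gz file and leaves backup.sql in its place.
gunzip backup.sql.gz
```

Windows users without such a shell are probably better off sticking with 7-Zip.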
Create the New Test Database
Now, finally, we get to play with our local test environment!
Start by pointing your browser to http://localhost/phpmyadmin/. This will load the local copy of phpMyAdmin included with the XAMPP installation. We’ll be using it to create a new database to receive your live site’s backup.
Tip: Remember to start Apache and MySQL! You’ll start them via the XAMPP control panel, as discussed towards the end of the first article in this series.
(trust me, I tend to forget and wonder why nothing happens when I try to get to my local phpMyAdmin…)
Start by creating a new database. This can be accomplished from the main phpMyAdmin page:
See it circled in red? Give the database a name and then click “Create.” The subsequent page should let you know that everything is groovy. Don’t forget that name.
On that same page, you’ll see a row of tabs across the top. Click the “Import” tab. On the Import page click the “Browse” button and locate that .sql file we unzipped a couple minutes ago.
Click “Go” and let it rip! Depending on your computer speed and the file size, this could take a while…
Assuming it all works, we’re done with this step of our journey. We have data!
Big Data Issues
If you have a large export (.sql) file, you may have problems with the import. You’ll notice that the default max upload size in phpMyAdmin for importing is 16MiB:
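Until that limit gets sorted out, one workaround is to cut the export into pieces small enough to upload, then import them in order. Here’s a rough sketch using the standard split tool, assuming each SQL statement in the dump sits on its own line (typical of these backups, but worth eyeballing first; backup.sql is a placeholder filename):

```shell
# Split backup.sql into pieces of at most 10 MB each, cutting only at
# line boundaries so no SQL statement gets chopped in half.
split -C 10m backup.sql part_
# Produces part_aa, part_ab, ... import each via phpMyAdmin, in order.
```

If you’re comfortable at the command line, you can also sidestep the upload limit entirely by piping the whole file into MySQL, e.g. `mysql -u root your_database_name < backup.sql` (XAMPP’s MySQL root user ships with no password by default).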
I have to confess, I’m not yet sure how to increase that! Looking in the php.ini file, it doesn’t seem to be any of these:
memory_limit = 32M (default)
post_max_size = 32M (this defaulted to 16M; I raised it)
upload_max_filesize = 32M (default)
If anyone knows how to increase it, leave a comment, please? My initial web search suffered from weak search-fu and was unsatisfying.
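For what it’s worth, phpMyAdmin normally derives its import ceiling from the smaller of PHP’s upload_max_filesize and post_max_size values, so if raising them doesn’t budge the number, the usual suspects are editing a php.ini that Apache isn’t actually loading (XAMPP ships more than one copy) or forgetting to restart Apache afterward. A sketch of the relevant directives, with illustrative values:

```ini
; php.ini -- check phpinfo() for the "Loaded Configuration File" to be
; sure you are editing the copy Apache actually reads.
upload_max_filesize = 64M   ; illustrative value, not a recommendation
post_max_size = 64M         ; should be at least as large as the upload size
; Restart Apache from the XAMPP control panel for changes to take effect.
```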
In the next article we’ll run through copying down the live blog’s files, themes, and plugins, and hooking it all up. When complete, we should have something exactly the same as (or darned close to) our live blog, running in our test environment. From there, we try an upgrade!