SiteSucker review

In an age when websites are either powered by content management systems or hopelessly out of date, preserving offline versions of content has become difficult. When websites were static, it was easy to pop a collection of pages onto a disk for offline browsing or for presentation purposes. That's rarely possible now that sites are assembled on the fly from templates and data stored on servers.

This makes SiteSucker, a Mac utility with a decade's worth of development behind it, more relevant now than ever. There are few offline site downloaders for the Mac. Perhaps that's because SiteSucker does the job so well, with a simple, streamlined interface that makes backing up any website a matter of a couple of clicks.

In most cases, all you have to do is enter the root web address of the site you want to save locally. Click a button and SiteSucker does the rest, processing every link and working out which ones are local to the site. It can be configured to pull down just the pages in a hierarchy or to grab everything, including media and images, rewriting links in the code so they work in the new local copy.
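To make that behaviour concrete, here's a minimal Python sketch of what a one-click downloader like this does under the hood: crawl from a root URL, follow only same-host links, save each page, and rewrite references so the copy browses offline. It's an illustration of the technique, not SiteSucker's actual code; the naive string rewrite and text-only saving are simplifications.

```python
# Minimal crawl-and-rewrite illustration, not SiteSucker's actual code.
import os
import urllib.request
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkExtractor(HTMLParser):
    """Collect href/src attribute values from a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name in ("href", "src") and value:
                self.links.append(value)

def local_path(url, out_dir="site_copy"):
    """Map a URL onto a file path; directory URLs become index.html."""
    path = urlparse(url).path or "/"
    if path.endswith("/"):
        path += "index.html"
    return os.path.join(out_dir, path.lstrip("/"))

def crawl(root):
    host = urlparse(root).netloc
    queue, seen = [root], {root}
    while queue:
        url = queue.pop()
        # Text assets only in this sketch; a real tool handles binary media.
        html = urllib.request.urlopen(url).read().decode("utf-8", "replace")
        parser = LinkExtractor()
        parser.feed(html)
        for link in parser.links:
            if link.startswith("#"):
                continue  # skip in-page anchors
            absolute = urljoin(url, link)
            if urlparse(absolute).netloc != host:
                continue  # off-site link: leave it alone, don't follow it
            if absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)
            # Rewrite the reference so it resolves inside the local copy
            # (naive string replace; real tools rewrite the parsed markup).
            rel = os.path.relpath(local_path(absolute),
                                  os.path.dirname(local_path(url)))
            html = html.replace(link, rel)
        dest = local_path(url)
        os.makedirs(os.path.dirname(dest), exist_ok=True)
        with open(dest, "w", encoding="utf-8") as f:
            f.write(html)

crawl("https://example.com/")
```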

A problem with other site download tools is that they often get stuck when robots.txt contains directives preventing search engines from spidering through pages. Thankfully that's no issue for SiteSucker, which can be configured to ignore robot directives.
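For comparison, here's how that robots.txt check looks with Python's standard urllib.robotparser. The IGNORE_ROBOTS flag is our own illustrative stand-in for the preference SiteSucker exposes, not its real setting name.

```python
# Sketch of the robots.txt check a crawler performs before each request.
from urllib import robotparser

rp = robotparser.RobotFileParser("https://example.com/robots.txt")
rp.read()  # fetch and parse the site's robots.txt

# Illustrative stand-in for an "ignore robot directives" preference.
IGNORE_ROBOTS = True

def allowed(url, agent="*"):
    """A polite crawler asks first; the override skips the check entirely."""
    return IGNORE_ROBOTS or rp.can_fetch(agent, url)

print(allowed("https://example.com/private/page.html"))  # True with override
```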

Keep a snapshot of your site on disk and browse it offline, with SiteSucker

OUR VERDICT

Is it perfect? In our testing we found that, by default, it grabs files you don't need. Point it at a WordPress blog, for example, and it downloads all the PHP support files as well as your content. Still, if you're looking for a quick, easy way to back up sites locally, it's the best thing currently available.
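If you're curious what pruning those extras amounts to, here's a hedged sketch of a path filter. The WordPress directory names are illustrative assumptions about a typical install, not settings SiteSucker ships with.

```python
# Illustrative path filter for skipping server-side support files.
from urllib.parse import urlparse

SKIP_PREFIXES = ("/wp-includes/", "/wp-admin/")  # assumed WordPress layout

def wanted(url):
    """Keep content and media; drop known support directories."""
    return not urlparse(url).path.startswith(SKIP_PREFIXES)

print(wanted("https://blog.example.com/wp-includes/js/jquery.js"))  # False
print(wanted("https://blog.example.com/2024/my-post/"))             # True
```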
