You are going to need a script which parses the gallery pages and then feeds the image URLs to wget. For this particular site the script can be pretty straightforward, something like this:
#!/bin/bash
# Fetch the gallery index and pull out the unique album links
wget -qO - "http://www.zodiackillerfacts.com/gallery/" | \
egrep -o 'thumbnails\.php\?album=[0-9]+' | \
sort -u | \
while read -r gallery
do
    # Fetch the album page and build a directory name from its <title>
    # (the "-" must be last in the bracket expression to be literal)
    wget -qO "/tmp/$$" "http://www.zodiackillerfacts.com/gallery/$gallery"
    album=$(egrep -m1 -o '<title>[^<]+' "/tmp/$$" | \
            sed -e 's/^<title>//' -e 's/[^a-zA-Z0-9 :()-]//g')
    mkdir "$album" || continue
    cd "$album" || continue
    # Rewrite each thumbnail reference into a full-size image URL
    # and hand the list to wget on stdin
    egrep -o 'src="albums/[^"]*' "/tmp/$$" | \
    sed -e 's/thumb_//' \
        -e 's!^src="!http://www.zodiackillerfacts.com/gallery/!' | \
    wget -i -
    cd ..
    rm "/tmp/$$"
done
Here, we fetch the HTML of the front page, parse out the album links, fetch the HTML for each album, create a directory named after the album's title, and download all of the full-size images into it. Not very pretty or robust, but it seems to do the job.
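If you want to sanity-check the two sed transformations without hitting the site, you can feed them made-up sample lines. The thumbnail path and page title below are invented for illustration; also note that in the title-sanitizing bracket expression the `-` has to come last to be treated as a literal hyphen rather than a range:

```shell
# Sample thumbnail reference, in the shape egrep 'src="albums/[^"]*' emits
# (path and filename are made up for this test).
echo 'src="albums/userpics/10001/thumb_example.jpg' |
sed -e 's/thumb_//' \
    -e 's!^src="!http://www.zodiackillerfacts.com/gallery/!'
# http://www.zodiackillerfacts.com/gallery/albums/userpics/10001/example.jpg

# Sample title match, in the shape egrep -m1 -o '<title>[^<]+' emits.
echo '<title>ZodiacKillerFacts.com Gallery - Crime Scene' |
sed -e 's/^<title>//' -e 's/[^a-zA-Z0-9 :()-]//g'
# ZodiacKillerFactscom Gallery - Crime Scene
```

Running the pipelines on known input like this is a quick way to catch quoting or regex mistakes before pointing the full script at the server.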