
I see no option to export a backup of the settings for a domain.

Maybe I should save the results of public DNS queries with dig, but I wonder whether someone knows a better way.

gpupo

7 Answers


Yes, there is a friendlier way. I suggest using the cli53 tool: https://github.com/barnybug/cli53

After you set it up, just try

cli53 export --full sciworth.com

and you get the exported zone in BIND format.
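The same tool can also restore the zone from that dump. A minimal sketch, assuming cli53 is installed and configured for the target account (check your cli53 version for the exact import flags):

```shell
# Assumption: cli53 is set up with credentials for the target account;
# sciworth.com is the zone from the example above.
cli53 export --full sciworth.com > sciworth.com.zone   # dump to a BIND zone file

# Later, re-import the same file to recreate the records
# (--replace overwrites records that already exist):
cli53 import --file sciworth.com.zone --replace sciworth.com
```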

astlock

No additional software needs to be installed; you only need awscli.

Here is what I just wrote. It is simple and works like a charm.

#!/bin/bash -e
#
#  Author: Peycho Dimitrov
#
#  DESCRIPTION
#
#  Create full backup of all hosted Route53 zones / domains in your account.
#
#  REQUIREMENTS
#
#  Available s3 bucket (where your json files will be saved)
#  awscli (with configured credentials or an IAM role)
#  gzip
#  awk
#
####################################

#  CONFIGURATION

region="us-east-1" # Your aws region
b_route53_tmp="/tmp/r53_backup" # Your temp directory
b_route53_bucket="s3://my-backups/route53" # Your backup folder in s3.

# END OF CONFIGURATION

# Do not edit below unless you know what you're doing! #

mkdir -p $b_route53_tmp
echo "$(date) Backup all Route53 zones and resource records."
p_aws="$(which aws) --region $region"
r53_zones=$($p_aws route53 list-hosted-zones --query '[HostedZones[*].[Id, Name]]' --output text | awk -F'/' '{print $3}')
if [ ! -z "$r53_zones" ]; then
        while read route; do
                zone=$(echo "$route" | awk '{print $1}')
                domain=$(echo "$route" | awk '{print $2}')
                echo "Processing $zone / $domain"
                $p_aws route53 list-resource-record-sets --hosted-zone-id "$zone" --output json > "$b_route53_tmp"/$(date +%Y%m%d%H%M%S)-"$zone"-"$domain"backup.json
        done <<<"$r53_zones"

        echo "Archive json files."
        gzip "$b_route53_tmp"/*backup.json
        echo "Backup $zone / $domain data to $b_route53_bucket/$(date +%Y)/$(date +%m)/$(date +%d)/"
        $p_aws s3 cp "$b_route53_tmp"/ $b_route53_bucket/$(date +%Y)/$(date +%m)/$(date +%d)/ --exclude "*" --include "*.gz" --recursive
fi

echo "$(date) Done!"
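The JSON dumps above are only half the story: the output of `aws route53 list-resource-record-sets` cannot be fed straight back into Route 53, because `change-resource-record-sets` expects a ChangeBatch document. A restore sketch, not part of the script above (the sample file and zone id are placeholders):

```shell
# Stand-in for one of the saved *backup.json files:
cat > backup.json <<'EOF'
{"ResourceRecordSets":[{"Name":"www.example.com.","Type":"A","TTL":300,
  "ResourceRecords":[{"Value":"192.0.2.1"}]}]}
EOF

# Wrap every record in an UPSERT change. In practice you would filter out the
# zone's own NS and SOA records first, since Route 53 manages those itself.
jq '{Changes: [.ResourceRecordSets[] | {Action: "UPSERT", ResourceRecordSet: .}]}' \
    backup.json > changebatch.json

# Apply the batch to the target zone (zone id is a placeholder):
# aws route53 change-resource-record-sets --hosted-zone-id ZXXXXXXXXXXXX \
#     --change-batch file://changebatch.json
```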
Peycho Dimitrov

If you want to export to bind format, you can use this script:

#!/bin/bash

zonename=$1
hostedzoneid=$(aws route53 list-hosted-zones | jq -r ".HostedZones[] | select(.Name == \"$zonename.\") | .Id" | cut -d'/' -f3)
aws route53 list-resource-record-sets --hosted-zone-id $hostedzoneid --output json | jq -jr '.ResourceRecordSets[] | "\(.Name) \t\(.TTL) \t\(.Type) \t\(.ResourceRecords[].Value)\n"'
SzTibu

I wrote a quick and simple backup script:

#!/usr/bin/env bash

#set your vars
date=$(date +%j%m%Y)
backup_to="/change/this/to/your/destination/backup/dir"
backup_command_path="/usr/local/bin/cli53"
backup_command="cli53 export"

#check if cli53 exist
if ! [ -f $backup_command_path ]; then
    echo -e " file does not exist\n please install it from https://github.com/barnybug/cli53/releases/latest"
    exit 1
fi

#get list of domains to backup
domains=$(cli53 l | awk '{print $2}'| sed 's/.$//' | sed 's/Nam//')

cd $backup_to

for i in ${domains[@]}; do
    cli53 export $i > $i
done

#list backup directory
echo -e " the following zones were backed up:\n"
ls -l
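To run this unattended, a cron entry can invoke the script nightly; the script path and schedule below are assumptions, adjust them to your setup:

```shell
# m h dom mon dow  command -- nightly Route 53 backup at 02:15 (hypothetical path)
15 2 * * * /usr/local/bin/r53-backup.sh >> /var/log/r53-backup.log 2>&1
```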
kabanus

Based on @SzTibu's answer above, except it shows usage and supports either a zone id or a zone name. This is my favorite because the output is standard old-school BIND format, so other tools can work with it.

#!/bin/bash
# r53_export

usage() {
  local cmd=$(basename "$0")
  echo -e >&2 "\nUsage: $cmd {--id ZONE_ID|--domain ZONE_NAME}\n"
  exit 1
}

while [[ $1 ]]; do
  if   [[ $1 == --id ]];     then shift; zone_id="$1"
  elif [[ $1 == --domain ]]; then shift; zone_name="$1"
  else usage
  fi
  shift
done

if [[ $zone_name ]]; then
  zone_id=$(
    aws route53 list-hosted-zones --output json \
      | jq -r ".HostedZones[] | select(.Name == \"$zone_name.\") | .Id" \
      | head -n1 \
      | cut -d/ -f3
  )
  echo >&2 "+ Found zone id: '$zone_id'"
fi
[[ $zone_id ]] || usage

aws route53 list-resource-record-sets --hosted-zone-id $zone_id --output json \
  | jq -jr '.ResourceRecordSets[] | "\(.Name) \t\(.TTL) \t\(.Type) \t\(.ResourceRecords[]?.Value)\n"'
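The `[]?` in the jq filter is what lets this version survive alias records, which carry an AliasTarget instead of a ResourceRecords array; without the `?`, jq aborts with "Cannot iterate over null". A quick check with made-up sample data:

```shell
# Two sample record sets: one plain A record, one alias record (no ResourceRecords):
cat > sample.json <<'EOF'
{"ResourceRecordSets":[
  {"Name":"www.example.com.","Type":"A","TTL":300,
   "ResourceRecords":[{"Value":"192.0.2.1"}]},
  {"Name":"example.com.","Type":"A",
   "AliasTarget":{"HostedZoneId":"Z2FDTNDATAQYW2",
     "DNSName":"d111111abcdef8.cloudfront.net.","EvaluateTargetHealth":false}}
]}
EOF

# The plain record is printed; the alias record is silently skipped:
jq -jr '.ResourceRecordSets[] | "\(.Name) \t\(.TTL) \t\(.Type) \t\(.ResourceRecords[]?.Value)\n"' sample.json
```

Note that skipped means skipped: alias records need separate handling if you want them to appear in the exported zone file.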
lilole

To export a hosted zone in AWS Route 53, follow these steps (let's say you are using the example.com hosted zone):

Step 1: Installation – pip install route53-transfer

Step 2: Backup the zone to a CSV file:

route53-transfer dump example.com backup.csv

Use - to dump to STDOUT instead of a file:

route53-transfer dump example.com -

Step 3: Restore a zone:

route53-transfer load example.com backup.csv

Use - to load from STDIN instead

Migrate between accounts:

Use command line switches for overriding the access and secret keys:

route53-transfer --access-key-id=ACCOUNT1 --secret-key=SECRET dump example.com
route53-transfer --access-key-id=ACCOUNT2 --secret-key=SECRET load example.com

If you are working with private zones, use --private to distinguish private domains:

route53-transfer --private dump example.com example-private.csv
route53-transfer dump example.com example-public.csv
Dina Kaiser

You can sign up for Cloudflare.com and add a free website.

Cloudflare will scan your DNS as part of its onboarding.

After the import (or maybe during it), under "Advanced" below the DNS records, there is an "Export DNS file" button.

Michael Cole