What is backup mode in Robocopy

19

I am trying to copy a large database backup file over network.

The traditional copy and xcopy programs failed, producing a binary that differed from the source. I also tried robocopy without any parameters, which failed as well. Another attempt with robocopy using the /ZB parameter (restart and backup mode) took much longer but succeeded.
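For reference, the attempts looked roughly like this (the server names, share names and file name below are placeholders, not my actual paths); a binary compare such as fc /b is one way to confirm whether the copy matches the source:

xcopy \\server1\backups\db.bak \\server2\backups\ /Y
robocopy \\server1\backups \\server2\backups db.bak
fc /b \\server1\backups\db.bak \\server2\backups\db.bak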

My question is: is backup mode really designed for copying large files / backup files? I have searched the net and couldn't find a clear answer. I would appreciate it if any experienced user could give me a hint or a better solution.

ydoow

Posted 2015-09-23T00:59:01.180

Reputation: 333

There is very little difference between how xcopy and robocopy actually copy data from A to B, so using robocopy over xcopy won't reduce your chances of accidental data corruption. That means you most likely got a different binary after using xcopy because the DB was modified while it was being copied. The "/Z" flag is of no relevance here (it just lets robocopy skip over the already-copied part of a file when restarting), as is the "/B" flag, which basically controls the permissions the app requests when opening the source file. – Angstrom – 2015-09-23T13:30:07.343

@Angstrom thanks for your reply. The binary difference is not from a changing DB, as it's a static backup file (already detached from the database) being copied. I agree that /ZB doesn't do anything intentionally to make the copy more robust, but the restartable flag seems to produce a more accurate binary write (judging from the fact that it takes much longer and results in an identical copy). Or it could be that restartability makes it possible to rewrite lost data. – ydoow – 2015-09-23T23:19:41.003

Answers

2

Just want to share an update on resolving the issue above.

In my case, xcopy failed to copy a file over 10 GB between servers in different domains and locations.

On the other hand, robocopy with

/zb - Uses Restart mode. If access is denied, this option uses Backup mode.

was able to copy it successfully, although it increased the copy time from 1 hour to 2.5 hours.
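The command that worked was along these lines (the paths and log location below are placeholders, not the actual ones):

robocopy \\server1\backups \\server2\backups db.bak /ZB /V /LOG:C:\logs\dbcopy.log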

--

After rearranging the servers, the file is now copied between servers in the same domain and location, and xcopy works fine there too.

--

So my theory is that this comes down to the stability of the connection between the servers. If the connection is not robust (with occasional drop-outs causing access issues), then when copying a large file like mine, corruption can occur at any point during the long process; robocopy with restart and backup mode can recover the copy pretty well. The time spent on recovery is probably the downside.

And as a side note, FTP should be used instead of a plain copy if this is going to be a routine task.

ydoow

Posted 2015-09-23T00:59:01.180

Reputation: 333

12

Backup mode is a way to read and write files ignoring any permissions problems.

It uses the SeBackupPrivilege (reading) and SeRestorePrivilege (writing) in order to read/write any and all files, disregarding any ACEs that would prevent you from reading or writing a file.

Normally, when trying to copy or access a file, Windows performs a check to make sure you have permission to read from or write to that location, but with SeBackupPrivilege and SeRestorePrivilege (both granted to the Backup Operators and Administrators groups), these checks are bypassed.

To check if your account has these privileges, you can run the command whoami /priv at a command prompt.
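For example, the following filters the output down to the two privileges in question (findstr is just a filter here; the privilege names are the ones backup mode actually uses):

whoami /priv | findstr /i "SeBackupPrivilege SeRestorePrivilege"

Note that under UAC these privileges are normally only present in an elevated command prompt, so robocopy /B generally needs to be run as administrator to have any effect.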

Justin Krejcha

Posted 2015-09-23T00:59:01.180

Reputation: 1 923

Only answer that actually answers the question definitively, thank you. Sources for the information provided would make it perfect. – Hashim – 2019-09-22T01:03:55.530

2

I would strongly suggest that you create a snapshot and back up the now-quiesced file system. You can then run robocopy quickly using /J (unbuffered I/O for large files). Here is a set of scripts for creating a shadow copy of C:, which it exposes as P:. This drive (P:) is a static image of the C: drive, perfect for backups. We use this technique to copy active virtual machine disk images to a backup drive.

The following uses four script files:

  • A batch file to kick off the diskshadow commands
  • Diskshadow commands to destroy any dangling previous shadow left behind if the inner batch file crashed
  • A series of diskshadow commands to create the shadow and expose it as P:
  • A series of commands to execute while the shadow is active (an inner batch file executed while P: is active)

1) The batch file to start the process

diskshadow -s cleanup.cmds
diskshadow -s diskshadow.cmds

2) The shadow command file "cleanup.cmds" to destroy a previously active shadow

UNEXPOSE P:

3) The shadow command file "diskshadow.cmds", which builds the shadow and then calls the fourth file

SET CONTEXT PERSISTENT NOWRITERS
SET METADATA example.cab
SET VERBOSE ON
BEGIN BACKUP
# Add the C: volume to the shadow set and take the snapshot
ADD VOLUME C: ALIAS systemVolumeShadow
CREATE
# Expose the snapshot as drive P:, run the copy script against it, then tear it down
EXPOSE %systemVolumeShadow% P:
EXEC c:\yourlocation\backup.cmd
UNEXPOSE P:
END BACKUP
RESET

4) The command file "backup.cmd" to operate on the shadow

REM do the ROBOCOPY commands here, with the source being P:, the shadow of C:
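REM For example (the source folder, destination share and log path below are placeholders):
ROBOCOPY P:\Data \\backupserver\backups /E /J /R:2 /W:5 /LOG:C:\logs\shadowcopy.log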

Note that Windows Server 2016 (and possibly other versions) runs a scheduled shadow copy twice daily during the week, which will cause the shadow copy created above to fail. Make sure this backup technique doesn't overlap with those automated, scheduled shadow-copy jobs.

erict

Posted 2015-09-23T00:59:01.180

Reputation: 185

0

To expand on @erict's answer, here's how to use PowerShell to create and destroy the snapshot:

$Drive = "D:\"
$Folder = $Drive + "ShadowCopy"

# Create the snapshot
$Snapshot = (gwmi -List Win32_ShadowCopy).Create($Drive, "ClientAccessible")
$Shadow = gwmi Win32_ShadowCopy | ? { $_.ID -eq $Snapshot.ShadowID }
$Volume = $Shadow.DeviceObject + "\"
cmd /c mklink /d "$Folder" "$Volume"
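
# Copy from the snapshot via the $Folder link before destroying it, e.g.
# (the destination share below is a placeholder, not part of the original answer):
#   robocopy $Folder \\backupserver\backups /E /J /R:2 /W:5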

# Destroy the snapshot
cmd /c rd "$Folder"
$Shadow.Delete()

This can also be run on a remote machine by using PowerShell remoting:

Enter-PSSession RemoteComputer

#
# Run snapshot commands here
#

Exit-PSSession

InteXX

Posted 2015-09-23T00:59:01.180

Reputation: 300