Robocopy (Robust File Copy) is a utility included with Windows Server 2008 and later, and available in the NT4, 2000, and 2003 resource kits. Of all the utilities Microsoft has created, Robocopy is my favorite, and I have used it extensively for file migrations and file server consolidations for years, specifically as part of the solution for migrating from Novell to Microsoft. Why is it only part of the solution? Well, Novell and Microsoft permissions don’t really align, but through the use of Novell permission exports, scripting, Robocopy, and icacls.exe you can get pretty close. Why am I writing this blog post when Robocopy has been around so long? Partly as a precursor to the Distributed File System Replication walkthrough I am working on, and partly because every now and then I get customers who have done a file copy migration and somehow ended up with the file and/or share permissions screwed up. So this is a simple walkthrough covering the basic steps of migrating data, file and share permissions, and testing access. Robocopy is my utility of choice.
Here is a scenario that I ran into last week: The customer had a C: and D: partition on a disk. D: was where the data lived, and it was running out of space. The customer purchased new disks and created a RAID array, drive E:. The customer then did a file copy from D: to E: using Windows Explorer and recreated the shares. Users started complaining about permission issues, and the customer’s file copy had almost 6,000 files that errored out during the copy. So this was a multi-issue situation that had to be resolved. While I won’t go through the specifics of everything that had to happen, I will walk through the same scenario done in a more precise and less impactful way.
Step 1. Validate the size of the data being copied. We do this so we know we will have enough space in the destination before we start.
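Robocopy itself can do this size check with a list-only run that totals the data without copying anything. A sketch, assuming D:\Data is the source and C:\empty is a throwaway empty folder used only as a placeholder destination:

```shell
:: /L = list only (no copying), /E = include subfolders, /BYTES = report sizes in bytes
:: /NFL /NDL /NJH suppress the per-file, per-directory, and header output
:: so only the summary totals at the end are printed.
robocopy D:\Data C:\empty /L /E /BYTES /NFL /NDL /NJH
```

The summary line at the bottom shows the total file count and total bytes, which you can compare against free space on the new drive.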
Step 2. Set the permissions on the root drive to match the source drive. Why? Well, if the data being copied is, say, 2 terabytes across 8 million files, any change to the root location afterward will have to touch each and every file to stamp the inherited permissions on them. It is much better to set the permissions up front. Changing the permissions on the root drive can take a long time, like watching paint dry, and canceling the permission change midstream can lead to an ACL mismatch, just as the warning message states:
This can be very hard to troubleshoot because everything looks correct through Explorer, but users will still hit weird permission issues. So it’s usually better to allow the process to complete.
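One way to stamp the new root up front is with icacls.exe. A sketch, assuming E: is the new drive; the group names here are illustrative placeholders, so substitute whatever ACEs your source root actually carries:

```shell
:: Grant the same entries the source root has, with inheritance flags so
:: they flow down to everything copied in later:
::   (OI) = object inherit (files), (CI) = container inherit (folders)
::   F = full control, RX = read and execute
icacls E:\ /grant "DOMAIN\File Admins":(OI)(CI)F "Authenticated Users":(OI)(CI)RX
```

If the source root has a complicated ACL, icacls also offers /save and /restore to export the entries to a file and reapply them rather than retyping each grant by hand.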
Step 3. Set up the folders that will host the data and set the permissions that are needed on them. Step 2 still applies at this level as well. We do not want to create the shares at this point; we will do that in a later step. This is just setting the NTFS permissions, which makes it a good time to talk about share permissions versus NTFS permissions.
Rule 1) Share permissions are the maximum permissions that can be applied when accessing over the network. If Domain Admins have only Read on the share but Full Control in the NTFS permissions, then they will only have Read when they access the share. If Authenticated Users have Read/Write on a share but the NTFS permissions only allow Domain Admins access, any user who is not a domain admin will be denied access.
Rule 2) NTFS permissions always win. Again, share permissions are only the maximum permissions that users will get; they do not define the minimum permissions that NTFS will allow. I was always under the assumption that you should give Everyone Full Control on the share and control access through NTFS; however, a lot of security scanners complain when Everyone has Full Control. So how permissions are applied at the share level is really a business decision. I fall into the camp of KISS (Keep It Simple, Stupid).
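As a concrete example of Rule 1, a share created like this caps network access at Change, even for an account that has Full Control in NTFS (the share and folder names are illustrative):

```shell
:: Create the share with Change as the ceiling for network access.
:: NTFS ACLs on E:\Data still apply beneath it, and the effective
:: access is the more restrictive of the two.
net share Data=E:\Data /grant:"Authenticated Users",CHANGE
```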
Step 4. Copy the data. Now we get into the actual Robocopy. Here is the layout of the command: Robocopy &lt;source&gt; &lt;destination&gt; &lt;switches&gt;.
http://social.technet.microsoft.com/wiki/contents/articles/1073.robocopy-and-a-few-examples.aspx gives a lot of good examples. Here are the switches I use most often:
Switch 1) /E – Copy subfolders including empty ones
Switch 2) /S – Copy subfolders excluding empty ones
Switch 3) /Copyall – Copy Data, Attributes, Time Stamps, ACLs, Owner Info, and Auditing Info
Switch 4) /R:3 – Specifies the number of retries on failed copies; the default is 1 million retries, so a persistently failing file would essentially never finish
Switch 5) /W:1 – Specifies the number of seconds to wait between retries; the default is 30 seconds, and coupled with 1 million retries that’s a really long time
Switch 6) /Tee – Writes progress to the console as well as the log file; useful when you want to watch the copy live
Switch 7) /Log:&lt;Path and Name of Log File&gt; – Useful when you want to review the failures. The default overwrites an existing log file; use /Log+: to append to an existing log
Here’s what this looks like in practice:
Example 1) Robocopy d:\Data E:\Data /E /CopyAll /R:3 /W:1 /Tee /Log:C:\robo.log — This command works great for a quick, need-to-get-the-data-moved-today copy. It does not work as well over the longer term, because it does not capture deletions from the source location. To capture deletions you need to add the /purge switch, so the command would look like:
Example 2) Robocopy d:\Data E:\Data /E /CopyAll /purge /R:3 /W:1 /Tee /Log:C:\robo.log (Note that /MIR is shorthand for /E plus /purge.)
Step 5: The cutover – The cool thing about Robocopy is that rerunning the same command will only copy new and changed files, so it is relatively safe to run it over and over. That said, there comes a time when you want the users on the new data location and the old location retired. At that point, delete the old share and, usually, rename the old folder to its name plus “old” to validate that users are no longer connected to the data. Then create the share on the new location with the same permissions as the original share, and remove all NTFS permissions from the original location. This keeps users who are still connected from getting access. Users writing to both the old data and the new data really complicates the migration, because now a merge has to happen. Avoid this at all costs, as it can cause the whole project to fail.
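The cutover sequence above can be sketched as the following commands. The share name, paths, and group are illustrative assumptions; substitute your own:

```shell
:: Final sync before cutover, including deletions from the source
robocopy D:\Data E:\Data /E /COPYALL /PURGE /R:3 /W:1 /LOG:C:\robo-final.log

:: Retire the old location: drop the share, then rename the folder
net share Data /delete
ren D:\Data Data_old

:: Publish the new location with the same share permissions as the original
:: (Authenticated Users / CHANGE is just an example)
net share Data=E:\Data /grant:"Authenticated Users",CHANGE

:: Lock down the old copy so lingering connections lose access:
:: /inheritance:r strips inherited ACEs, /grant:r replaces the rest
:: with admins only ("DOMAIN\File Admins" is a placeholder)
icacls D:\Data_old /inheritance:r /grant:r "DOMAIN\File Admins":(OI)(CI)F
```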
Step 6: Validation – Validate that users have access to the files they need, map the drives they expect, and that the files are what they expect them to be. Since we have not deleted any files they can be retrieved as needed.
In conclusion, we have gone through the steps required for a successful file migration using the better-than-sliced-bread utility Robocopy. We will be using this utility in my next post about DFS-R, but it will also help in any situation where you need to migrate data and shares. Now, in the words of the Vulcan greeting: “live long and prosper.”