Hi all,
Hope someone can give me some insight.
We're backing up a directory with thousands of subdirectories, about 1.5TB in total. It has been split into several DLEs (each roughly 100-200GB) using include/exclude lists in the disklist.
Compression is currently turned off, and so is client-side estimation (estimate is set to server).
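For reference, the setup looks roughly like this (hostnames, paths and the dumptype name here are simplified/illustrative rather than our exact config):

    # amanda.conf - dumptype used for these DLEs
    define dumptype big-dir-tar {
        program "GNUTAR"
        compress none
        estimate server
        index yes
    }

    # disklist - the big directory split into DLEs via include lists
    client.example.com /data-part1 /data {
        big-dir-tar
        include list "/etc/amanda/daily/part1.include"
    }
    client.example.com /data-part2 /data {
        big-dir-tar
        include list "/etc/amanda/daily/part2.include"
    }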
When we start a backup, each DLE seems to sit at "Dumping 0.0%" for about an hour before the actual backup of that DLE starts, and then the same thing repeats for the next DLE.
While it's in that waiting state, the client does appear to be running a tar process forked from sendbackup. What is it doing? My gut instinct is that it's generating the index or something similar, and the sheer number of subdirectories is what's causing the problem.
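In case it helps with diagnosis, this is roughly how I've been looking at the client while a DLE sits at 0.0% (the PID is just whatever the forked tar turns out to be on your system):

    # find the sendbackup/tar processes on the client
    ps -ef | grep -E '[s]endbackup|[g]?tar'

    # attach to the tar PID and watch what it's doing; a steady stream of
    # openat/getdents64 calls would suggest it's just walking the directory
    # tree rather than reading file data
    strace -f -p <tar-pid> -e trace=openat,getdents64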
Any insight would be greatly appreciated - especially if anyone has experience of using Amanda with this size of data.
Cheers,
Rik.