If your backups end up being very large and take a long time to create, you have a few options to reduce the load on your system:
1. Do backups less regularly. Doing backups only once per week, on the slowest day of the week, will reduce the load for the other days.
2. Only backup some users one day, other users another day, etc. Spreading out the backup load for all accounts over the span of a week (or longer) will reduce your system load.
3. Disable some areas of the backups if you don't need them, or want to use other backup methods (like rsync) for those areas:
   - Skip IMAP Data
   - Skip Databases
   - Skip the /home/user/domains directory. Rsync would be a good replacement for this.
   - Skip the home.tar.gz (other files in the User's home)
Relating to #3 above, newer versions of DA let you choose which areas to back up directly in the backup options.
Under Admin Level -> Admin Backup/Transfer -> Create Backup, "Step 4: What" of the backup creation has options so you can pick which areas are to be included in the backup.
If Users have an over-sized /home/username/domains directory, you can use this technique to skip the "Domains Directory" option in the backup, and use rsync to transfer the Domains Directory. Related guide: [/directadmin/backup-restore-migration/migrations#pulling-data-from-a-remote-directory-with-rsync]
The concept is to move over-sized content, such as /home/user/domains/domain.com/public_html/images, into a folder that the backup skips, e.g.:

```sh
cd /home/user
mkdir var
chown user:user var
cd var
cp -Rp ../domains/domain.com/public_html/images .
```

This will give you /home/user/var/images, which won't be included in the backup.
Next, move the current folder out of the way, and then link to it:
```sh
cd /home/user/domains/domain.com/public_html
mv images images.old
ln -s ../../../var/images .
```

Then test to ensure it's working. Once satisfied it's working, you can delete the images.old directory.
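Before doing this on live data, the sequence can be rehearsed in a scratch directory to confirm the relative link depth (../../../) is correct. The temporary paths below stand in for /home/user:

```shell
# Scratch-directory rehearsal of the move-and-symlink steps.
TMP=$(mktemp -d)
mkdir -p "$TMP/domains/domain.com/public_html/images" "$TMP/var"
echo "ok" > "$TMP/domains/domain.com/public_html/images/a.txt"
# Copy the images into the skipped area, preserving permissions:
cp -Rp "$TMP/domains/domain.com/public_html/images" "$TMP/var/"
cd "$TMP/domains/domain.com/public_html"
mv images images.old
# Three levels up from public_html is the account root, then down into var:
ln -s ../../../var/images .
cat images/a.txt
```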
To cap just the tar binary with ionice, see this feature: http://www.directadmin.com/features.php?id=1423
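The idea behind that feature can be illustrated with a plain ionice call (a generic sketch, not DA's internal wrapper): class 2 is "best effort", and priority 7 is the lowest within that class, so the tar process yields disk I/O to other work.

```shell
# Run tar at the lowest best-effort I/O priority so it doesn't starve
# other processes of disk bandwidth.
TMP=$(mktemp -d)
echo "data" > "$TMP/file.txt"
ionice -c2 -n7 tar -czf "$TMP/backup.tar.gz" -C "$TMP" file.txt
# List the archive contents to confirm it was created:
tar -tzf "$TMP/backup.tar.gz"
```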
In some cases, you might have a User account which you need to transfer, but the size of the account is too large for the backup system to handle easily. The solution is the same per-area selection described above: Admin Level -> Backup/Transfer -> "Step 4: What", where you can create a backup with or without certain areas.
With this ability, you can create the backup in smaller chunks, and restore each one on the remote server.
Don't forget to reset file permissions if you're doing manual file copies (rsync may handle this automatically, but if unsure, you can do it anyway).
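Re-applying typical web permissions after a manual copy can be sketched as follows, rehearsed here in a scratch directory: directories get 755, files get 644. On a real server you would also run `chown -R user:user` over the copied tree (as root):

```shell
# Scratch-directory sketch of resetting permissions after a manual copy.
TMP=$(mktemp -d)
mkdir -p "$TMP/public_html/sub"
echo "x" > "$TMP/public_html/index.html"
# Simulate a copy that arrived with overly restrictive permissions:
chmod 700 "$TMP/public_html/sub"
chmod 600 "$TMP/public_html/index.html"
# Directories 755, files 644 -- the usual web-readable defaults:
find "$TMP/public_html" -type d -exec chmod 755 {} \;
find "$TMP/public_html" -type f -exec chmod 644 {} \;
stat -c %a "$TMP/public_html/index.html"
```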
If you wish to keep your load average below a certain point, and not allow backups to run right away if the load is over a certain point, the following script will help. It will check your load average before each backup file is created. If the load is too high, it will wait for 5 seconds, then check again. It will continue this process until the load is low enough, then create the backup. If the load is not below the threshold after 20 attempts, a non-zero value is returned and that user backup is skipped and an error returned in DA.
Create /usr/local/directadmin/scripts/custom/user_backup_pre.sh and place the code inside:

```sh
#!/bin/sh
MAXTRIES=20
MAXLOAD=8.00
highload()
{
    LOAD=`cat /proc/loadavg | cut -d' ' -f1`
    echo "$LOAD > $MAXLOAD" | bc
}
TRIES=0
while [ `highload` -eq 1 ];
do
    sleep 5;
    if [ "$TRIES" -ge "$MAXTRIES" ]; then
        echo "system load above $MAXLOAD for $MAXTRIES attempts. Aborting.";
        exit 1;
    fi
    TRIES=$((TRIES+1))
done;
exit 0;
```

Make it executable:

```sh
chmod 755 /usr/local/directadmin/scripts/custom/user_backup_pre.sh
```

Related: http://www.directadmin.com/features.php?id=978
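If bc is not installed on the system, the same floating-point comparison can be done with awk, which is almost always present. A minimal sketch of the highload() check:

```shell
# Read the 1-minute load average and compare it against a threshold with awk.
# HIGH is 1 when the load exceeds MAXLOAD, 0 otherwise.
MAXLOAD=8.00
LOAD=$(awk '{print $1}' /proc/loadavg)
HIGH=$(awk -v l="$LOAD" -v m="$MAXLOAD" 'BEGIN { print (l > m) ? 1 : 0 }')
echo "load=$LOAD high=$HIGH"
```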
DirectAdmin relies on the quota system to count how much disk space a User is using.
If realtime_quotas=0 is set and a backup is running at the same moment DA is counting that quota, there is a chance DA will count almost double the used disk space, since the backup and the assembly area are both created as the User and temporarily count against the User's quota.
There are a few ways to avoid this scenario or to change things around so that you don't need to ever worry about it.
1. Start using realtime_quotas=2.
2. For ftp backups, tell DA not to use /home/tmp for assembly. This only works if /home has its own partition and you move the assembly area to some other partition. For example, set:

```
backup_tmpdir=/tmp
```

assuming that /tmp is large enough to store this temporary data.
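Both settings live in DirectAdmin's main configuration file, /usr/local/directadmin/conf/directadmin.conf (restart directadmin after editing). The relevant lines would look like:

```
realtime_quotas=2
backup_tmpdir=/tmp
```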
If you don't want to increase the load of your system at peak hours and wish to only allow backups to be created by Resellers/Users within a certain range of time, you can use the all_pre.sh script to do so.
In this example we will code the all_pre.sh for the CMD_USER_BACKUP and the CMD_SITE_BACKUP to only be able to create backups between the hours of 1am and 8am.
Create /usr/local/directadmin/scripts/custom/all_pre.sh, and fill it with this code:
```sh
#!/bin/sh
HOUR=`date +%k`
MAKINGBACKUP=0
if [ "$command" = "/CMD_USER_BACKUP" ]; then
    if [ "$action" = "create" ]; then
        #when=now or when=cron
        MAKINGBACKUP=1
        if [ "$when" = "cron" ]; then
            HOUR=$hour
        fi
    fi
fi
if [ "$command" = "/CMD_SITE_BACKUP" ]; then
    if [ "$action" = "backup" ]; then
        MAKINGBACKUP=1
    fi
fi
if [ "$MAKINGBACKUP" -eq 1 ]; then
    if [ "$HOUR" -ge 1 ] && [ "$HOUR" -lt 8 ]; then
        #this is a valid time, exit.
        exit 0;
    else
        echo "Backups must be created between 1am and 8am";
        exit 1;
    fi
fi
exit 0;
```

And make it executable:
```sh
chmod 755 /usr/local/directadmin/scripts/custom/all_pre.sh
```
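The hour-window comparison can be sanity-checked in isolation; the function below is just for this sketch, not part of DA. Hours 1 through 7 pass, everything else is rejected:

```shell
# Standalone copy of the window test used above: allow hours 1..7 only.
in_backup_window() {
    if [ "$1" -ge 1 ] && [ "$1" -lt 8 ]; then
        return 0
    fi
    return 1
}
in_backup_window 3 && echo "hour 3: allowed"
in_backup_window 9 || echo "hour 9: blocked"
```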