If you're trying to find all files with a specific extension, you can use the "find" command to do this quickly and efficiently.
For example, to find all .php files under the /home directory, type:
cd /home
find . -name "*.php"
If you need to find files that have been modified within a certain number of days (e.g., files newer than X days), you can also use the find command. In this example, we'll list all files under /home/admin which have been modified within the last 2 days:
cd /home/admin
find . -mtime -2
This might be useful if you're looking for any recent changes in the account.
Similarly, you can list files which are older than a certain time by using + instead of - with the number of days. For example, any files older than a year can be listed with:
find . -mtime +365
Other useful find commands: http://www.mysysad.com/2007/07/using-common-unix-find-command_07.html
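These age tests are easy to verify on throwaway data. A minimal self-contained sketch (GNU touch with -d is assumed for backdating a timestamp):

```shell
# Create a temp dir with an old and a new file, then list only files
# modified more than 1 day ago (-mtime +1)
dir=$(mktemp -d)
touch -d "3 days ago" "$dir/old.log"
touch "$dir/new.log"
find "$dir" -type f -mtime +1    # prints only old.log
rm -r "$dir"
```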
If you need to quickly create a large file on your disk, you can use the fallocate command.
Sample usage for a 10GB file:
fallocate -l 10G ten_gig.file
This creates a file named ten_gig.file that is 10GB in size.
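If fallocate isn't available (it requires filesystem support), dd can build a file of any size by actually writing zeros; it's slower, but works almost everywhere. A sketch, scaled down to 10MB so it runs quickly (bs=1M count=10240 would give the 10GB equivalent):

```shell
# dd writes real blocks of zeros; bs (block size) * count = total size
dd if=/dev/zero of=ten_meg.file bs=1M count=10
ls -lh ten_meg.file
rm ten_meg.file
```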
If you need a quick way to reset your public_html data to 755 for directories and 644 for files, then you can use something like this:
cd /home/user/domains/domain.com/public_html
find . -type d -exec chmod 0755 {} \;
find . -type f -exec chmod 0644 {} \;
Additionally, if you know that PHP runs as the user and not as "apache", then you can set PHP files to 600 for an extra level of security like so:
find . -type f -name '*.php' -exec chmod 600 {} \;
For those apt with Unix commands, you'll often need to add up many numbers given to you in the shell.
The quick and easy way to do this is to use awk.
For this example, we'll assume you've already got your list of numbers (one per line) in a file called numbers.txt. The command would then be:
cat numbers.txt | awk '{ sum+=$1 } END { print sum }'
This method is handy when, for example, you're trying to get the sum of the number of occurrences of a string, such as an IP, over many logs.
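As a self-contained sketch of that log-counting case (the file names and the IP are made up): grep -c prints one file:count pair per log, cut keeps the count, and awk totals the column:

```shell
# Two fake logs containing a hypothetical IP, then sum the per-file counts
dir=$(mktemp -d)
printf '1.2.3.4 GET /\n1.2.3.4 GET /a\n' > "$dir/access1.log"
printf '1.2.3.4 GET /b\n' > "$dir/access2.log"
grep -c '1.2.3.4' "$dir"/*.log | cut -d: -f2 | awk '{ sum+=$1 } END { print sum }'    # prints: 3
rm -r "$dir"
```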
If you want to extract the 58th line from the file.txt, you can use both head and tail to grab it, e.g.:
head -n 58 file.txt | tail -n 1
Let's say you have a really large text file (a MySQL dump, for instance), and you need to extract a part of it. First, identify the starting and ending line numbers of the part to extract. You likely do not want to open the file in the vi editor; it is better to use less with the -N modifier to display line numbers:
less -N file.txt
Determine the starting and ending line numbers you need (let's say 220-390). Next, export that range to another file using the sed tool and the > redirection operator:
sed -n '220,390p' file.txt > file_exported.txt
If you're looking to find all files of a specific type that contain a specific piece of code, you can do it by combining the find command with a grep of the text you're looking for.
For example, to find a specific string, let's call it asdfg, in all php files under the /home directory, use a script like this:
#!/bin/sh
# avoid special characters like quotes and brackets in the string
STRING=asdfg
for i in `find . -name "*.php"`; do
{
C=`grep -c --max-count=1 "$STRING" "$i"`
if [ "$C" -gt 0 ]; then
echo "$i";
#perl regex here, on $i, if needed
fi
};
done;
exit 0;
If you need to get rid of the mentioned string, a sample Perl regex would be:
perl -pi -e 's/asdfg//g' "$i"
This finds all instances of asdfg and replaces them with "" (nothing), thus removing them from the files.
Note that a regex becomes more complicated if you need to replace special characters.
Search the web for how to use regular expressions for string swapping for more information.
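As an alternative to the looped script above for finding which .php files contain a string, GNU grep can produce the list directly: -r recurses, -l prints only matching file names, and --include restricts the search to .php files.

```shell
# Demo in a temp tree: only the file containing the string is printed
dir=$(mktemp -d)
mkdir "$dir/sub"
echo 'some asdfg code' > "$dir/sub/hit.php"
echo 'nothing here' > "$dir/sub/miss.php"
grep -rl --include='*.php' 'asdfg' "$dir"    # prints .../sub/hit.php
rm -r "$dir"
```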
df -h
If you have many partitions, e.g., for /home, /var, /usr, etc., then you can narrow down the search by only checking the affected partition, which makes the process go much quicker.
The ncdu tool is not included in default packages, so you'll have to install it first:
yum install ncdu
Next, launch it to display disk usage in the desired location, usually /:
ncdu /
If you know that the largest are /home/ and /var/lib/mysql and want to exclude them from counting to speed up the process, run:
ncdu / --exclude /home --exclude /var/lib/mysql
You can browse the directories in the provided output with the up/down/enter keys to review them in detail.
Note that this process can be slow, so be patient.
Start with suspect paths, but if you've already narrowed it down to a partition, start there, e.g.:
cd /var
du | sort -n
After /var, I would check /usr:
cd /usr
du | sort -n
Following that, check /home and /etc in the same manner.
This will count all files on the box, so if your disk is slow/spins, then this may take a very long time.
cd /
du | sort -n
To limit the count to a single filesystem (skipping other mounted partitions), add the -x flag:
cd /
du -x | sort -n
For all of the above du commands with the sort option, the largest paths will show up at the bottom of the output.
You can reverse the order so that the largest is shown at the top, listed in descending order, with the -r flag:
cd /
du -x | sort -rn
The way this works is that all of the files have to be found and loaded first. Once loaded, the sort starts up.
No output will be displayed until the full list is loaded, so be very patient. You can Ctrl-C and try a sub-path if you want to be more specific, which should run more quickly.
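With GNU du, you can also get a quick top-level overview before descending: --max-depth=1 reports only the immediate subdirectories, and pairing -h with sort -h keeps human-readable sizes sortable (both are GNU options).

```shell
# One summary line per immediate subdirectory, largest last
du -xh --max-depth=1 /var 2>/dev/null | sort -h
```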
Say you're creating a script where you need to run something as the User, or you're trying to debug a User cron, but the User doesn't have SSH access. You can use "su" to execute commands as that User.
Let's say that we have the User fred and we want to run the command /usr/bin/id.
You'd run the following as root:
/bin/su -l -s /bin/sh -c "/usr/bin/id" fred
Note that if you use any special characters like "quotes" in the command, they must be escaped with the \ character.
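For example, to run a command whose argument itself contains quotes. The sh -c line below demonstrates the quoting rule on its own (su hands its -c string to the user's shell the same way); the user fred is hypothetical:

```shell
# Inner double quotes are escaped with backslashes
sh -c "echo \"hello world\""    # prints: hello world
# The equivalent, run as root, for the hypothetical user fred:
# /bin/su -l -s /bin/sh -c "echo \"hello world\"" fred
```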
If you're trying to delete files inside a directory and the following command is not working:
# /bin/rm -rf *
/bin/rm: Argument list too long.
Try this command from within the target directory instead:
find . -type f -delete
The find command is much quicker at listing files from a directory, and newer versions of find have a -delete option built in, which will allow you to remove files very quickly.
Another solution to delete all files in the current directory, reportedly even faster than "find", is to use Perl:
perl -e 'for(<*>){((stat)[9]<(unlink))}'
I don't believe that Perl distinguishes between a file and a directory here, so if you have sub-directories, it will probably throw some errors. However, it should, in theory, remove the files anyway.
Many times we need to swap strings or look for a string, but in web hosting, very often, these string matches contain periods/dots.
Perl and grep are powerful in that they have special characters for matching, but unfortunately dots are wildcards, so any domain you're trying to match typically has to be escaped by adding a \ character before the dot. When working with a large number of matches or when scripting, it's easier to simply construct Perl or grep commands to match a fixed string so that no characters are treated as "special".
Let's create a test.txt file with the lines:
server.domain.com
serverAdomainBcom
server1domain2com
When you run the following, you get 3 results, which may not be what you want:
# grep server.domain.com test.txt
server.domain.com
serverAdomainBcom
server1domain2com
So, we can add the -F option so that grep matches the exact (fixed) string:
# grep -F server.domain.com test.txt
server.domain.com
A Perl regex doesn't have an equivalent command-line option. It does, however, support the \Q and \E escapes, which mean "everything between \Q and \E is matched literally", thus making life much easier for doing an exact match.
If you want to swap server.domain.com with something.else.com, it should be:
# perl -pi -e 's/^\Qserver.domain.com\E$/something.else.com/' test.txt; cat test.txt
something.else.com
serverAdomainBcom
server1domain2com
We use the ^ to match the start of the line, and $ to match the end of the line for this example.
But if your value was within a line (other values before and after), you wouldn't use them.
Generally, this (a deleted /home) isn't the best thing to have happen, because all of the data is stored there. You'll need to recreate all of the directory structures, as well as a few files required for DA to run:
mkdir -p /home/tmp
chmod 1777 /home/tmp
Then create a /home/make_dirs.sh file and insert the following code:
#!/bin/sh
for i in `ls /usr/local/directadmin/data/users`; do
{
for d in `cat /usr/local/directadmin/data/users/${i}/domains.list`; do
{
mkdir -p /home/${i}/domains/${d}/public_html/cgi-bin
cd /home/${i}/domains/${d}/
ln -s public_html private_html
mkdir -p /home/${i}/domains/${d}/public_ftp
mkdir -p /home/${i}/domains/${d}/stats
mkdir -p /home/${i}/domains/${d}/logs
};
done;
mkdir -p /home/${i}/backups
chown -R $i:$i /home/${i}
chmod -R 755 /home/${i}
};
done;
exit 0;
cd /home/
chmod 755 make_dirs.sh
./make_dirs.sh
If you've accidentally removed your /var/log directory, or it was removed by someone else, you can recreate it with all permissions like so:
mkdir -m 755 /var/log
cd /var/log
mkdir -m 700 directadmin httpd
mkdir -m 755 exim proftpd httpd/domains
chown diradmin:diradmin directadmin
chown mail:mail exim
And restart the services:
systemctl restart rsyslog
systemctl restart httpd
systemctl restart directadmin
systemctl restart exim
systemctl restart proftpd
What's a patch? Say you've got a file, let's use /etc/exim.conf as an example, and you need to make changes to that file many times over.
This may be the case if you have many servers, and you want all of them to have the same changes made.
The best way to make the same changes to the exim.conf, but to also allow the default exim.conf to have differences in other areas, is to use a patch.
cd /etc
cp exim.conf exim.conf.orig
Make your changes to exim.conf:
nano exim.conf
Then create the patch from the difference:
diff -u exim.conf.orig exim.conf > exim.conf.patch
You've now got a patch file which can be applied to the original exim.conf on other systems, or if you re-install, etc. Save it to a location on your website so it can be downloaded to other servers.
You've got a patch file, and your default exim.conf, and you want your changes to be applied.
cd /etc/
wget http://your.server.com/exim.conf.patch
patch -p0 < exim.conf.patch
And you're done.
Your patched exim.conf should now have all of the changes that were manually done to the original.
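The whole round trip can be rehearsed on throwaway files, and GNU patch's --dry-run option reports whether a patch applies cleanly without touching anything. A self-contained sketch with made-up file names:

```shell
# Create an original and an edited copy, diff them, then re-apply the
# patch to a pristine copy; --dry-run checks first, changing nothing
dir=$(mktemp -d)
cd "$dir"
printf 'line1\nline2\n' > demo.conf
cp demo.conf demo.conf.orig
printf 'line1\nline2 changed\n' > demo.conf
diff -u demo.conf.orig demo.conf > demo.conf.patch
cp demo.conf.orig demo.conf               # back to the unpatched file
patch -p0 --dry-run < demo.conf.patch     # verify only
patch -p0 < demo.conf.patch               # apply for real
grep 'line2 changed' demo.conf            # confirms the change landed
cd - >/dev/null
rm -r "$dir"
```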