====== Checking the Access Logs for Abuse ======
-----

\\
==== Why Check the Domlogs ====
If you suspect that traffic-based attacks are being carried out against your sites, you'll need to check the domain access logs, or domlogs. Doing so requires root SSH access. We will discuss the most common types of abuse that may occur, why they occur, and how to identify them via the domlogs. Once you can identify the abuse that is occurring, you can put measures in place to counter the attack. We will also discuss implementing different counter-measures against these attacks.
  - Apache connections are hanging due to MaxRequestWorkers having been exceeded
  
\\
==== Apache Domain Access Logs Format ====
  
  
Where each of these variables is defined as follows:
  
  * %v  --  Vhost/Domain
  
First, let's search for entries from the current month and year in both the http and https logs:
<code>
grep -s $(date +"%b/%Y:") /usr/local/apache/domlogs/myexampledomain.tld*
</code>
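
The $(date +"%b/%Y:") command substitution expands to the abbreviated month and year exactly as they appear in the domlog timestamps, so grep matches only the current month's entries. Run by itself, for example:
<code>
$ date +"%b/%Y:"
Jun/2020:
</code>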
  
Let's refine this by using awk field separators to retrieve only the information we want:
<code>
grep -s $(date +"%b/%Y:") /usr/local/apache/domlogs/myexampledomain.tld* | awk {'print $1,$6,$7'}
</code>
  
This will give us the client IPs, the type of request (GET/POST), and the path/file (resource) to which each request was sent.
  
Now, let's have Linux sort this data for us via //sort//, list each request with the same values only once (//uniq//) while counting how many times each occurs (the //-c// flag), and then sort that by the number of repetitions of each request (//sort -n//).
<code>
grep -s $(date +"%b/%Y:") /usr/local/apache/domlogs/myexampledomain.tld* | awk {'print $1,$6,$7'} | sort | uniq -c | sort -n
</code>
  
At the bottom of the output, we see the following:
<code>
74651 /usr/local/apache/domlogs/mydemodomain.tld:123.45.67.890 "POST /blog/wp-login.php 31175 "-"
</code>
  
This seems to indicate a bruteforce attack against a WordPress login. There were 74,651 of these requests this month. How much bandwidth did that use, though?
  
To find out, we need a new command that parses for this IP's requests to the wp-login.php file and adds up the bytes of each request (field 10) as it goes:
<code>
grep wp-login.php /usr/local/apache/domlogs/mydemodomain.tld | grep 123.45.67.890 | awk 'BEGIN{printf "Bandwidth used: "}{SUM+=$10}END{print SUM}'
</code>
  
We know that this will result in a large number of bytes, so we may as well convert it to GB to make the output easier to read. Since one GB is 1024 × 1024 × 1024 bytes, we simply divide the sum by 1024 three times.
  
So, we can do the following to have the output shown in GB:
<code>
grep wp-login.php /usr/local/apache/domlogs/mydemodomain.tld | grep 123.45.67.890 | awk 'BEGIN{printf "Bandwidth used: "}{SUM+=$10}END{print SUM/1024/1024/1024}'
</code>
  
We can see that this IP bruteforcing wp-login.php has cost us over 2 GB for the current month thus far (2.16741573531181 GB).
  
We can also confirm this by multiplying the number of requests by the number of bytes, since each request used the same number of bytes:
<code>
31175 * 74651 = 2327244925 bytes
</code>
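
Dividing that byte count by 1024 three times gives the same figure the awk command printed, within rounding:
<code>
2327244925 / 1024 / 1024 / 1024 ≈ 2.17 GB
</code>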
  
You can see how being able to understand each field helps with parsing the logs to better understand your sites' traffic. Let's look at some types of domlog abuse and at requests with excessive resource usage.
  
\\
==== WordPress Login Bruteforcing ====
  
  
The command I use to identify this attack for the current day on a cPanel CentOS server running Apache is the following:
<code>
grep -s $(date +"%d/%b/%Y:") /usr/local/apache/domlogs/* | grep wp-login.php | awk {'print $1,$6,$7'} | sort | uniq -c | sort -n
</code>
  
This will output a nice list of how frequently each IP hit each site's wp-login.php, and via what request method (POST/GET), for the current day, formatted similar to the following:
  
{{:security:misc:wpbrute.png?nolink&900}}

-----
  
You can see that the IPs responsible for the most attempts at bruteforcing your WordPress login are those listed at the bottom. You can then block these IPs accordingly until you can implement more effective protection against this type of abuse.
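
For example, if your server runs CSF (mentioned again later in this article), a hypothetical offending IP could be blocked at the firewall from the command line:
<code>
csf -d 123.45.67.890 "WordPress login bruteforce"
</code>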
  
\\
==== XMLRPC Attacks ====
  
  
To stop/prevent XMLRPC attacks, you need to protect the xmlrpc.php file. There are a few considerations with this, though. First of all, some plugins rely heavily on XMLRPC. One such popular plugin is Jetpack. So, first you need to decide whether you use XMLRPC on your site. If you do not, great! Then this is really easy. Just block access to xmlrpc.php via .htaccess:
<code>
<Files xmlrpc.php>
  ErrorDocument 403 default
  order deny,allow
  deny from all
</Files>
</code>
  
If you do use Jetpack, you could block access conditionally, depending on whether the IP is one of Jetpack's:
<code>
<Files xmlrpc.php>
  ErrorDocument 403 default
  order deny,allow
  # allow from Jetpack's published IP ranges, for example:
  allow from 64.34.206.0/24
  deny from all
</Files>
</code>
  
Another option would be to continue to allow access if you are using Jetpack, but enable their [[https://jetpack.com/support/security-features/#protect|Protect]] module, which claims to protect from this attack.
  
Alternative options include ModSecurity, WAFs like Sucuri's that block system.multicall requests, or security plugins that offer similar protections.
  
To identify this type of attack in the domain access logs, you simply need to look for POST requests to the xmlrpc.php file within the suspected time frame and sort the data into a readable format. I use the following command to identify whether any XMLRPC attack has occurred in the current day on a cPanel/CentOS server running Apache:
<code>
grep -s $(date +"%d/%b/%Y:") /usr/local/apache/domlogs/* | grep xmlrpc | awk {'print $1,$6,$7'} | sort | uniq -c | sort -n
</code>
  
This will output the number of requests each IP is responsible for and which site they targeted. You may be able to temporarily block IPs based on this output and then enable protection on the xmlrpc.php file as a more permanent resolution.
  
{{:security:misc:xmlrpc.png?nolink&900}}
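
As a sketch of such a temporary block, CSF can deny an IP for a limited time (here, a hypothetical IP for 24 hours, specified in seconds):
<code>
csf -td 123.45.67.890 86400 "XMLRPC flood"
</code>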
  
-----
  
\\
==== Bot Traffic ====
  
  
To limit how often a bot crawls your site, you can define a crawl delay in your robots.txt file. The following robots.txt contents ask all bots to wait 3 seconds between requests:
<code>
User-agent: *
Crawl-delay: 3
</code>
  
Unfortunately, not all bots obey these robots.txt requests. Such bots are often termed 'bad' bots. When you are being bombarded by 'bad' bot traffic, you can block those bots via .htaccess. The following .htaccess rule blocks bots that many have deemed 'bad':
<code>
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} ^.*(Ahrefs|MJ12bot|Seznam|Baiduspider|Yandex|SemrushBot|DotBot|spbot).*$ [NC]
RewriteRule .* - [F,L]
</code>
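
To sanity-check a rule like this, you can send a request with a matching user agent string (assuming curl is installed, and using the demo domain from earlier) and confirm that a 403 is returned:
<code>
curl -I -A "AhrefsBot" http://mydemodomain.tld/
</code>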
  
To find how often bots are hitting your sites in the current day, you can use the following command:
<code>
grep -s $(date +"%d/%b/%Y:") /usr/local/apache/domlogs/* | grep "bot\|spider\|crawl" | awk {'print $6,$14'} | sort | uniq -c | sort -n | tail -25
</code>
  
If you have a lot of sites on your server and a lot of bot traffic, you may want to start by editing the .htaccess or robots.txt files of the sites with the most bot traffic, so that you have the greatest impact on resource usage with the least effort. You can use the following command to search for common bots and see which sites are being hit by them the most (if the previous command showed a lot of bots not listed in this command, be sure to add them to the search below):
<code>
grep -s $(date +"%d/%b/%Y:") /usr/local/apache/domlogs/* | grep -i "BLEXBot\|Ahrefs\|MJ12bot\|SemrushBot\|Baiduspider\|Yandex\|Seznam\|DotBot\|spbot\|GrapeshotCrawler\|NetSeer" | awk {'print $1'} | cut -d: -f1 | sort | uniq -c | sort -n
</code>
  
Now, before you end up blocking Googlebot because you believe it to be hitting your sites 10 thousand times in one day, make sure that the IPs associated with the bot actually belong to Google. Malicious actors will spoof their user agents to look like a respected bot, such as Googlebot, Yahoo Slurp, Bingbot, etc. To make sure the IPs you are about to block/limit actually belong to those bots, you can search the domlogs for the bot to get the IPs these requests originate from, and then run a WHOIS on each IP. For example, I found some requests with a Googlebot user agent on one site. To determine whether these requests really came from Googlebot, I needed to get a list of the IPs making them and then run a WHOIS on each. The following command will generate a list of IPs in order of frequency:
<code>
grep -s $(date +"%d/%b/%Y:") /usr/local/apache/domlogs/* | grep -i googlebot | awk {'print $1'} | cut -d: -f2 | sort | uniq -c | sort -n
</code>
  
Now that you have the list of IPs, run the WHOIS command on each (replace xxx.xxx.xxx.xxx with the actual IP):
<code>
whois xxx.xxx.xxx.xxx
</code>
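
Google also documents a reverse DNS check for verifying Googlebot: the PTR record of a genuine Googlebot IP resolves to a googlebot.com (or google.com) hostname, and that hostname resolves back to the same IP. Using one of the example IPs from the output below:
<code>
host 66.249.66.78
</code>
A genuine Googlebot IP returns a hostname like crawl-66-249-66-78.googlebot.com, which you can then forward-resolve with host to confirm it maps back to the same address.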
  
For the sake of demonstration, let's say you find the following results when searching for a particular bot user agent:
<code>
   23 66.249.66.214
   23 66.249.66.92
   25 66.249.66.218
   25 66.249.66.79
   30 66.249.66.77
   35 66.249.66.78
 2999 114.33.96.109
</code>
  
Running a WHOIS on the IPs in the 66.249.66.* range results in something like the following (screenshot truncated for brevity):
  
{{:security:misc:whois.png?nolink&750}}
  
However, what about that last IP, the one that is not like the others?
  
{{:security:misc:twip.png?nolink&750}}
  
  
Unfortunately, there is no long-term solution that I am aware of for blocking spoofed user agents pretending to be bots. This is why it is important to monitor your traffic and investigate any unusual spikes.
  
\\
==== DoS/DDoS ====
  
To identify a DoS (Denial of Service), you would note a lot of requests coming from a single IP, or relatively few IPs, causing services to overload and deny legitimate requests. If a single IP, or just a few, are responsible, then this is a DoS. A DDoS is more difficult to identify because the requests are distributed over many IPs: you would see a dramatic, sudden, and relentless increase in traffic to a site or IP from many different addresses.
  
Combating a DoS is easy: you just block the responsible IP(s). A DDoS is a bit more complicated. Traffic will need more advanced filtering, such as the [[https://support.cloudflare.com/hc/en-us/articles/200170076-What-does-I-m-Under-Attack-Mode-do-|"I'm Under Attack" mode]] offered by [[https://www.cloudflare.com/ddos/|Cloudflare]]. If you choose to use Cloudflare, remember to whitelist their IPs in the server's firewall and to install mod_cloudflare so that the actual visitors' IPs are logged in the Apache logs rather than Cloudflare's. Cloudflare is one of the most recommended services for protecting against DoS attacks. They offer a free introductory plan and, in addition to the DoS protection, you benefit from their global CDN and many other features.
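
As a sketch, whitelisting one of Cloudflare's ranges in CSF would look like the following (the range shown is only an example; pull the current list from https://www.cloudflare.com/ips/):
<code>
csf -a 173.245.48.0/20 "Cloudflare"
</code>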
  
One may also consider Apache modules that are designed to help mitigate traffic-based attacks. Two such modules are [[https://documentation.cpanel.net/display/EA4/Apache+Module%3A+Evasive|mod_evasive]] and [[https://documentation.cpanel.net/display/EA/How+To+Mitigate+Slowloris+Attacks|mod_reqtimeout]], both of which cPanel provides documentation for configuring with the cPanel Apache server.
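
As a minimal sketch of what a mod_evasive configuration looks like (the thresholds here are illustrative, not recommendations; see cPanel's documentation linked above for specifics):
<code>
<IfModule mod_evasive24.c>
  # max requests for the same page, then for the whole site, per interval
  DOSPageCount      5
  DOSSiteCount      100
  # interval lengths in seconds
  DOSPageInterval   1
  DOSSiteInterval   1
  # how long an offending IP stays blocked, in seconds
  DOSBlockingPeriod 60
</IfModule>
</code>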
  
Yet another option would be the [[https://documentation.cpanel.net/display/CKB/The+ModSecurity+Guardian+Log|ModSecurity Guardian Log]]. ModSecurity has the added benefit of acting as a Web Application Firewall, protecting all sites on the server simultaneously against application-level attacks, such as the XMLRPC attack discussed above.
  
Remember that with KnownHost hosting, you don't have to worry about the following types of DDoS attacks, thanks to our complimentary **[[https://www.knownhost.com/ddos-protection.html|DDoS protection]]**:
  
  * UDP Floods
  * DNS Amplification
  * Fragmented Packet Attacks
  
You do, however, need to prepare yourself for Layer 7 (application layer) attacks, such as spam and low-level traffic floods, as well as application-level abuse such as XMLRPC attacks and bruteforcing. We've already mentioned methods to protect against DoS and DDoS attacks, as well as using ModSecurity for protection at the site level. One can also use WAFs developed specifically for a given application, such as a security plugin for WordPress.
As for protection from incoming spam floods, one could use a myriad of cPanel features, such as the following:
  
  * [[https://documentation.cpanel.net/display/78Docs/Exim+Configuration+Manager+-+Basic+Editor#RBLs|Custom RBLs]]
  * [[https://documentation.cpanel.net/display/78Docs/Configure+Greylisting|Greylisting]]
  * [[https://documentation.cpanel.net/display/78Docs/Spam+Filters|SpamAssassin]]
  * [[https://documentation.cpanel.net/display/78Docs/BoxTrapper|BoxTrapper]]
  * [[https://documentation.cpanel.net/display/CKB/How+to+Configure+Mail+Filters|Mail Filters]]
  * [[https://forums.cpanel.net/threads/disable-catch-all-email-address.74069/|Blackhole Emails Sent to Non-Existent/Default Email Accounts During the Attack]]

\\
==== WordPress Heartbeat API ====
  
  
You can detect this in the domain access logs by looking for POST requests to the wp-admin/admin-ajax.php file. You can use the following query to check for this for the current date:
<code>
grep -s $(date +"%d/%b/%Y:") /usr/local/apache/domlogs/* | grep admin-ajax | awk {'print $1,$6,$7'} | sort | uniq -c | sort -n
</code>
  
If you see that this is occurring, you can utilize the [[https://wordpress.org/plugins/heartbeat-control/|Heartbeat Control plugin]] to either completely disable or limit the WordPress Heartbeat API's normal behavior. Even with this plugin, you may still need to do more: your installed plugins or themes may be calling the admin-ajax.php file as well. If you continue to see high heartbeat usage after installing the Heartbeat Control plugin, you'll want to identify the plugin or theme responsible, reach out to its developers, and see if there is any way its usage can be limited.
  
There is no easy way to find the culprit. One method is to disable all plugins and themes, then slowly re-enable them one by one, reloading the site for each re-enabled plugin while simultaneously tailing the domain access log and grepping for admin-ajax requests. You would note the admin-ajax.php POST requests that occur for each plugin/theme to determine which is responsible for the most utilization and CPU consumption. You can use this command to watch the domain access logs for admin-ajax.php requests (if you want to check a specific site, replace the asterisk with the domain name followed by an asterisk so that you see requests for both the http and https versions of the site):
<code>
tail -f /usr/local/apache/domlogs/* | grep admin-ajax
</code>
  
This method is slow and tedious, though. A better approach is one of those discussed below, since they let you examine each POST request thoroughly and determine what code called it.
  
Another method would be to use a tool that displays all requests made on each page load in a graphical GUI in the browser, such as [[https://gtmetrix.com/|GTmetrix]]. GTmetrix will analyze the site and generate a report of all requests under the Waterfall tab. Clicking this tab, and then the plus sign next to a "POST admin-ajax.php" request listed in the waterfall, lets you view more information about the request. After you click the POST wp-admin/admin-ajax.php request link, click the "Post" tab to view information about the POST request sent. This may list information that allows you to determine which plugin/theme is responsible for these requests. Similar tools include [[https://tools.pingdom.com/|Pingdom Website Speed Test]], [[https://www.webpagetest.org/|WebPageTest]], and [[https://giftofspeed.com/|Gift of Speed]].
  
Another obvious tool that can assist with finding the responsible plugin/theme would be your browser's developer tools. Use the Network feature to examine the POST requests sent on each page load/reload and try to determine the origin of the request.
  
  * [[https://developers.google.com/web/tools/chrome-devtools/network/|Chrome Network Analyzer]]
  * [[https://developer.mozilla.org/en-US/docs/Tools/Network_Monitor|FireFox Network Monitor]]
  
\\
==== WordPress Cron ====
  
WordPress cron isn't a 'real' cron. It is a PHP script that is executed on each page visit. This is great for some shared hosting environments where a user may not have access to a real Linux crontab; however, it presents a few problems and thus should be reconfigured by those who do have crontab access. First, for high-traffic sites, the wp-cron.php file being called on every single page load can have a major performance impact. Second, for low-traffic sites, this can result in tasks not running as often as they should. To determine how often the wp-cron has run in the current day, you can use the following command:
<code>
grep -s $(date +"%d/%b/%Y:") /usr/local/apache/domlogs/* | grep wp-cron.php | awk {'print $1,$6,$7'} | sort | uniq -c | sort -n
</code>
  
To disable the wp-cron.php script's execution on every page load, open the wp-config.php file in your main WordPress folder and add the following just after the WP_DEBUG definition and just before the "/* That's all, stop editing! Happy blogging. */" line:
<code>
define('DISABLE_WP_CRON', true);
</code>
  
{{:security:misc:afterwpcron.png?nolink&800}}
  
Next, set up a real cron job to execute wp-cron.php. How often to run it really depends on the site. You could check the contents of the cron entry in the wp_options table to determine the smallest interval between scheduled tasks and then run the cron that often. A default WordPress blog with low traffic should be fine with the cron running every 15 minutes to an hour. You can adjust as you see fit.
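
As a sketch, either of the following can show what WordPress currently has scheduled; the first assumes WP-CLI is installed, and the second assumes the default wp_ table prefix and a placeholder database name:
<code>
wp cron event list
mysql database_name -e "SELECT option_value FROM wp_options WHERE option_name='cron';"
</code>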
  
You can set your cron to use wget, curl, or php. The php command is the best choice, since curl and wget requests must travel across the network; by using php directly, you increase reliability and eliminate whatever network issues you might otherwise encounter (routing, filtering, DNS errors, etc.). You can use cPanel to add the cron, or you can edit the crontab using 'crontab -e' or your favorite editor (/var/spool/cron/USERNAME).
<code>
/usr/bin/php -q /home/USERNAME/public_html/wp-cron.php >/dev/null 2>&1
</code>
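
That is only the command portion; a complete crontab entry also needs a schedule. For example, to run it every 15 minutes:
<code>
*/15 * * * * /usr/bin/php -q /home/USERNAME/public_html/wp-cron.php >/dev/null 2>&1
</code>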
  
cPanel has a very helpful interface for configuring cron jobs if you are unfamiliar with the syntax.
  
{{:security:misc:cpcrons.png?nolink&800}}
  
<WRAP center round tip 60%>
</WRAP>
  
\\
==== Spam Scripts ====
  
  
The following prints out the site, the external IP making the request, and the script being requested, in order of increasing number of requests per IP/site:
<code>
grep "sotpie" /usr/local/apache/domlogs/username/* | grep POST | awk {'print $1,$7'} | sort | uniq -c | sort -n
</code>
  
Let's say that you do not have an LFD alert to tell you where on the server the script is located. You could instead search the Exim mainlog for the location of any scripts responsible for sending email. You could use this:
<code>
grep "cwd=" /var/log/exim_mainlog | awk -F"cwd=" '{print $2}' | awk '{print $1}' | sort | uniq -c | sort -n
</code>
  
Or, you could just search for POST requests for the current day via the domlogs and then exclude any common POST requests:
<code>
grep -s $(date +"%d/%b/%Y:") /usr/local/apache/domlogs/* | grep POST | grep -vi "wp-cron\|admin-ajax\|xmlrpc\|wp-config" | awk {'print $1,$7'} | sort | uniq -c | sort -n
</code>
  
You will likely have to filter the output further, but you will see what types of POST requests are occurring for your sites and be able to determine whether any scripts are being called heavily, indicating the spam script source.
  
There are many ways to protect against this type of abuse. It really comes down to securing your sites thoroughly so that no malicious scripts can be uploaded and no legitimate functionality can be abused. An example of a legitimate script being abused would be a contact form being used to send spam. Checking the Exim mainlog for mail-sending scripts and their frequency, along with the POST requests logged in the domlogs, can confirm that this is occurring:
<code>
grep -s $(date +"%d/%b/%Y:") /usr/local/apache/domlogs/username/domain.tld* | grep POST | awk {'print $1,$6,$7'} | sort | uniq -c | sort -n
</code>
  
You may see results like this:
  
{{:security:misc:domainaccesslogscontactformabuse.png?nolink&1200}}
  
Adding reCAPTCHA and blocking the responsible IP (provided the attacker was using a single IP and not a myriad of proxy IPs) should be quite effective at mitigating this attack.
  
\\
==== Malicious Requests ====
  
These will often show a myriad of requests that were likely exploitation attempts, including XSS, LFI, RFI, RCE, symlink attacks, open redirects, path traversal, SQLi, and Apache log poisoning. You will want to review the Apache status codes returned with the results so that you can attempt to determine whether any potential attack was successful.
  
\\
==== Vulnerability Scanning ====
  
Exploit scanners will probe for the existence of certain files, such as backdoors left behind on previously exploited sites. You will see many requests searching for popular backdoor names, such as shell.php, ak47.php, ak48.php, lol.php, b374k.php, cmd.php, system.php, 1.php, zzz.php, qq.php, hack.php, etc. The list is long. This is why features that block IPs exceeding a large number of 404 errors in a given time exist and can be helpful, as such requests may come from an automated vulnerability scanner. They could also come from a bot indexing your site, so unless you know you are under this kind of attack, or you are certain that your permalinks are configured well and that you have no broken links on your site, be careful with this setting. It should be set quite high, at least 200 or more per hour. CSF/LFD offers this setting via LF_APACHE_404 and LF_APACHE_404_PERM in the /etc/csf/csf.conf file.
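
As a sketch of those settings in /etc/csf/csf.conf (the values are illustrative; in CSF, a block-length value greater than 1 is treated as a temporary block in seconds, while 1 makes the block permanent):
<code>
# block an IP after roughly 200 404 errors within the LFD check interval
LF_APACHE_404 = "200"
# temporarily block the offender for one hour
LF_APACHE_404_PERM = "3600"
</code>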
  
\\
==== Conclusion ====
  
  
</file>
  
For the cPanel plugin NginxCP, you can use this: