mitchellkrogza / nginx-ultimate-bad-bot-blocker

Nginx Block Bad Bots, Spam Referrer Blocker, Vulnerability Scanners, User-Agents, Malware, Adware, Ransomware, Malicious Sites, with anti-DDOS, Wordpress Theme Detector Blocking and Fail2Ban Jail for Repeat Offenders

[INSTALLATION] if directive not allowed here #359

Open imunisasi opened 4 years ago

imunisasi commented 4 years ago

```
nginx: [emerg] "if" directive is not allowed here in /etc/nginx/bots.d/blockbots.conf:58
nginx: configuration file /etc/nginx/nginx.conf test failed
```

Ubuntu 19.04 minimal, nginx 1.16.1

mitchellkrogza commented 4 years ago

Your includes are in the wrong place. Please post your config.
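For context: nginx only allows the `if` directive in `server` and `location` blocks, and the error above points at an `if` inside blockbots.conf, so the include must end up inside an active `server {}` block. A minimal sketch of wrong vs. right placement (paths from this thread; `example.com` is a placeholder):

```nginx
# WRONG: http{} context. "if" is not allowed here, so nginx -t fails
# with: [emerg] "if" directive is not allowed here ... blockbots.conf
http {
    include /etc/nginx/bots.d/blockbots.conf;
}

# RIGHT: inside a server{} block, where "if" is valid.
server {
    listen 80;
    server_name example.com;   # placeholder

    include /etc/nginx/bots.d/ddos.conf;
    include /etc/nginx/bots.d/blockbots.conf;
}
```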

andrewjs18 commented 4 years ago

I also have this issue:

```
sudo nginx -t
nginx: [emerg] "if" directive is not allowed here in /etc/nginx/bots.d/blockbots.conf:58
nginx: configuration file /etc/nginx/nginx.conf test failed
```

My vhost config is still the default one, not yet configured for my site:


```nginx
server {
  listen 80;
  listen [::]:80;
  server_name _;
  root /usr/share/nginx/html/;

  ##
  # Nginx Bad Bot Blocker Includes
  # REPO: https://github.com/mitchellkrogza/nginx-ultimate-bad-bot-blocker
  ##
  include /etc/nginx/bots.d/ddos.conf;
  include /etc/nginx/bots.d/blockbots.conf;

  index index.php index.html index.htm index.nginx-debian.html;

  location / {
    try_files $uri $uri/ /index.php;
  }

  location ~ \.php$ {
    fastcgi_pass unix:/run/php/php7.4-fpm.sock;
    fastcgi_param SCRIPT_FILENAME $request_filename;
    include fastcgi_params;
    include snippets/fastcgi-php.conf;
  }

  # A long browser cache lifetime can speed up repeat visits to your page
  location ~* \.(jpg|jpeg|gif|png|webp|svg|woff|woff2|ttf|css|js|ico|xml)$ {
    access_log    off;
    log_not_found off;
    expires       360d;
  }

  # disable access to hidden files
  location ~ /\.ht {
    access_log    off;
    log_not_found off;
    deny          all;
  }
}
```
andrewjs18 commented 4 years ago

Also, to add: I'm running the following Ubuntu and nginx versions:

Ubuntu 20.04, nginx/1.17.10

Efreak commented 3 years ago

I had this issue as well, and found at least a partial cause: the setup script inserts the includes into the wrong place in the config file. It needs to not only ignore commented-out lines, but also check for multiple server definitions in a single file.

diff of /etc/nginx/sites-available/default:

```diff
diff --git a/sites-available/default b/sites-available/default
index d35fb66..f8e356f 100644
--- a/sites-available/default
+++ b/sites-available/default
@@ -96,18 +96,27 @@ server {
 # Virtual Host configuration for example.com
 #
 # You can move that to a different file under sites-available/ and symlink that
 # to sites-enabled/ to enable it.
 #
 #server {
 #	listen 80;
 #	listen [::]:80;
 #
 #	server_name example.com;
 #
+
+
+	##
+	# Nginx Bad Bot Blocker Includes
+	# REPO: https://github.com/mitchellkrogza/nginx-ultimate-bad-bot-blocker
+	##
+	include /etc/nginx/bots.d/ddos.conf;
+	include /etc/nginx/bots.d/blockbots.conf;
+
 #	root /var/www/example.com;
 #	index index.html;
 #
 #	location / {
 #		try_files $uri $uri/ =404;
 #	}
 #}
```

In the first case, simply ignoring the commented lines would be good enough.
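A sketch of what ignoring commented lines could look like when the script hunts for a landmark directive (illustrative only, not the script's actual code; `server_name` stands in for whatever landmark is used):

```sh
# Only match lines where the directive is not commented out:
# anchoring at start-of-line (after optional whitespace) skips
# lines such as "#	server_name example.com;".
grep -n '^[[:space:]]*server_name' /etc/nginx/sites-available/default
```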

diff of /etc/nginx/sites-available/files.REMOVED.com:

```diff
diff --git a/sites-available/files.REMOVED.com b/sites-available/files.REMOVED.com
index dcf95a6..a65b0c7 100644
--- a/sites-available/files.REMOVED.com
+++ b/sites-available/files.REMOVED.com
@@ -21,122 +21,131 @@ server {
 	# SSL configuration
 	#
 	# listen 443 ssl default_server;
 	# listen [::]:443 ssl default_server;
 	#
 	# Note: You should disable gzip for SSL traffic.
 	# See: https://bugs.debian.org/773332
 	#
 	# Read up on ssl_ciphers to ensure a secure configuration.
 	# See: https://bugs.debian.org/765782
 	#
 	# Self signed certs generated by the ssl-cert package
 	# Don't use them in a production server!
 	#
 	# include snippets/snakeoil.conf;
 
 	root /home/efreak/public_html;
 
 	# Add index.php to the list if you are using PHP
 	index index.html index.htm index.nginx-debian.html;
 
 	server_name files.REMOVED.com;
 
 	location /files {
 		alias /home/efreak/public_html;
 		client_body_temp_path /var/www/webdav/temp;
 		dav_methods PUT DELETE MKCOL COPY MOVE;
 		dav_ext_methods PROPFIND OPTIONS;
 		location ~ \.(zip|rar|7z|tar|tgz|txz|tbz2|gz|xz|bz2)$ {
 			auth_basic "Restricted site.";
 			auth_basic_user_file /home/efreak/webdav-users.passwd;
 		}
 		limit_except GET PROPFIND OPTIONS HEAD {
 			auth_basic "Restricted site.";
 			auth_basic_user_file /home/efreak/webdav-users.passwd;
 		}
 		create_full_put_path on;
 		dav_access user:rw group:rw all:rw;
 		autoindex on;
 		autoindex_exact_size off;
 	}
 
 	location / {
 		proxy_pass http://127.0.0.1:8883;
 		proxy_pass_request_headers on;
 		proxy_set_header X-Forwarded-Host $http_host;
 		proxy_set_header X-Forwarded-For $remote_addr;
 	}
 
 	# pass PHP scripts to FastCGI server
 	#
 	#location ~ \.php$ {
 	#	include snippets/fastcgi-php.conf;
 	#
 	#	# With php-fpm (or other unix sockets):
 	#	fastcgi_pass unix:/var/run/php/php7.0-fpm.sock;
 	#	# With php-cgi (or other tcp sockets):
 	#	fastcgi_pass 127.0.0.1:9000;
 	#}
 
 	# deny access to .htaccess files, if Apache's document root
 	# concurs with nginx's one
 	#
 	#location ~ /\.ht {
 	#	deny all;
 	#}
 
 	listen [::]:443 ssl; # managed by Certbot
 	listen 443 ssl; # managed by Certbot
 	ssl_certificate /etc/letsencrypt/live/files.REMOVED.com-0001/fullchain.pem; # managed by Certbot
 	ssl_certificate_key /etc/letsencrypt/live/files.REMOVED.com-0001/privkey.pem; # managed by Certbot
 	include /etc/letsencrypt/options-ssl-nginx.conf; # managed by Certbot
 	ssl_dhparam /etc/letsencrypt/ssl-dhparams.pem; # managed by Certbot
 }
 
 # Virtual Host configuration for example.com
 #
 # You can move that to a different file under sites-available/ and symlink that
 # to sites-enabled/ to enable it.
 #
 #server {
 #	listen 80;
 #	listen [::]:80;
 #
 #	server_name example.com;
 #
 #	root /var/www/example.com;
 #	index index.html;
 #
 #	location / {
 #		try_files $uri $uri/ =404;
 #	}
 #}
 
 server {
 	if ($host ~ ^files-web\.REMOVED\.com$) {
 		return 301 https://files.REMOVED.com$request_uri;
 	} # managed by Certbot
 
 	listen [::]:443 ssl; # managed by Certbot
 	listen 443 ssl; # managed by Certbot
 	ssl_certificate /etc/letsencrypt/live/files.REMOVED.com-0001/fullchain.pem; # managed by Certbot
 	ssl_certificate_key /etc/letsencrypt/live/files.REMOVED.com-0001/privkey.pem; # managed by Certbot
 	include /etc/letsencrypt/options-ssl-nginx.conf; # managed by Certbot
 	ssl_dhparam /etc/letsencrypt/ssl-dhparams.pem; # managed by Certbot
 
 	server_name files-web.REMOVED.com;
 	return 404; # managed by Certbot
+
+
+	##
+	# Nginx Bad Bot Blocker Includes
+	# REPO: https://github.com/mitchellkrogza/nginx-ultimate-bad-bot-blocker
+	##
+	include /etc/nginx/bots.d/ddos.conf;
+	include /etc/nginx/bots.d/blockbots.conf;
+
 }
```

In this second case there is a second server definition, containing a redirect for a subdomain, so the script also needs to check each file for multiple server definitions.
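A quick way to audit where the includes actually landed across a whole config tree (a diagnostic sketch, assuming the stock Debian/Ubuntu layout under /etc/nginx):

```sh
# List every insertion point with two lines of context, so includes
# sitting outside an active server{} block stand out:
grep -rn -B2 -A2 "bots.d/blockbots.conf" /etc/nginx/
```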

itoffshore commented 3 years ago

The code that controls where the includes are inserted is around line 460 of setup-ngxblocker:

```sh
line=$(find_includes $MAIN_CONF sendfile last http first '\}' last )
```

Different distributions use different styles of nginx.conf, so it is difficult to cover every use case.
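Judging by the argument list, that call falls back through landmarks in nginx.conf: the last `sendfile`, then the first `http`, then the last `}`. A rough shell sketch of the idea (not the script's actual implementation):

```sh
MAIN_CONF=/etc/nginx/nginx.conf
# Last line mentioning "sendfile"...
line=$(grep -n 'sendfile' "$MAIN_CONF" | tail -n 1 | cut -d: -f1)
# ...else the first line mentioning "http"...
[ -z "$line" ] && line=$(grep -n 'http' "$MAIN_CONF" | head -n 1 | cut -d: -f1)
# ...else the last closing brace.
[ -z "$line" ] && line=$(grep -n '}' "$MAIN_CONF" | tail -n 1 | cut -d: -f1)
echo "includes would be inserted after line $line"
```

This also shows the failure mode described above: a commented-out `sendfile` line or an extra `server` block shifts the landmark, and the includes land in the wrong context.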

L8X commented 1 year ago

Bump @mitchellkrogza

Tloram commented 2 months ago

This is still broken. The issue occurs on a standard nginx vhost managed by Let's Encrypt's Certbot (which creates two server blocks in each vhost file).
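For that layout the includes have to land inside each of the two `server {}` blocks Certbot writes, not between or after them. A sketch of the pattern (`example.com` is a placeholder; the `# managed by Certbot` lines are abbreviated):

```nginx
# HTTPS block written by Certbot
server {
    listen 443 ssl;
    server_name example.com;
    # ssl_certificate ...;        # managed by Certbot
    # ssl_certificate_key ...;    # managed by Certbot

    include /etc/nginx/bots.d/ddos.conf;
    include /etc/nginx/bots.d/blockbots.conf;
}

# HTTP -> HTTPS redirect block written by Certbot
server {
    listen 80;
    server_name example.com;

    include /etc/nginx/bots.d/ddos.conf;
    include /etc/nginx/bots.d/blockbots.conf;

    return 301 https://$host$request_uri;   # managed by Certbot
}
```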