
NGINX Security / Lockdown & Tips

KwiceroLTD

New Member
Verified Provider
Here's what I do to lock down NGINX and add more security, written in the form of a tutorial/guide.

1. Information Leakage

NGINX, like other servers, likes to leak information, so I compiled my own version: I changed the server tag from nginx to vps-nginx (still leaving the original nginx tag in there, just indicating it's a modified version) and stripped out some common information leakage.

How to do this?

Download the NGINX source and open src/http/ngx_http_header_filter_module.c. Around line 49 it looks similar to this:

static char ngx_http_server_string[] = "Server: nginx" CRLF;
static char ngx_http_server_full_string[] = "Server: " NGINX_VER CRLF;

I change it to:

static char ngx_http_server_string[] = "Server: vps-nginx" CRLF;
static char ngx_http_server_full_string[] = "Server: vps-nginx" CRLF;

Doing so changes the Server header the server sends, and it strips out the nginx version.

Now, this is optional but you can change the version output variable NGINX_VER:

Navigate to src/core/nginx.h; line 13 defines the nginx version string, which I update to suit:

#define NGINX_VER          "vps-nginx/1.0.0"

Changes: NGINX_VER (what you see when running nginx -v) now outputs vps-nginx/1.0.0.
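
If you haven't rebuilt nginx from source before, it's the usual configure/make routine; a minimal sketch (the version number is a placeholder, and you'll want whatever ./configure flags match your current build):

Code:
wget http://nginx.org/download/nginx-1.6.2.tar.gz
tar xzf nginx-1.6.2.tar.gz
cd nginx-1.6.2
# make the edits above in src/http/ngx_http_header_filter_module.c and src/core/nginx.h
./configure
make
make install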

If you'd rather not recompile, the version number (though not the nginx name itself) can be hidden by adding the following to your configuration file:

server_tokens off;
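
For reference, server_tokens is valid in the http, server, and location contexts; set globally it looks like:

Code:
http {
    server_tokens off;
    # ...the rest of your http block...
}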

Minimal permissions:

I put the nginx content on its own mount and lock down its permissions. An example /etc/fstab entry:

LABEL=/httpcontent     /httpcontent          ext3   defaults,nosuid,noexec,nodev 1 2

If you want to know how to do mounting, here's a tutorial!
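
To apply those options without a reboot, and to tighten ownership as well, something like this works (a sketch; it assumes nginx runs as the www-data user, so adjust the group and paths to your setup):

Code:
mount -o remount /httpcontent        # re-read the fstab options
mount | grep httpcontent             # verify nosuid,noexec,nodev are active
chown -R root:www-data /httpcontent  # root owns the content, nginx can read it
find /httpcontent -type d -exec chmod 750 {} \;
find /httpcontent -type f -exec chmod 640 {} \;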

Spam Decliner:

I'm personally not a fan of referral spam and the like, so here's some configuration you can add to your sites-enabled file to get rid of some of the spam:


if ($http_user_agent ~* (LWP::Simple|wget|libwww-perl) ) {
    return 403;
}
if ($http_user_agent ~ (Googlebot|android|msnbot|Purebot|Baiduspider|Lipperhey|Mail.ru|scrapbot) ) {
    return 403;
}
if ($http_referer ~* (jewelry|forsale|organic|love|viagra|nude|girl|nudit|casino|poker|porn|sex|teen|babes) ) {
    return 403;
}
if ($request_method !~ ^(GET|HEAD|POST)$ ) {
    return 444;
}
valid_referers none blocked google.com yahoo.com;
if ($invalid_referer) {
    return 403;
}


What does the code do? The first block (if statement) stops download bots and HTTP libraries.

The second blocks some common user agents; I block search engines, and Android (we've had our fair share of problems).

The third block stops common referral spam. The fourth block returns 444 (nginx closes the connection without a response) for any request method other than GET, HEAD, or POST.

Next we check for an invalid referer based on the list in valid_referers. Not on there? Denied.
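
One caveat: valid_referers as written matches google.com and yahoo.com exactly, not their subdomains. The directive also accepts wildcard prefixes if you want those too:

Code:
valid_referers none blocked google.com *.google.com yahoo.com *.yahoo.com;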

Now, if you want to disable direct IP access, you can do this:


if ($host !~ ^(mydomainhere\.com|www\.mydomainhere\.com)$ ) {
    return 444;
}
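
For what it's worth, a catch-all default server does the same job without an if, and is the approach usually recommended in nginx circles; a sketch (add it alongside your real server block):

Code:
server {
    listen 80 default_server;   # catches requests no other server_name matched
    server_name _;              # "_" is just a conventional placeholder
    return 444;                 # close the connection without a response
}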

Custom error pages anyone?

Add this to your sites-enabled file:


error_page 404 500 502 503 504 /error.html;
location = /error.html {
    root /var/www;
}

(if you're not using /var/www to store error pages, change the directory to suit where you store them).
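
If you also want to stop visitors from browsing to /error.html directly, the internal directive limits the location to internal redirects like error_page:

Code:
location = /error.html {
    root /var/www;
    internal;    # direct requests to /error.html now return a 404
}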

PHP Tips:

If you want to help lock down your PHP a bit more, here are a few configuration options you can set in your /etc/php.ini file:

Code:
disable_functions = phpinfo, system, mail, exec
file_uploads = Off
expose_php = Off
allow_url_fopen = Off
register_globals = Off
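
A few more php.ini options pair well with those; a sketch (the open_basedir path is an example, point it at your actual web root):

Code:
display_errors = Off          ; don't leak file paths or stack traces to visitors
log_errors = On               ; still record errors server-side
open_basedir = /httpcontent   ; restrict PHP file access to the web root
session.cookie_httponly = 1   ; keep session cookies away from JavaScript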
 

tonyg

New Member
if ($http_user_agent ~ (Googlebot|android|msnbot|Purebot|Baiduspider|Lipperhey|Mail.ru|scrapbot) ) {
    return 403;
}

You're blocking Googlebot, android and msnbot?

Edit: Just read that you block search engines.

Honestly, I don't even want to know the reasoning. But I would advise removing these blocks from the tutorial...for the sake of the reader.
 

KwiceroLTD

New Member
Verified Provider
You're blocking Googlebot, android and msnbot?

Edit: Just read that you block search engines.

Honestly, I don't even want to know the reasoning. But I would advise removing these blocks from the tutorial...for the sake of the reader.
I block search engines because I prefer not to be found. The server is a private server.
 

joepie91

New Member
I block search engines because I prefer not to be found. The server is a private server.
Blocking user agents is not the correct way to go about that, and it will not work.

EDIT: I also don't see how any of this adds any "security" at all, other than the PHP bit.
 

sleddog

New Member
Isn't that supposed to be done via robots.txt?
robots.txt is for the well-behaved robots. Blocking is for the rest :)

But really, for a private site/server I think an overall access policy might work better, e.g.

Code:
location / {
	# Allow access from designated network addresses or require htpasswd authentication.
	satisfy any;
	allow 127.0.0.1;
	allow 192.168.1.0/24;
	allow 172.22.1.4;
	deny  all;
	auth_basic "Restricted Access";
	auth_basic_user_file /path/to/.htpasswd;
}
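
If you need to create that password file, the htpasswd tool from apache2-utils (httpd-tools on RHEL-likes) handles it; "alice" here is just an example user:

Code:
htpasswd -c /path/to/.htpasswd alice   # -c creates the file; omit it when adding more users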
 

joepie91

New Member
robots.txt is for the well-behaved robots. Blocking is for the rest :)

But really, for a private site/server I think an overall access policy might work better, e.g.


location / {
	# Allow access from designated network addresses or require htpasswd authentication.
	satisfy any;
	allow 127.0.0.1;
	allow 192.168.1.0/24;
	allow 172.22.1.4;
	deny  all;
	auth_basic "Restricted Access";
	auth_basic_user_file /path/to/.htpasswd;
}
That is indeed the correct approach, although I'd recommend having security on a (web) application level where possible, rather than at an HTTPd level. It's really easy to accidentally mess up otherwise.
 