Laravel 5.5 Officially Recommended Nginx Configuration Tutorial


Preface

This article introduces the Nginx configuration officially recommended for Laravel 5.5 and explains what each part of it does, shared here for your reference and study. Without further ado, let's get into the details.

With version 5.5, Laravel officially published a recommended Nginx server configuration (Chinese documentation: Server Configuration, Nginx).


server {
    listen 80;
    server_name example.com;
    root /example.com/public;

    add_header X-Frame-Options "SAMEORIGIN";
    add_header X-XSS-Protection "1; mode=block";
    add_header X-Content-Type-Options "nosniff";

    index index.html index.htm index.php;

    charset utf-8;

    location / {
        try_files $uri $uri/ /index.php?$query_string;
    }

    location = /favicon.ico { access_log off; log_not_found off; }
    location = /robots.txt  { access_log off; log_not_found off; }

    error_page 404 /index.php;

    location ~ \.php$ {
        fastcgi_split_path_info ^(.+\.php)(/.+)$;
        fastcgi_pass unix:/var/run/php/php7.1-fpm.sock;
        fastcgi_index index.php;
        include fastcgi_params;
    }

    location ~ /\.(?!well-known).* {
        deny all;
    }
}

I am no Nginx expert, and I suspect many readers are in the same position, so let's go through the relevant Nginx knowledge together. :)

1. add_header X-Frame-Options "SAMEORIGIN";

The X-Frame-Options response header tells the browser whether a page is allowed to be rendered inside a <frame>, <iframe>, or <object>. A site can use it to ensure its content is not embedded in other people's sites, which helps prevent clickjacking attacks.

X-Frame-Options accepts one of three values:

  • DENY: the page may not be displayed in a frame at all, not even by pages on the same domain.
  • SAMEORIGIN: the page may only be displayed in a frame by pages from the same origin.
  • ALLOW-FROM uri: the page may only be displayed in a frame by pages from the specified origin.
This header is probably the most commonly used of the three. A foreign client's security team once ran a vulnerability scan against one of our projects and flagged exactly this clickjacking issue, which we finally resolved with this setting.
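As a quick reference, the three variants look like this in an Nginx server block (the origin in the ALLOW-FROM line is just a placeholder, and note that ALLOW-FROM is not supported by all browsers):

    # Forbid framing entirely, even by pages on the same domain
    add_header X-Frame-Options "DENY";

    # Allow framing only by pages from the same origin (the value used in the Laravel config)
    add_header X-Frame-Options "SAMEORIGIN";

    # Allow framing only from one specific origin (placeholder URI; limited browser support)
    add_header X-Frame-Options "ALLOW-FROM https://example.com/";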

2. add_header X-XSS-Protection "1; mode=block";

XSS (cross-site scripting) is a common class of web attack. This header tells the browser whether to enable its built-in XSS filter for the current page: 1 enables the filter, and mode=block tells the browser to stop rendering the entire page once an XSS attack is detected.
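For reference, these are the commonly seen values of the header; the Laravel configuration uses the last form:

    # 0: disable the browser's built-in XSS filter
    add_header X-XSS-Protection "0";

    # 1: enable the filter; the browser sanitizes the page when an attack is detected
    add_header X-XSS-Protection "1";

    # 1; mode=block: enable the filter and stop rendering the whole page instead of sanitizing it
    add_header X-XSS-Protection "1; mode=block";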

Reference article: Summary of knowledge points of Prophet XSS Challenge

3. add_header X-Content-Type-Options "nosniff";

This header disables MIME-type sniffing, i.e. the browser's habit of guessing the Content-Type of a response. Because servers frequently fail to configure Content-Type correctly, browsers fall back to inferring the type from the content itself, and attackers can abuse this, for example by crafting a request whose response was meant to be parsed as an image but ends up being parsed as JavaScript.
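The header is only half of the story: the server should also send correct Content-Type values itself so the browser never has to guess. A minimal sketch in Nginx (mime.types ships with Nginx; application/octet-stream is a deliberately conservative fallback):

    # Tell the browser not to sniff MIME types
    add_header X-Content-Type-Options "nosniff";

    # Declare correct Content-Type values on the server side
    include       mime.types;
    default_type  application/octet-stream;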

All three of these defensive headers are practical and worth enabling. Previously our server only used the add_header X-Frame-Options "SAMEORIGIN"; setting.

4. Do not log requests for favicon.ico and robots.txt


 location = /favicon.ico { access_log off; log_not_found off; }
 location = /robots.txt { access_log off; log_not_found off; }

favicon.ico is the site icon: by default it is the small image shown in the browser tab and when the page is bookmarked.

If no favicon is specified in the HTML <head>, the browser requests http://xxx.com/favicon.ico by default. If that file does not exist, the request returns a 404, which is written to both access_log and error_log. Logging these requests is pointless, so it can be switched off.
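If you want to go one step further, a variant sometimes used (it is not part of the official Laravel configuration) also answers a missing favicon with an empty 204 response, so the request never falls through to the PHP 404 handler:

    location = /favicon.ico {
        access_log off;
        log_not_found off;
        try_files $uri =204;  # serve the file if it exists, otherwise return an empty 204
    }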

robots.txt is the file that search-engine spiders (crawlers) fetch. By convention, before a spider crawls a site it first fetches this file to learn which directories and files it should not crawl, so a correctly configured robots.txt is genuinely useful for SEO. There is no real need to log these requests either, and many sites do not even have a robots.txt file.
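Similarly, if no robots.txt file exists on disk, one option (again an addition beyond the official configuration, using a hypothetical @robots_fallback location name) is to let Nginx answer with a minimal permissive robots.txt directly instead of passing the request on to index.php:

    location = /robots.txt {
        access_log off;
        log_not_found off;
        try_files $uri @robots_fallback;  # serve the real file if it exists
    }

    # Fallback used only when no robots.txt file is present: allow all crawlers
    location @robots_fallback {
        default_type text/plain;
        return 200 "User-agent: *\nDisallow:\n";
    }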

These settings are useful for most websites, and they are not specific to Nginx: the equivalent configuration exists for Apache, and if you use some other web server, similar settings are recommended there as well.

Summary

That covers the Nginx configuration officially recommended for Laravel 5.5 and what each of its security and logging directives does. Hopefully it serves as a useful reference when you configure your own server.

