Entries tagged http

404 with content using basic auth with nginx

Posted on 19 July 2022

When using e.g. Laravel Forge, you can create a section of the website that’s restricted and uses HTTP Basic Access Authentication for access control. When the credentials aren’t entered correctly, the server returns a 401 error.

Basic Authentication prompt
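You can also check the status code from the command line with curl; the host and path below are only placeholders for your own site:

curl -s -o /dev/null -w "%{http_code}\n" https://domain.tld/register
# prints 401 as long as no (or wrong) credentials are sent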

However, when you want to restrict a section that’s part of an application, whatever rules you defined in your /etc/nginx/sites-available/domain.tld.conf now also have to be added to the new location block.

Example

On a staging server, you want to restrict access to /register and /login. Head to the servers/X/sites/Y/security URL by clicking on the server, then selecting a site and clicking on “Security”. These entries create two files in /etc/nginx/forge-conf/domain.tld/server, where ID is the ID you can see in Forge:

  • .htpasswd-ID
  • protected_site-ID.conf
Security Rules in Laravel Forge
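The generated protected_site-ID.conf essentially adds a location block with the two auth_basic directives. Its exact contents come from Forge, so the following is only a sketch:

location /login {
    auth_basic "Restricted Area";
    auth_basic_user_file /etc/nginx/forge-conf/domain.tld/server/.htpasswd-42;
}

Note that nothing in this block tells nginx how to actually serve the request.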

However, when you now navigate to e.g. /login and enter your credentials, you will see a 404 for /login in the debug console. A request has been made and the content was returned, just with the wrong status code. This happens because there is no file called “login” in the web server’s public folder and nginx hasn’t been instructed to use PHP for this particular location.

tl;dr: You need to add this line to the location entry in the .conf files:

try_files $uri $uri/ /index.php?$query_string;

The whole file now looks like this:

location /register {
    auth_basic "Restricted Area";
    auth_basic_user_file /etc/nginx/forge-conf/domain.tld/server/.htpasswd-42;
    try_files $uri $uri/ /index.php?$query_string;
}
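
Afterwards you can verify the fix with curl; user, password and host are placeholders here:

curl -s -o /dev/null -w "%{http_code}\n" -u forge:secret https://domain.tld/register
# should now print 200 instead of 404 (and 401 again for wrong credentials)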

Simple wget crawler for list of files

Posted on 6 August 2015

This script can be helpful to download a set of files from a web server that you don’t have (S)FTP access to. The input file consists of a list of filenames, one name per line.


#!/bin/bash
# first command line argument: file with one filename per line
file="$1"
# base URL the filenames get appended to
WEBSERVER="http://webserver.tld/folder/"
while IFS= read -r line; do
    # build the full URL for the current line
    FULLURL="$WEBSERVER$line"
    # -nc (--no-clobber) skips files that already exist locally
    wget -nc "$FULLURL"
done < "$file"

First, the first command line argument is saved into the variable file and the web server address is stored in WEBSERVER. IFS stands for Internal Field Separator; clearing it for read lets the while loop consume the file line by line without trimming whitespace. Inside the loop, the line that was read is concatenated with the web server address into FULLURL. Then wget is called with -nc (--no-clobber), which skips the download if the file is already present in the current folder. If you only want to check whether the files exist on the server without downloading them, add wget’s --spider option.
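
To use it, save the script under any name, e.g. fetch-files.sh, make it executable and pass the list of filenames as the first argument:

chmod +x fetch-files.sh
printf 'report.pdf\nimage.png\n' > files.txt
./fetch-files.sh files.txt

wget then requests http://webserver.tld/folder/report.pdf and http://webserver.tld/folder/image.png, skipping any file that already exists in the current folder.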

You can find the script on GitHub.