Basic Web Authentication -- 101
By Anthony Thyssen

    Using encryption on the Internet is the equivalent of arranging an
    armored car to deliver credit-card information from someone living in a
    cardboard box to someone living on a park bench.
                                        -- Gene Spafford

    The equivalent of an armored car should always be used to protect any
    secret kept in a cardboard box.
                                        -- Anthony Thyssen, On the use of Encryption

Original source of page...
    https://antofthy.gitlab.io/info/www/passwd_protect_101.txt

-------------------------------------------------------------------------------
0/ WARNING...

Ensure authentication happens AFTER any automatic redirection from HTTP to
HTTPS.  This MUST be checked using low level web tools such as "curl" and
"wget".  Authenticating using HTTP results in your web password being sent in
the clear across the network.  This is looked at more closely in step 4...
Do not skip this important step.

Also, be warned that ".htaccess" files of a sub-directory can abort or
override some ".htaccess" directives in a parent directory.  This depends on
many factors, and while some directives may be affected, other directives may
not.  It is very confusing.  As such, what works to protect the top-level
directory may not work for a direct attempt to access a file in a
sub-directory that has its own ".htaccess" file.  This is why it MUST be
tested, not just at the top level, but within sub-directories too.

Some form of automated penetration testing should be an integral part of any
secure web page design.  Consider adding some right at the start...
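For example, a minimal check along these lines could be scripted and run
against every protected directory (a sketch only; the host name, directory,
and file names are placeholders you will need to adjust for your own site):

   # The plain HTTP URL must redirect (301/302 with a "Location: https://...")
   # and must NOT ask for authentication (401) or serve the content (200).
   curl -sI http://example.com/passwd_prot/ | head -n 5
   curl -sI http://example.com/passwd_prot/sub_dir/file.html | head -n 5
   # A "401 Unauthorized" here means the password was requested over HTTP!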

1/ Create the directory and its index.html file

For example...

   cd ~/public_html       # or wherever you serve web pages
   mkdir passwd_prot
   chmod 755 passwd_prot
   cd passwd_prot
   echo "<html><body>
   <h1>Password Protected</h1>
   </body></html>" > index.html
   chmod 644 index.html

2/ Create a password file ".htpasswd"

For example...

   htpasswd -c -5 .htpasswd username
   chmod 644 .htpasswd

The "-c" (create) flag will create the file if it does not exist, while "-5"
specifies the SHA-512 hashing function.

Starting the filename with ".ht" generally ensures apache web servers will
not serve the file, just as they will not serve ".htaccess" files.  However,
just because the password file can exist in the web area does not mean it is
a good idea to place it there.  Nginx servers, for example, do not protect
such files.  Placing the password file somewhere completely outside the
publicly accessible directory tree is a far better idea.  Also, as it is
outside the public directory tree, you can call the file anything you like.
However, be sure the web server itself can still read it (that is, the file
and its directory path have appropriate read/access permissions).

The file has lines of the form...

   username:passwd_crypt

Where "passwd_crypt" is the password hash, which can not be decrypted.

You can add more users to the file by adding more lines to the same file.
Or you can just add one user that everyone can use, though that is not
recommended.  If shared, that password should be changed often.

For more information on password encryptions see
   https://httpd.apache.org/docs/2.4/misc/password_encryptions.html
or type
   htpasswd --help

You can also add more ':' separated fields to the ".htpasswd" file for other
uses, for example to store the user's full name, or special flags giving
other access rights.  Though the use of such 'extra fields' will depend on
the application, which will have to read the file itself.

It is also possible to set up a ".htgroup" file, to group the users, to say
define administrators or other status, but again it is up to the application
programs to make use of that information (a small example is sketched below).

ASIDE: The ".htpasswd" file can also be generated and updated using the
"ansible" task 'community.general.htpasswd'.
3/ Enable Authentication using the password file...

For Apache...  Create a ".htaccess" file (if not already present), or adjust
the server (vhost) setup (requires system administration privileges).

   SSLRequireSSL
   AuthName "Password Protected Directory"
   AuthUserFile /path/to/passwd/file/.htpasswd
   AuthType Basic
   Require valid-user

Make sure the server can read these files, or no one will be able to access
anything in the directory (a fallback security measure by apache)!

   chmod 644 .htaccess

Note: Some browsers (google-chrome) do NOT display the "AuthName" you
provided above, but just the server it is on!  Argghh...

WARNING: The "SSLRequireSSL" enforces the requirement that HTTPS is in use.
However it also means that if a client tries to use HTTP, they will receive a
horrible "403 Forbidden" error.  See next...

4/ Force Secure Communications

You must ensure the page is only accessed via the HTTPS (encrypted) protocol,
otherwise passwords will be sent across the network 'in the clear'.  This
must be done BEFORE the server even requests a password for that directory,
but that may not always happen!

Rather than use "SSLRequireSSL", the better way is to automatically redirect
clients using HTTP to HTTPS, globally.

WARNING: Some browsers will send the password they have for a web site even
before that password is requested by the web site.  If the URL being used is
HTTP then that password will be sent in the clear!  This is a major security
hole that is being fixed in most browsers, but it is not something that a web
server can fix.  Just ensure your site always redirects to HTTPS, that way
any links or bookmarks for your site will also use HTTPS.

--- Method 1, Server Configuration...

The best solution is to modify the web server's configuration files.  But
that requires system administration privileges, which you may not have.

That is, something like this is added to the appropriate HTTP protocol,
VirtualHost, Directory, or Location, configuration...

For example, using a VirtualHost to redirect everything for the server to use
HTTPS.  NOTE that this requires hardcoding the server's name!

   NameVirtualHost *:80
   <VirtualHost *:80>
      ServerName www.example.com
      Redirect / https://secure.example.com/
   </VirtualHost>

OR using mod_rewrite for a specific directory...

   RewriteEngine On
   RewriteCond %{HTTPS} !=on
   RewriteRule . https://%{SERVER_NAME}%{REQUEST_URI} [R,L]

Doing this on the server ensures the HTTP to HTTPS redirection occurs BEFORE
the per-directory ".htaccess" authentication is even attempted.  The request
will also be completely preserved during the redirect to HTTPS.

The above directives will work in a ".htaccess" file, BUT, doing it that way
results in authentication happening BEFORE the HTTP to HTTPS redirection,
making it rather pointless.  That is, passwords can be sent in the clear
across the network!  Thus adding the above to ".htaccess" is not a good
solution.

--- Method 2, URL Rewrite Redirection...

By adjusting the "require" handling in the ".htaccess" file, you can get the
authentication to abort when the directory is accessed via the HTTP protocol.
That way any redirection in the ".htaccess" can happen without authentication
taking place.  After redirection has finished, the normal authentication can
continue.

   # Add to the previous authentication code, to abort if HTTP is used
   Require expr "%{ENV:HTTPS} !~ /on/i"

   # Redirect HTTP to HTTPS
   RewriteEngine On
   RewriteCond %{HTTPS} !=on
   RewriteRule ^ https://%{SERVER_NAME}%{REQUEST_URI} [R=301,L]

See below for a full ".htaccess" example file.

--- Method 3, Hack to redirect "Forbidden Error"...

This is an OLDER hack for a pure ".htaccess" HTTPS redirection.  It uses the
"403 Forbidden" error from an SSLRequireSSL directive to redirect clients to
HTTPS before authentication takes place.  But it also requires hardcoding the
URL path into the configuration.

Adjust and add something like the following to your ".htaccess"
("example.com" should be your server's DNS name).

   SSLOptions +StrictRequire
   SSLRequireSSL
   SSLRequire %{HTTP_HOST} eq "example.com"
   ErrorDocument 403 https://example.com/password-protected-directory/

What this does is redirect a client to HTTPS if a "403 Forbidden" error is
generated...  Presumably by the "SSLRequireSSL".

However it is a hack.  It does not preserve any sub-page components of the
calling URL when redirecting the request.  And it will also do the same thing
for any 'Forbidden' errors that may pop up, such as those caused by file
permissions or directory access restrictions.

-------------------------------------------------------------------------------
Optional but recommended to be added...

These should be looked at for non-password protected directories too, but
they become critical for directories involving applications and executables.

5/ Disable Auto-Indexing

The top level directory should have an "index.html" or "index.php" or
equivalent to prevent directory listing.  BUT what about the sub-directories?
EG: "css/" "images/" "cache/" etc.

We recommend you completely disable auto-indexing in the ".htaccess" file,
just in case it is enabled in upper level web directories, or by the server.

Of course auto-indexing is useful for file servers, so clients can get a
listing of all files without you needing to update the "index.html" file all
the time.  It is better to keep it disabled, and only enable it explicitly in
directories where you want to generate a directory listing (see the sketch at
the end of this section).

   # Disable Auto-Indexing
   Options -Indexes

You may also want to disable other options you may not be using, like
ExecCGI, SSI Includes, or following symbolic links.

   # Disable all options
   Options None

Here is a list of common Apache options used (or inherited)...

   Indexes               Generate a directory index if no index.html
   FollowSymLinks        Follow symbolic links as normal
   SymLinksIfOwnerMatch  Allow symbolic link if owners of link/file match
   ExecCGI               Execute .cgi files present in the directory
   Includes              Full SSI (server side includes)
   IncludesNoExec        Use includes, but don't execute anything
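For example, if one sub-directory really is meant to be a browsable file
store, indexing could be explicitly re-enabled just for it (a sketch only;
"downloads" is a hypothetical sub-directory name, and this assumes the server
permits "AllowOverride Options" for ".htaccess" files):

   # "downloads/.htaccess" -- allow a generated file listing here only
   Options +Indexes
   IndexOptions FancyIndexing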
6/ Disable access to specific files...

This is provided by default by the apache web server for files that start
with ".ht", like ".htaccess" and ".htpasswd".  BUT it should be extended to
all 'dot' files, and any other sensitive files that you may have in the
secure directory, like configuration files.

   # Standard apache configuration already restricts ".ht*" file access
   # <Files ".ht*">
   #    Require all denied
   # </Files>

   # Restrict access to all 'dot' files
   <FilesMatch "^\.">
      Require all denied
   </FilesMatch>

   # Deny access to files with specific suffixes...
   <FilesMatch "\.(bak|ini|log|sql)$">
      Require all denied
   </FilesMatch>

   # And other sensitive files you may have
   <Files "config.php">
      Require all denied
   </Files>

Note that apache Auto-Indexing will also remove the denied files and
directories from its generated file list, but you should not rely on this, or
even allow auto-indexing of a secure directory anyway, unless absolutely
necessary (for file serving).

You may also like to consider disabling server access to PHP files that
should not be directly referenced by clients (including php files that are
included by PHP itself).  This can be easily forgotten, especially php
configuration files or ones holding security secrets.

The best idea is to move all PHP includes completely outside the 'public'
accessible area.  That way only the top 'public' PHP file is served from the
'public' area.  This is a part of 'Laravel' configuration, if its security
setup is followed properly.  (A sketch of denying direct access to include
files is given at the end of this section.)

Ensuring the UNIX file permissions on restricted files are not world
readable/accessible is also a good idea.  BUT you should NOT rely on file
permissions to prevent such access!  File permissions can and do get reset by
accident, especially during file transfers, or during installation on new
systems.
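If the PHP includes can not be moved out of the public area, one alternative
is to deny direct client access to them by name.  This sketch assumes, purely
as an example, that include files follow a ".inc.php" naming convention:

   # Deny direct client access to PHP include files
   # (assumes a ".inc.php" naming convention for include files)
   <FilesMatch "\.inc\.php$">
      Require all denied
   </FilesMatch>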
7/ Disable access to sub-directories...

You want to prevent users downloading things from directories used for the
purposes of...

   Administration
   Database stores
   Session keys and information
   Cached data
   Temporary files
   Version control management (".cvs", ".svn", ".git", "CVS")

In the parent directory you will want to deny access to files like
".htaccess" and ".htpasswd".  NOTE: The "Apache" web server does this
server-wide already using the glob pattern ".ht*", but the "Nginx" web server
does not, though it probably should!

Generally you want to deny access to 'dot' files and directories.  However
there can be exceptions.  For example ".well-known" is a sub-directory that
is used for special download files that prove you have write access to the
web server, for the purposes of SSL certificates (EG: using "certbot"), and
this may need to remain accessible.

   # Restrict access to sub-directories (especially 'dot' sub-directories)
   # But not ".well-known" for SSL Certificate Renewal.
   RewriteEngine On
   RewriteRule "(^|/)\.(?!well-known)" - [F]
   RewriteRule ^cache/ - [F]

   # Or return a 404 instead (note this would also block ".well-known")
   RedirectMatch 404 /\..*$

Or, in the server configuration (the <DirectoryMatch> container is not
allowed in ".htaccess" files)...

   # Deny access to specific source repository sub-dirs
   <DirectoryMatch "/\.(git|svn)/">
      Require all denied
   </DirectoryMatch>
   <DirectoryMatch "/CVS/">
      Require all denied
   </DirectoryMatch>

Or redirect (error) the directory and files in "cache"...

   RedirectMatch 403 ^/path/to/parent_directory/cache/?$
   RedirectMatch 403 ^/path/to/parent_directory/cache/.*$

An alternative is to add ".htaccess" files into each sub-directory to deny or
limit access to the files within.  For example this ".htaccess" file denies
all access into its sub-directory (using the older apache 2.2 access
syntax)...

   Options None
   Order Allow,Deny
   #Allow from localhost
   #Allow from 192.168.
   Deny from all

Or limit access to specific file suffixes in the directory with this
".htaccess" file...

   # only serve images from this sub-directory
   Order Allow,Deny
   Deny from all
   <FilesMatch "\.(gif|jpg|jpeg|png)$">
      Order Deny,Allow
      Allow from all
   </FilesMatch>

You can also use file permissions to restrict access.  However that will also
restrict access for applications being run by the web server (PHP, CGI, etc).
And again, you should not rely on file permissions, as these could be
accidentally reset when the code is copied or transferred to other places.
Using file permissions with web servers is not a good security measure.

8/ OPTIONAL: Limit access to your images, so only YOUR pages can use them.

Prevent your images being used by other web servers on their pages.  That is,
stop the images being used on other web sites, with you paying the cost of
the disk and networking to serve those images!  Something known as "image
hotlinking".

NOTE: This does not prevent users accessing and copying the images, by
'faking' the referrer link information, but it does stop their general use in
web pages on other web sites, forcing them to at least download and host
their own copy of the images for their own pages.

Basically you limit access to images to only those requests with a referrer
that matches pages served by your server, so that in general they can only be
used on web pages you also served.

   RewriteEngine On
   RewriteCond "%{HTTP_REFERER}" "!https?://example\.com" [NC]
   RewriteRule "\.(gif|jpg|jpeg|png)$" "-" [F,NC]

Rather than deny, you could serve a different (denied/broken) image.

   RewriteRule "\.(gif|jpg|jpeg|png)$" "/images/go-away.png" [R,NC]

Unfortunately this hard codes the server name into the ".htaccess" file, as
you can NOT use a server %{} variable in the pattern (conditional) part.  As
such the following does NOT work, as it will NEVER match!

   RewriteEngine On
   RewriteCond "%{HTTP_REFERER}" "!https?://%{SERVER_NAME}" [NC]
   RewriteRule "\.(gif|jpg|jpeg|png)$" "-" [F,NC]

An alternative is to use the 'SetEnvIf' module (if installed).  But it still
requires a hard coded server name, somewhere.

   SetEnvIf Referer "example\.com" localreferer
   <FilesMatch "\.(gif|jpg|jpeg|png)$">
      Require env localreferer
   </FilesMatch>
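You can check that the hotlink protection behaves as expected using "curl"
and its "--referer" (-e) option (a sketch only; the image URL and referring
pages are placeholders):

   # A foreign referrer should now be denied (expect "403 Forbidden")...
   curl -sI -e "https://other-site.example/page.html" \
        https://example.com/images/photo.png | head -n 1
   # ...while a referrer from your own pages should still work (expect "200 OK").
   curl -sI -e "https://example.com/index.html" \
        https://example.com/images/photo.png | head -n 1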
-------------------------------------------------------------------------------
Full Example

Example of a full ".htaccess" file for a password protected directory...
This must be modified to suit the specific situation you are using it for.

=======8<--------CUT HERE----------
#
# ".htaccess" web server control file
#
# Server options, common Options are
#   Indexes               Generate a directory index if no index.html
#   FollowSymLinks        Follow symbolic links as normal
#   SymLinksIfOwnerMatch  Allow symbolic link if the link and file owners match
#   ExecCGI               Execute .cgi files for the resulting document
#   Includes              Full server side includes
#   IncludesNoExec        Server side includes, but don't execute anything
#
# Options IncludesNoExec FollowSymLinks ExecCGI
#
# Disable Auto-Indexing (and all other options if they are not required)
Options None

# Authentication
AuthName "Password Protected Directory"
AuthUserFile /path/to/passwd/file/.htpasswd
AuthType Basic
Require expr "%{ENV:HTTPS} !~ /on/i"
Require valid-user

# Redirect HTTP to HTTPS
RewriteEngine On
RewriteCond %{HTTPS} !=on
RewriteRule ^ https://%{SERVER_NAME}%{REQUEST_URI} [R=301,L]

# Restrict access to sub-directories (especially 'dot' sub-directories)
RewriteEngine On
RewriteRule "(^|/)\.(?!well-known)" - [F]
RewriteRule ^cache/ - [F]
RedirectMatch 404 /\..*$

# Deny access to specific source repository sub-dirs (dotfile or otherwise)
# (<DirectoryMatch> containers like these only work in the server
#  configuration, not in ".htaccess"; the RewriteRule above already blocks
#  'dot' sub-directories here)
#<DirectoryMatch "/\.(git|svn)/">
#   Require all denied
#</DirectoryMatch>
#<DirectoryMatch "/CVS/">
#   Require all denied
#</DirectoryMatch>

# Restrict access to all 'dot' files, (not just ".ht*", ".git" etc...)
<FilesMatch "^\.">
   Require all denied
</FilesMatch>

# Deny access to files with specific suffixes...
<FilesMatch "\.(bak|ini|log|sql)$">
   Require all denied
</FilesMatch>

# And other sensitive files you may have
<Files "config.php">
   Require all denied
</Files>
=======8<--------CUT HERE----------
-------------------------------------------------------------------------------
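Finally, as suggested in the warning at the top of this page, simple checks
like the following can be collected into a script and run after every change
(a sketch only; the host name, directory, and test paths are placeholders to
adjust for your own site):

   #!/bin/sh
   # Rough penetration check for a password protected directory.
   # Every URL below should return a redirect (301), "401 Unauthorized",
   # "403 Forbidden" or "404 Not Found" -- never "200 OK" without a password.
   site=https://example.com/passwd_prot
   for url in \
       http://example.com/passwd_prot/ \
       $site/.htaccess \
       $site/.htpasswd \
       $site/.git/config \
       $site/cache/ \
       $site/sub_dir/index.html
   do
      printf '%-50s ' "$url"
      curl -s -o /dev/null -w '%{http_code}\n' "$url"
   done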