And by Local Network, I mean the network here at my workstation, not the one at the peer or at the VPN docker/server/provider end.
I am not very knowledgeable about VPNs, but I got to learn something new, and when I do, I make a note of it. The problem was that whenever my VPN was active so I could poke at things at home, I could not print documents at my workstation at work.
I googled a lot, tried a few things and then realised that adding IP addresses to AllowedIPs in the [Peer] section adds an exception on the server's side, NOT on my workstation (there's a config sketch a little further down).
“Ooooh, what does this checkmark do?”
Confusingly enough, WireGuard names things differently between the Windows and the iOS app. So here’s what you need to check to gain access to your workstation’s local network:
Open the WireGuard control panel.
Click once on the VPN you want to change
Click the EDIT button on the bottom right
iOS: UN-Tick the box on the bottom left that says: “Exclude private IPs”, then click SAVE
Windows: UN-Tick the box on the bottom left that says: “Block untunneled traffic (kill-switch)”, then click SAVE
Yes, this poses a security risk, so I made two VPN profiles: one with the box ticked and one without, so I can easily switch from one to the other.
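For reference, here is a minimal sketch of a client-side tunnel config; none of the keys, addresses or names below are real. The AllowedIPs line under [Peer] is what decides which destination addresses this device routes into the tunnel, and the checkboxes above are, as far as I can tell, the apps' shortcuts for tweaking exactly that routing without editing the config by hand.
[Interface]
# Placeholder values only
PrivateKey = <workstation-private-key>
Address = 10.8.0.2/24

[Peer]
PublicKey = <home-endpoint-public-key>
Endpoint = vpn.example.com:51820
# Send ALL traffic through the tunnel:
AllowedIPs = 0.0.0.0/0
# ...or only send the remote (home) subnet through it, leaving the
# local office network and its printer alone:
# AllowedIPs = 192.168.178.0/24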
Just another snippet of code that can be implemented somewhere: a little script that hands ownership of the /var/www folder to a user of your choosing (with the www-data group).
set-perms.sh
#!/bin/bash

# Function to change ownership of the www folder
change_ownership() {
    sudo chown -R "$1":www-data /var/www
    echo "Ownership of the www folder has been set to $1:www-data."
}

# Loop until a valid username is provided
while true; do
    # Prompt the user to enter the desired username
    read -p "Enter the username for permissions: " username

    # Check if the username provided exists
    if id "$username" &>/dev/null; then
        change_ownership "$username"
        break # Exit the loop if a valid username is provided
    else
        echo "Error: User $username does not exist."
    fi
done

echo "done!"
So a new release came out and it is important to get this update as soon as possible! This manual is a transcript of the way that I have updated my Mastodon instance. Please make sure you make proper backups and use your brain while updating things.
A guide to making a Mastodon backup can be found here.
Linux flavour: Debian
Update from: 4.2.xx
Log into your server
su - mastodon
cd /home/mastodon/live
git fetch --tags
git checkout [type the most recent version here, starting with the letter v; for example, v4.2.5]
git checkout v4.2.10
If you get a Ruby version error, please see the bottom of this article for a fix!
bundle install
yarn install
RAILS_ENV=production bundle exec rails db:migrate
#NOTE: You might get a Ruby error which suggests entering the command "bundle install". Do that and then run the RAILS command again.
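In practice, that recovery sequence is simply:
bundle install
RAILS_ENV=production bundle exec rails db:migrate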
My system was unable to find the required Ruby v3.2.3, and I fixed this with the following steps:
Please make sure that your path is correct.
git -C ~/.rbenv/plugins/ruby-build pull
rbenv install 3.2.3
*WAIT TILL DONE* (it may take a little while)
To check all the installed versions type:
rbenv versions
To set v3.2.3 as the global version, type:
rbenv global 3.2.3
To double-check the active, installed version, type:
rbenv versions
Done!
Original post: https://3xn.nl/projects/2023/09/20/crude-solution-to-ban-bots-by-their-user-agent/
I’ve very much simplified the script that instantly redirects unwanted traffic away from the server. Currently, I am using a very cheap VPS to receive all that traffic.
Here ya go:
<?php
// CC-BY-NC (2023)
// Author: FoxSan - fox@cytag.nl
// This is a functional but dirty hack to block bots, spiders and indexers by looking at the HTTP USER AGENT.
// Traffic that meets the conditions is being yeeted away to any place of your choice.
//////////////////////////////////////////////////////////////

// Emergency bypass
// goto end;

//////////////////////////////////////////////////////////////
// Attempt to basically just yeet all bots to another website
$targetURL = "https://DOMAIN.TLD/SUB/";

// Function to check if the user agent appears to be a bot or spider
function isBot()
{
    // Fall back to an empty string when no user agent was sent at all
    $user_agent = $_SERVER['HTTP_USER_AGENT'] ?? '';
    $bot_keywords = ['bytespider', 'amazonbot', 'MJ12bot', 'YandexBot', 'SemrushBot', 'dotbot', 'AspiegelBot', 'DataForSeoBot', 'DotBot', 'Pinterestbot', 'PetalBot', 'HeadlessChrome', 'GPTBot', 'Sogou', 'ALittle Client', 'fidget-spinner-bot', 'intelx.io_bot', 'Mediatoolkitbot', 'BLEXBot', 'AhrefsBot'];
    foreach ($bot_keywords as $keyword) {
        if (stripos($user_agent, $keyword) !== false) {
            return true;
        }
    }
    return false;
}

// Check if the visitor is a bot or spider
if (isBot()) {
    // yeet
    header("Location: $targetURL");
    // Exit to prevent further processing
    exit;
}

end:
// If the visitor is not a bot, spider, or crawler, continue with your website code.
//////////////////////////////////////////////////////////////////////
?>
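One way to use it is to paste the snippet at the very top of the site's entry script, or to pull it in from there; something like this (the filename is made up, use whatever you saved it as):
<?php
// Run the bot check before the page outputs anything,
// otherwise the Location header can no longer be sent.
require __DIR__ . '/bot-redirect.php';

// ...the rest of the site's code only runs for non-bot visitors...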
Here’s a list of stuff that I have in my .htaccess files on various websites.
I want to work on my website, but any other visitor should be booted to another website so I can work in peace. Sidenote: it's been forever since I last used this, so it might work. Or not.
---
# YOUR IP address goes here:
RewriteCond %{REMOTE_ADDR} !^000\.000\.000\.000$
# And provides you access to:
RewriteCond %{REQUEST_URI} !^https://DOMAIN.TLD$ [NC]
# Fine, go have all the media as well
RewriteCond %{REQUEST_URI} !\.(jpg|jpeg|png|gif|svg|swf|css|ico|js)$ [NC]
# Any other visitor can go visit the following website:
RewriteRule .* https://DOMAIN.TLD/ [R=302,L]
# Hey, no viewing access to this file
<FilesMatch "^.ht">
Order deny,allow
Deny from all
</FilesMatch>
# Disable Server Signature
ServerSignature Off
# SSL all the things!
RewriteCond %{HTTPS} !=on
RewriteRule ^/?(.*) https://%{SERVER_NAME}/$1 [R,L]
# No WWW
RewriteCond %{HTTP_HOST} ^www\.DOMAIN\.TLD$
RewriteRule ^/?$ "https\:\/\/DOMAIN\.TLD\/" [R=301,L]
# Do we like Symlinks? Yeah we do.
Options +FollowSymlinks
# No open directories or directory listings. What is this... 1998?
Options All -Indexes
IndexIgnore *
# Rewrite rules to block out some common exploits.
RewriteCond %{QUERY_STRING} base64_encode[^(]*\([^)]*\) [OR]
RewriteCond %{QUERY_STRING} (<|%3C)([^s]*s)+cript.*(>|%3E) [NC,OR]
RewriteCond %{QUERY_STRING} GLOBALS(=|\[|\%[0-9A-Z]{0,2}) [OR]
RewriteCond %{QUERY_STRING} _REQUEST(=|\[|\%[0-9A-Z]{0,2})
RewriteRule .* index.php [F]
# PHP doohickies
php_flag register_globals off
php_flag safe_mode off
php_flag allow_url_fopen off
php_flag display_errors off
php_value session.save_path '/tmp'
php_value disable_functions "exec,passthru,shell_exec,system,curl_multi_exec,show_source,eval"
# File Injection Protection, or a code-condom. What.
RewriteCond %{REQUEST_METHOD} GET
RewriteCond %{QUERY_STRING} [a-zA-Z0-9_]=http:// [OR]
RewriteCond %{QUERY_STRING} [a-zA-Z0-9_]=(\.\.//?)+ [OR]
RewriteCond %{QUERY_STRING} [a-zA-Z0-9_]=/([a-z0-9_.]//?)+ [NC]
RewriteRule .* - [F]
# /proc/self/environ? Go away!
RewriteCond %{QUERY_STRING} proc/self/environ [NC]
RewriteRule .* - [F]
# Disallow Access To Sensitive Files. Enter your own file names.
RewriteRule ^(htaccess.txt|configuration.php(-dist)?|joomla.xml|README.txt|web.config.txt|CONTRIBUTING.md|phpunit.xml.dist|plugin_googlemap2_proxy.php)$ - [F]
# Don't allow any pages to be framed - defends against clickjacking
<IfModule mod_headers.c>
Header set X-Frame-Options SAMEORIGIN
</IfModule>
# Uh. I forgot.
<IfModule mod_autoindex.c>
IndexIgnore *
</IfModule>
# NO SNIFFYWIFFY OwO
<IfModule mod_headers.c>
Header always set X-Content-Type-Options "nosniff"
</IfModule>
# NEEDS TESTING
# Turn on IE8-IE9 XSS prevention tools
#Header set X-XSS-Protection "1; mode=block"
# NEEDS TESTING TOO
# Only allow JavaScript from the same domain to be run.
# Don't allow inline JavaScript to run.
#Header set X-Content-Security-Policy "allow 'self';"
# Example if you don't like Russia and Turkey (Optional A1 is to block anonymous proxies)
RewriteCond %{ENV:GEOIP_COUNTRY_CODE} ^(RU|TR)$
RewriteRule .* https://DOMAIN.TLD/directorywithindexdothtml/ [R=302,L]
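About that optional A1: the legacy GeoIP databases report anonymous proxies under the country code A1, so if your GeoIP module follows that convention, the variant would look like this:
RewriteCond %{ENV:GEOIP_COUNTRY_CODE} ^(RU|TR|A1)$
RewriteRule .* https://DOMAIN.TLD/directorywithindexdothtml/ [R=302,L]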
Okay, this is a very crude way to block bots, spiders and crawlers by their user agent, but so far, it has been very, very efficient.
Even when one chooses "yes", the question will be repeated. This is not a problem, because no one in their right mind is going to add "bot", "spider" or "crawler" to their user agent.
So here's the PHP script that I rammed into a certain website to prevent it from being DDoSed by (malicious) bots.
<?php
// CC-BY-NC (2023)
// Author: FoxSan - fox@cytag.nl
// This is a functional but dirty hack to block bots, spiders and indexers by looking at the HTTP USER AGENT.
// The form is, iirc, not even working, but that's fine if you only want human visitors.
// It can also throw a 403, but the effect is the same.
////////////////////////////////////////////////////////////////////////////////

// Emergency bypass
// goto end;

////////////////////////////////////////////////////////////////////////////////
// Function to check if the user agent appears to be a bot or spider.
// Enter the bots you would like to block in a list as shown below.
function isBot()
{
    // Fall back to an empty string when no user agent was sent at all
    $user_agent = $_SERVER["HTTP_USER_AGENT"] ?? "";
    $bot_keywords = [
        'bytespider',
        'amazonbot',
        'MJ12bot',
        'YandexBot',
        'SemrushBot',
        'dotbot',
        'AspiegelBot',
        'DataForSeoBot',
        'DotBot',
        'Pinterestbot',
        'PetalBot',
        'HeadlessChrome',
        'AhrefsBot',
    ];
    foreach ($bot_keywords as $keyword) {
        if (stripos($user_agent, $keyword) !== false) {
            return true;
        }
    }
    return false;
}

// Check if the visitor is a bot or spider
if (isBot()) {
    // This visitor appears to be a bot or spider, so display a choice.
    // Check if the choice form is submitted
    if (isset($_POST["submit"])) {
        // Check the choice made by the visitor
        $choice = isset($_POST["choice"]) ? $_POST["choice"] : "";
        if ($choice === "yes") {
            // User selected "Yes," block access
            echo "Access denied. If you believe this is an error, please contact us by writing the word [MAILBOX] before the at sign, followed by [DOMAIN.TLD]";
        } elseif ($choice === "no") {
            // User selected "No," proceed to end
            goto end;
        }
    } else {
        // Output the message to the user and make the choice mandatory
        echo "Your user agent suggests you might be a bot, spider, or crawler. Are you one of these three?";
        // Output the radio button choices within a form
        echo '</p>
        <form method="post" action="">';
        echo ' <label><input type="radio" name="choice" value="yes" required>Yes</label>';
        echo ' <label><input type="radio" name="choice" value="no">No</label>';
        echo ' <button type="submit" name="submit">Proceed</button>';
        echo "</form>
        <p>";
    }
    // Exit to prevent further processing
    exit();
}

end:
// Original website code starts from here.
/////////////////////////////////////////////////////////////
?>
I have installed CloudPanel and the new website caused a "Too many redirects" error. This is because my SSL certificates are controlled by a proxy, which can cause some confusion between the two systems, especially since CloudPanel also installs its own certificates.
CloudPanel can also install a Let's Encrypt certificate, but that only works in more conventional setups. Mine goes through DNS to a proxy that listens on a certain IP address, and that proxy forwards the request to a virtual machine on one of my servers.
So, here is my probably unconventional method of disabling the SSL certificates on my CloudPanel installation:
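In a nutshell, it comes down to commenting out the SSL bits in the site's nginx vhost. A rough sketch of the idea; the certificate paths below are placeholders, not the literal CloudPanel paths:
server {
    listen 80;

    # Disabled: the proxy in front of this VM terminates SSL
    # listen 443 ssl http2;

    server_name DOMAIN.TLD;

    # Disabled: the certificate keys and the paths to them
    # ssl_certificate /path/to/DOMAIN.TLD.crt;
    # ssl_certificate_key /path/to/DOMAIN.TLD.key;

    # Disabled: the forced HTTPS redirect, the proxy already takes care of that
    # if ($scheme != "https") {
    #     rewrite ^ https://$host$request_uri permanent;
    # }

    # ...the rest of the vhost stays exactly as CloudPanel generated it...
}
# (followed by an nginx reload to apply the change)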
Done! Your website should now say “Hello world :-)”
You can see that I have disabled the listen on port 443, the certificate keys and the paths to them, and the forced HTTPS. I chose to switch off the forced HTTPS because my proxy already takes care of that.
This post is subject to change, but this helps you along your way!