From 773dfee8453d112d800293e155168cbf8cb8e148 Mon Sep 17 00:00:00 2001
From: WeebDataHoarder <57538841+WeebDataHoarder@users.noreply.github.com>
Date: Wed, 16 Apr 2025 10:50:40 +0200
Subject: [PATCH] Rename Why -> Why do this on README

---
 README.md | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/README.md b/README.md
index c7231d7..bba0977 100644
--- a/README.md
+++ b/README.md
@@ -13,7 +13,7 @@ The tool is designed highly flexible so the operator can minimize impact to legi
 [Challenges](CHALLENGES.md#challenges) can be transparent (not shown to user, depends on backend or other logic), [non-JavaScript](#non-javascript-challenges) (challenges common browser properties), or [custom JavaScript](#custom-javascript-wasm-challenges) (from Proof of Work to fingerprinting or Captcha is supported)
 
-See _[Why?](#why)_ section for the challenges and reasoning behind this tool.
+See _[Why do this?](#why-do-this)_ section for the challenges and reasoning behind this tool.
 
 This documentation and go-away are in active development. See [What's left?](#what-s-left) section for a breakdown.
 
@@ -233,7 +233,7 @@ Important notes:
 * Add or modify rules to target specific pages on your site as desired.
 * By default Googlebot / Bingbot / DuckDuckBot / Kagibot / Qwantbot / Yandexbot are allowed by useragent and network ranges.
 
-## Why?
+## Why do this?
 
 In the past few years this small git instance has been hit by waves and waves of scraping. This was usually fought back by random useragent blocks for bots that did not follow [robots.txt](/robots.txt), until the past half year, where low-effort mass scraping was used more prominently.