I ran this command to see if the PDF menu was small enough for my capped internet connection:
$ torsocks curl -LI 'https://cafevanbommel.nl/wp-content/uploads/2023/11/Van-Bommel-Menukaart-November-2023-FOOD.pdf'
HTTP/2 200
date: Tue, 09 Apr 2024 16:01:40 GMT
content-length: 1480
cache-control: no-cache, no-store, must-revalidate, max-age=0
cache-control: no-store, max-age=0
server: imunify360-webshield/1.21
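The whole point of that HEAD request is the content-length line. Scripted instead of eyeballed, the gate would look roughly like this; a minimal sketch assuming Node 18+ with its global fetch (run as an .mjs file), with a made-up byte cap:

// Sketch: HEAD first, only GET if Content-Length fits under the cap.
// The URL is the menu above; CAP is an arbitrary example value.
const url = 'https://cafevanbommel.nl/wp-content/uploads/2023/11/Van-Bommel-Menukaart-November-2023-FOOD.pdf';
const CAP = 5 * 1024 * 1024; // 5 MB

const head = await fetch(url, { method: 'HEAD', redirect: 'follow' });
const len = Number(head.headers.get('content-length'));
if (!len) {
  console.error('no usable Content-Length; not risking it');
} else if (len > CAP) {
  console.error(`too big for my cap: ${len} bytes`);
} else {
  console.log(`fine, fetch it: ${len} bytes`);
}

Which, as it turns out, is exactly the logic this server defeats.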
Reportedly the PDF was only ~1.5 kB, so of course I had no objections. Fetched it with wget, and it was just ASCII text: HTML-wrapped JavaScript. WTF?
<!doctype html>
<html lang="en">
<head>
<meta charset="utf-8">
<meta name="robots" content="noindex, nofollow">
<title>One moment, please...</title>
<style>
body {
background: #F6F7F8;
color: #303131;
font-family: sans-serif;
margin-top: 45vh;
text-align: center;
}
</style>
</head>
<body>
<h1>Please wait while your request is being verified...</h1>
<form id="wsidchk-form" style="display:none;" action="/z0f76a1d14fd21a8fb5fd0d03e0fdc3d3cedae52f" method="GET">
<input type="hidden" id="wsidchk" name="wsidchk"/>
</form>
<script>
(function(){
var west=+((+!+[])+(+!+[]+!![]+!![]+[])+(+!+[]+!![]+!![]+!![]+!![]+!![]+!![])+(+!+[]+!![]+[])+(+!+[])+(+!+[]+!![]+[])+(+!+[]+!![]+!![]+!![])+(+!+[]+!![]+!![]+!![]+!![]+[])),
east=+((+!+[]+!![]+!![]+!![]+!![]+!![])+(+!+[]+!![]+!![]+!![]+!![]+!![]+[])+(+!+[]+!![]+!![]+!![]+!![]+!![]+!![]+!![])+(+!+[]+!![]+!![]+!![]+!![]+!![]+!![]+[])+(+!+[])+(+!+[]+!![]+!![]+[])+(+!+[]+!![]+!![]+!![]+!![]+!![]+!![]+!![]+!![])),
x=function(){try{return !!window.addEventListener;}catch(e){return !!0;} },
y=function(y,z){x() ? document.addEventListener('DOMContentLoaded',y,z) : document.attachEvent('onreadystatechange',y);};
y(function(){
document.getElementById('wsidchk').value = west + east;
document.getElementById('wsidchk-form').submit();
}, false);
})();
</script>
</body>
</html>
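For the curious, the line noise is JSFuck-style number-building: +[] is 0, !+[] is true, so +!+[] is 1, and every appended +!![] adds one more. Tacking +[] onto the end of a term coerces it to a string, so the parenthesized terms concatenate as digits, and the outer unary + turns the digit string back into a number. You can convince yourself in any JS console (my annotations, not theirs):

+[]                            // 0: empty array -> "" -> 0
+!+[]                          // 1: !0 is true, +true is 1
+!+[] + !![] + !![]            // 3: 1 + true + true
(+!+[]) + (+!+[] + !![] + [])  // "12": the trailing +[] makes "2" a string,
                               // so the terms join as digits instead of adding

Decoded (if I counted my !![]s right), west is 13721245 and east is 6687139, so the script just writes 20408384 into the hidden wsidchk field and submits the form. Anything that executes JavaScript sails through; wget and curl get stonewalled.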
To troubleshoot, I loaded the same link in a GUI browser. PDF.js fetched a proper PDF that turned out to be 1.6 MB, so the 1480 bytes from the HEAD request was evidently the challenge page, not the PDF. Fuck this shit. It’s not as bad as some restaurants (~20 MB menus loaded with pics), but it could still have sucked my credit dry because the asshole web dev pulled this stunt. The Content-Length header exists for a reason.
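Since the shield replies 200 with perfectly plausible headers, nothing in the response metadata can be trusted; the only check that holds up is sniffing the payload for the %PDF- magic bytes. Another Node sketch, same assumptions as before:

// Sketch: the status and headers lie, so check what actually arrived.
// Every PDF starts with the magic bytes "%PDF-".
const url = 'https://cafevanbommel.nl/wp-content/uploads/2023/11/Van-Bommel-Menukaart-November-2023-FOOD.pdf';
const res = await fetch(url, { redirect: 'follow' });
const buf = Buffer.from(await res.arrayBuffer());
if (buf.subarray(0, 5).toString('latin1') === '%PDF-') {
  console.log(`real PDF, ${buf.length} bytes`);
} else {
  console.warn('bait and switch: got the challenge page, not the PDF');
}

Of course that only tells you after the bytes are spent. That’s the point: once a middlebox starts answering for the origin, there is no safe way to ask “how big is this?” on a metered connection.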
I wonder to what extent the restaurant’s web admin is just naive about what’s happening here: the “imunify360” in the server header suggests some shitty MitM layer did this without the WordPress user really knowing.
But what’s driving the protectionism? I should be able to, for example, have a scraper bot harvest all the PDF restaurant menus before visiting a region. They should want my business.
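To be concrete, the harvesting I have in mind is this harmless; a hypothetical sketch (Node 18+, run as .mjs) where menuUrls is a list you’d compile yourself and the two-second delay is plain courtesy:

// Hypothetical pre-trip harvester. menuUrls, the output
// filenames, and the delay are all illustrative choices.
import { writeFile } from 'node:fs/promises';

const menuUrls = [
  'https://cafevanbommel.nl/wp-content/uploads/2023/11/Van-Bommel-Menukaart-November-2023-FOOD.pdf',
  // ...the rest of the region's menus...
];

for (const url of menuUrls) {
  const res = await fetch(url, { redirect: 'follow' });
  const buf = Buffer.from(await res.arrayBuffer());
  if (buf.subarray(0, 5).toString('latin1') === '%PDF-') {
    const name = decodeURIComponent(new URL(url).pathname.split('/').pop());
    await writeFile(name, buf);
    console.log(`saved ${name} (${buf.length} bytes)`);
  } else {
    console.warn(`skipped ${url}: got a challenge page instead`);
  }
  await new Promise(r => setTimeout(r, 2000)); // don't hammer anyone
}

Blocking that protects nothing worth protecting; it just guarantees the menu only ever reaches full browsers.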