As I understand it, and I don't claim to be an expert, web crawler bots automatically scan web pages, for example to tell search engines what is out there. Pages are revisited periodically to check for changes, and this has been going on for many years. It may be that such visits are now better detected than they used to be, so they show up in website statistics, for example as 'guests'. If a page exists and access is not somehow restricted (as in the "dark web"), bots will find it and analyse it.

This is a feature of the web; it's not necessarily sinister (though some bots are probably up to no good), and it's something we cannot avoid, like pop-ups and requests to accept or decline cookies...
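If you're curious whether those 'guests' really are crawlers, the usual giveaway is the User-Agent string in the server's access log. Here's a minimal sketch of that idea, assuming a common Apache/Nginx-style "combined" log format and a placeholder file name ("access.log"); the list of bot signatures is just a handful of well-known examples, not anything exhaustive:

```python
import re
from collections import Counter

# Substrings that appear in the User-Agent headers of some well-known
# crawlers. Real logs will contain many more, including obscure ones.
KNOWN_BOTS = ["Googlebot", "bingbot", "DuckDuckBot", "Baiduspider", "YandexBot"]

# In the combined log format, the User-Agent is the last quoted field.
UA_PATTERN = re.compile(r'"([^"]*)"\s*$')

def tally_bots(log_path: str) -> Counter:
    """Count hits per known crawler found in the given access log."""
    counts = Counter()
    with open(log_path, encoding="utf-8", errors="replace") as log:
        for line in log:
            match = UA_PATTERN.search(line)
            if not match:
                continue
            user_agent = match.group(1)
            for bot in KNOWN_BOTS:
                if bot in user_agent:
                    counts[bot] += 1
                    break
    return counts

if __name__ == "__main__":
    # "access.log" is a placeholder; point it at your own server's log.
    for bot, hits in tally_bots("access.log").most_common():
        print(f"{bot}: {hits}")
```

Note that the User-Agent is self-reported, so a well-behaved crawler announces itself while a misbehaving one can claim to be an ordinary browser; a tally like this only catches the honest ones.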