fegeline 5 points ago +5 / -0

When you archive a page, a robot visits the site and downloads the content in order to save a snapshot of it at a particular time. What the site is doing here is trying to identify these robots and block them, so that its articles can't be archived.
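The kind of check described above usually boils down to matching the request's User-Agent header against known archiver bots. A minimal sketch, assuming hypothetical signature strings (the real bots and the site's actual rules may differ):

```python
# Hypothetical User-Agent based blocking, the technique the comment
# describes. The signature strings below are illustrative examples,
# not the site's actual blocklist.
ARCHIVER_SIGNATURES = ["archive.org_bot", "ia_archiver", "archive.today"]

def is_archiver(user_agent: str) -> bool:
    """Return True if the User-Agent matches a known archiving bot."""
    ua = user_agent.lower()
    return any(sig in ua for sig in ARCHIVER_SIGNATURES)

print(is_archiver("Mozilla/5.0 (compatible; archive.org_bot +http://archive.org)"))
print(is_archiver("Mozilla/5.0 (Windows NT 10.0; rv:115.0) Firefox/115.0"))
```

Because the check keys off a self-reported header, any archiver that sends an ordinary browser User-Agent slips straight through, which is part of why this kind of blocking is unreliable.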

Apparently they're not very good at it, since humans appear to be getting blocked too. Also, there is no foolproof way to block archiving of a site, because a site can be archived in billions of different ways.