What's important here is that this is repeatable. Can we go to the ORIGINAL host of the data you are presenting? My worry is that this means nothing if we can't get the original, CRC-hash-matched dataset from the original source and reproduce these results. I'm genuinely afraid that the source of this dataset will be scrubbed.
Waste of time. This is a clown show. Everything they have ever said has been a lie, all framed by "context" and "meaningful information," all coming from the same arbiters of truth.
Aside from the fact that I repay my debts and don't want my federal dollars footing someone else's bill: I know some people from college who intentionally took out HUGE student loans and lived like kings during school. Some of them ended up working for the state. I asked them why they so vehemently sought public/state jobs, and their answer was that in my state, if you work a public-sector job for 10 years, your student loans are WIPED. Guess who they voted for? C'mon, man.
Some of them are exploiting forbearance and deferral programs to pay back as little as possible before their 10 years are up. It boils my blood. Is my story common? Probably not, but I'm certainly jaded about giving a damn whether someone else's debt gets forgiven with my money, when I pay mine.
What language are you writing it in? We can buy some proxy lists, or use public ones, load them, and rotate through them for each HTTP request or scrape. You could fire off more than one request in a batch if each request goes through a different subsection of the proxy list, continuing down the list until it wraps around. That should alleviate any spam/block concerns.
You could also run the scripts on some cheapo online webservers, like on DigitalOcean, and use their larger bandwidth capacity to do the bigger batches I described previously.
There are options. Whatever language you are writing this in, I HIGHLY recommend taking the time to scrape public proxy lists, or using a list from NordVPN (if possible and if you pay), and doing each single request one IP at a time.
Example: 10 checks use 10 different IPs, so do 10 scrapes at a time, move down the list, then do it again. Keep upping the batch size until it craps out, then back it down. (You will also have to test how much the hosts of the information you are querying can handle.)
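The rotation scheme above can be sketched in Python. This is a minimal illustration only: the proxy addresses are placeholders, `assign_proxies` is a hypothetical helper name, and the actual fetching (with a `requests`-style client) is shown only as a comment since it depends on whatever library you end up using.

```python
import itertools

# Placeholder proxies; in practice, load these from a purchased or
# public proxy list (one "http://host:port" entry per proxy).
PROXIES = [
    "http://10.0.0.1:8080",
    "http://10.0.0.2:8080",
    "http://10.0.0.3:8080",
]

def assign_proxies(urls, proxies, batch_size):
    """Yield batches of (url, proxy) pairs, rotating round-robin
    through the proxy list. Each batch uses a different subsection
    of the list, and the rotation wraps around when it runs out,
    matching the scheme described above."""
    rotation = itertools.cycle(proxies)
    batch = []
    for url in urls:
        batch.append((url, next(rotation)))
        if len(batch) == batch_size:
            yield batch
            batch = []
    if batch:  # emit any leftover partial batch
        yield batch

# Each (url, proxy) pair would then be fetched with something like:
#   requests.get(url, proxies={"http": proxy, "https": proxy}, timeout=10)
# firing one batch at a time, and raising or lowering batch_size
# depending on how the target host responds.
```

Starting with a small `batch_size` and ramping it up until requests start failing, then backing off, is exactly the tune-by-testing approach described above.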
Hes Newsom's gay pet. The only reason he probably didnt go into a lockdown cause he knows his ass is about to get lynched by his constituents... in minecraft.