HocusLocus 2 points ago +2 / -0

So (bless you for taking your time) how is this present level of Cloudflare working? It seems that I -- a user who has logged in maybe a dozen times in TDW's history, whose session cookies are therefore very long-lived, and who has never hit TDW with any script-like behavior -- am forced to conclude, when I go into total bicycle mode, that my session cookies are NOT being whitelisted at the front end.

Does whitelisting strings handed to Cloudflare -- such as the small piece of gobbledygook in my (valid, real-human) session ID -- cost extra money, or is it a limited resource?

From how you describe it, Cloudflare is trying to identify sessions to blacklist. The DDoSers have thousands of zombie sessions, and they start issuing zombie clicks (slow for each, but lots en masse) and, from what I've learned, resource-intensive operations like searches, all at the same time?

Mods are cagey about discussing it, but as an old IT Boomer who invented the Internet (it wuz me, I swear) I'm curious how it's being done, with a thought toward better (especially cheaper) countermeasures.

deleted 3 points ago +3 / -0
HocusLocus 2 points ago +2 / -0

You have explained it VERY well -- OSI layer 4 says it all. Sucks. Without a front end that fully accepts the HTTPS request headers and performs cookie-based whitelisting, there is no way I could easily be declared a 'friendly'.

Now, it just so happens that my public IPv4 source address could identify me as friendly, but I know there are so many massively NATed networks out there that bad actors could hide behind.

I am assuming Cloudflare does hold the customer's TLS private keys and does have the ability to see through TLS if it wanted to, right? Because it has to accept all connections and then hand off the good ones? That would be necessary to issue temporary redirects with Cloudflare junk appended to the ?x=x query string.

When I think of session whitelisting, it goes through stages of increasing desirability:

  • Website provides the Front End with unique whitelist strings. This becomes ridiculous very quickly once there are thousands of logged-in users.
  • Better: Website supplies the Front End with a public cert and 'signs' session IDs with the private key. The Front End no longer needs to 'match' a unique session string for every user; all it is doing is verifying signatures.
  • When I suggest RSA for this purpose, it need not use the insanely large primes we rely on for total cryptographic assurance; even a smaller key length could be effective, especially if Website could rotate the certificates (which are only shared between Website and the Front End) quickly.
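The second stage above can be sketched in a few lines. This is a hypothetical illustration only, and it uses an HMAC shared secret (known only to Website and Front End) in place of the RSA public/private pair the list proposes -- the key property is the same: the Front End verifies a tag attached to the session ID instead of matching against a per-user whitelist. All names here are made up for the sketch.

```python
import base64
import hashlib
import hmac

# Hypothetical secret shared only between Website and Front End;
# in the RSA variant, Website would keep a private key and give the
# Front End only the public half.
SIGNING_KEY = b"rotate-me-often"

def sign_session(session_id: str) -> str:
    """Website side: append a short MAC so the Front End can trust the
    session ID without storing thousands of individual strings."""
    tag = hmac.new(SIGNING_KEY, session_id.encode(), hashlib.sha256).digest()[:12]
    return session_id + "." + base64.urlsafe_b64encode(tag).decode()

def verify_session(token: str) -> bool:
    """Front End side: recompute and compare the MAC in constant time.
    No per-session lookup table is needed -- just the one key."""
    try:
        session_id, b64tag = token.rsplit(".", 1)
        tag = base64.urlsafe_b64decode(b64tag)
    except ValueError:
        return False
    expected = hmac.new(SIGNING_KEY, session_id.encode(), hashlib.sha256).digest()[:12]
    return hmac.compare_digest(tag, expected)
```

The point of the design: verification cost is constant regardless of user count, which is exactly why stage two beats stage one's per-user whitelist.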

If session whitelisting could be achieved, in this or the next generation of front-end infrastructure, then a Website (such as TDWIN) might, instead of trying to identify bad logged-in actors mid-flood, maintain a 'good citizen score' for logged-in users over time that eventually declares them human.