It sucks.
People I used to respect — the cool uncles and laid-back friends — are now all totally convinced that things are going badly. They don't even wanna meet in person unless I've been vaccinated or tested recently. It's INSANE.
Anybody else in the same boat?
It's almost always their women that fuck them up.
I would argue that Christianity is right about women.