What has happened to women?
🐂💩 BULLSHIT💩🐂
I’m a 29 y/o white woman. I have a 9 y/o and a healthy relationship. I’m educated, and I feel like a thriving young woman. What did I get that women everywhere else apparently haven’t? I’m disgusted with women and their representation. Watching that dumb woman certify the AZ votes literally AS A HEARING IS GOING ON. How are they happy with the female culture the way it is now? It’s so damn gross. Men don’t like it, normal women don’t like it. What has happened? How did it get here?
Degeneracy is being pushed in Hollywood. Our culture has been corrupted immensely. Normalcy is frowned upon by the media. Don't allow them to get away with it. Hold strong to your morals.
Hollywood is definitely a good guess