Good thing no one outside Hollywood fucking cares what they say anymore. And this isn't just a past-four-years thing. It's been ongoing since Obama came around and Hollywood got political as fuck. It's only in the past five years they got woke as fuck, but that was most likely lurking under the surface for a long time.
Hollywood's (or any movie star's) influence has really dropped off a cliff in recent years. They only make headlines because the media is also far left as fuck and will print any socialist propaganda now.
I'm not even talking about how much COVID has exposed them as the assholes they've always been.
Don’t underestimate their appeal and reach. There are still millions of people who view celebs in a religious way. Like they somehow have it all figured out in life because they blew Weinstein and became “famous”.
I want Hollywood taken down by the Amish