In recent years we've seen a lot of companies adopt "social justice" propaganda and the cursed "woke" discourse. Nowadays it's hard to find a neutral company that tries to sell its stuff based solely on the quality of the product.
They should just be selling their stuff, but instead they've started selling "ideas". It's literally a culture war at this point.
I wonder what has changed in recent years that made them this way. Perhaps China's influence, aimed at undermining the West? Perhaps a new elite tactic to sow discord among the people and keep them divided (divide-and-conquer)? I'm pretty sure it's about money and influence.
I just know that it is repulsive.
It's the new business model. It's too hard to sell an actual product that's objectively better than the rest... so by attaching morality to their brand, they appear superior to people who are in fact mentally inferior.
Consumers have become desensitized to marketing gimmicks, so every company tries to sell itself as a lifestyle brand nowadays.
Why a company that sells hamburgers or cheap cotton t-shirts needs to be a lifestyle brand is beyond me, but that's the trend among marketing douchebags.