I've noticed that living in American culture is quite strange. Almost everything is about sex and "love." Everything is sexualized, including the LGBTQ+ community. Thoughts on this?
Also, as someone who was raised in America, I'm still trying to shake off the lingering side effects of this culture, such as porn addiction and feeling lost for not having a partner. I've learned to listen to music that doesn't include sexual stuff (my favorite is jazz, lol). But no matter where else I go, there's sexual content everywhere, even in ads lol.
I've always noticed, as an outsider, how extremely weird and normalized it all is. Like, wtf is a cheerleader, why tf is there a half-naked woman lying on the car in the ad, and wtf is a bikini carwash?
It gets even more disgusting. I once spoke to a 14-year-old girl who wanted to be a porn star. She doesn't know the horrors of that industry yet, because thanks to American culture, that sort of life is glorified. Sex workers are workers, but many of them didn't choose that life; they were forced into it by this terrible capitalist system.