Living in America, I've noticed the culture is quite strange. Almost everything revolves around sex and "love." Everything gets sexualized, including the LGBTQ+ community. Thoughts on this?
Also, as someone who was raised in America, I'm still trying to shake off the lingering side effects of this culture, such as porn addiction and feeling lost over not having a partner. I've learned to listen to music that doesn't include sexual stuff (my favorite is jazz lol). But no matter where I go, there's sexual stuff everywhere, even in ads lol.
You're right. I've been watching anime recently and I've seen blatant sexualization of women. I'd say this is a product of Western culture. You're right that it didn't start with the US; in fact, US culture is itself a result of this disgusting patriarchal culture.