I've noticed that living in American culture is quite strange. Almost everything is about sex and "love." Everything gets sexualized, including the LGBTQ+ community. Thoughts on this?

Also, as someone who was raised in America, I'm still trying to shake off the lingering side effects of this culture, such as porn addiction and feeling lost for not having a partner. I've learned to listen to music that doesn't include sexual stuff (my favorite is jazz lol). But everywhere else, no matter where I go, there's sexual content, even in ads lol.