I’ve noticed that American culture is quite strange to live in. Almost everything is about sex and “love.” Everything is sexualized, including LGBTQ+ people. Thoughts on this?
Also, as someone who was raised in America, I’m still trying to shake off the lingering side effects of this culture, such as porn addiction and feeling lost for not having a partner. I’ve learned to listen to music that doesn’t include sexual stuff (my favorite is jazz lol). But everywhere else, no matter where I go, there’s sexual stuff, even in ads lol.
deleted by creator
deleted by creator
Thank you for giving me insight into how patriarchy spread into the West. I never knew it happened like this. I always thought it had a lot to do with religion, but really the rich and the elite were mostly behind it all along.
And this culture shows up widely in entertainment from all around the world, especially Western entertainment, where the woman is always saved by a man.
I’ve heard that in some Middle Eastern countries women couldn’t go out in public without a man by their side. Could you give more info on this? I don’t wanna sound like a liberal talking about something I know almost nothing about. I’d love to learn more. Correct me if I’m wrong on that speculation.
Woah, who knew that practices from so long ago could carry into modern times. These examples tie things together perfectly.
What could be done about this culture, both as a society and as an individual?
deleted by creator
The word you’re looking for is “mahrem”: a man who accompanies the woman and whom she cannot marry, like her dad, brother, son, husband, etc. But the mahrem is only required for hajj; a woman is allowed to leave the house without one just fine. And yes, a lot of families in strict Middle Eastern countries don’t let women go out without a man, but that’s a pure misunderstanding of Islam mixed with idk what, like how women weren’t allowed to drive in Saudi Arabia until 2018.
Warning: shit that I’m not sure about ⚠
A lot of people in Muslim countries didn’t have a very good understanding of Islam for a while; only people who went to Islamic schools did. But I think the introduction of Westerners through colonialism and missionaries in those countries might’ve had an impact on the culture, with Western sexism added in? I’m not really sure; I’ll try to do some research when I have free time.