If you think it’s bad now, it’s gonna get worse in the coming years.
Reading and analysing what’s been happening around us for the past 10–20 years, it’s clear that we’re in a Nazi Germany situation. And when fascism arrives, we will ask “how did that happen?”
The future looks bleak.
Makes me wonder, why the hell is this even happening? I thought people were aware of why such rights were established in the first place? Or do they not teach that/have they stopped teaching that in Western academia?
In my experience, I know all too many people who think the civil rights era, gay marriage, and trans care were all established legally by just voooooooooooting hard. In a way, the school system sets people up to not understand their rights so they can be taken away more easily.
What the fuck.
And people have the gall to say “SyStEMic RAcISM IS NOT reAl!”
capitalist “education”
“Well we want to have cool happy jetpack spacefuture with trans rights, BUT THOSE EBIL COMMIE RUSSKIES AND CHINESE ARE PREVENTING IT AND WANT TO START NUCLEAR WAR TO TAKE AWAY OUR GAY MARRIAGE RIGHTS so we must pump more funds into the MIC and crush those filthy untermen- eeer I mean we must illuminate those who don’t share our values!”