Do you think the US will be able to change for the better regarding things like gun laws, healthcare, equality and overall politics?
It’s possible, but things have been getting worse, so it seems pretty unlikely.
Yeah, it seems like things have just gradually been getting worse over the past few years. Hopefully that stirs people up enough to bring about some positive change.