America used to be a place you could come to work to be able to support yourself and your family.
Looking back over the last quarter century, it seems to me that we're seeing a reversal of this trend. American-born citizens are so overburdened by the failures of their country that they dream of moving somewhere they can support themselves, and maybe finally begin to live their lives.
What are your thoughts?
That’s… Wow. The fact that I couldn’t even imagine an injured worker getting 90 percent pay while recovering tells you where the US still is.