No, I'd rather stick around and fight for what I believe in.
Although there are a number of countries I believe I wouldn't mind living in, and might even prefer, like France or Spain, they don't want ugly Americans messing up the joint. So it's basically not even a possibility, even if it were something I was interested in doing - you're not getting a work permit there, and you'd be relegated to a life of cash jobs, if you were able to find work at all.
Don't mistake my critiques of America for a dislike of my country. It's the opposite, in fact. I love this country so much that it annoys me to see what we're becoming, and I know we're better than this. I want to see us return to greatness, and I believe that's possible if we take some responsibility for where we've messed up and do what's necessary to fix it.
Politicians pandering to the lowest common denominator to win or keep their offices, instead of doing what's truly best for this country and what they really believe in, doesn't serve the greater good - it keeps us stuck in the same old cycle of mediocrity.
There's a reason that politicians running for national offices suddenly find religion, and it's not because they believe, it's because they're pandering to voters who let religion color their politics. There's a reason that east coast elites start talking like down home good old boys when they're visiting these places in the flyover states - they're pandering to the same people there that they're laughing at the minute they get back on the plane.
There are a few basic things that need to change before we can ever really move forward. Prison reform and ending the war on drugs, as I've mentioned previously, and then election reform, along with term limits and changes to campaign finance laws, are probably the biggest.
Career politicians have ruined this country. This post is long enough, so I'm not gonna elaborate on that any further here, but nearly anyone with half a brain should realize this and agree with it.