I think everyone can agree that politics has become divisive, polarizing, uncivil, and uncompromising. Those of us on the left believed that the vicious rhetoric inevitably led to the irrational violence we saw in Tucson. Those on the right looked at the same event and decided it was time to circle the wagons defensively, sure that someone was going to come and take their guns away. No common ground there!
Then there are my own feelings about the right wing. (For simplicity’s sake, I’m going to use “right” and “left” instead of Republican, Conservative, Democratic, Liberal, and so on.) Years ago, I was able to talk with people on the right in a civil manner. I didn’t see them as inherently evil – just people with different opinions. That’s all changed now, and I think I know why. Think back to the Reagan years. The arguments were about how to solve our country’s problems, right? For example, what should we do about poverty in America? The right came up with the trickle-down theory, which basically says that if you make the rich richer, the wealth will filter down and help everyone. (I don’t know if that specific phrase was around earlier; I only know that I first heard it when Reagan was president.) The left looked at that theory and concluded it wouldn’t work. The right thought it would. The disagreement was over how to solve the problem.
But things are different now. Today, the right DOESN’T THINK THERE IS A PROBLEM. This is a huge change. If people are poor, oh well, too bad. Not my problem. Certainly not the government’s problem. If people need organ transplants to save their lives and can’t pay for them, then they should just die. How sad. Why should I have to do anything about it? Rand Paul thinks all the civil rights legislation from the sixties should go. True, that would mean open discrimination in housing, hiring, and other areas, but that’s just life, right? It’s not my fault. The government certainly shouldn’t be involved!
Now, how can I find common ground with someone who believes that? Maybe the right always thought this way, but it was simply socially unacceptable to admit it. Clearly, it’s now perfectly okay to admit that you just don’t care. So, are Americans more greedy, more selfish, and more uncaring than they were forty years ago? Or are they just more honest?