This will start out sounding political, but my thoughts here aren't really on politics. I'm considering ramifications. I'm wondering what the election results tell us about Americans.
For quite some time now there have been fears that if America failed to elect Barack Obama there would be race riots. The feeling was that the only reason we would not elect him was that he was a black man. It was racism and racism alone that would prevent Americans from electing him. Now, I always thought that was nonsense -- my objections were never on the basis of race -- but I don't wish to discount public sentiment. So ... now that we've elected a black man as president, can we say that racism in America is dead?
Now, I know that we can't. Rules of logic don't work that way. In logic, not proving one position does not necessarily validate the opposite. So, while we can say that racism did not prevent Obama from being elected, we cannot therefore say that racism is dead. Still, I have to wonder. Can we say that racism is not as much a part of the fabric of America as it once was? I've talked to several people over the last several months about racism in America who claim that it's just as bad as, if not worse than, it ever was. Doesn't this election suggest that it's not so? I suppose there is a feeling in me that suggests that turnabout should be fair play. If not electing Obama proved racism, it seems like electing him proves "not racism." Unfortunately, I suspect that we won't get "fair play" in this case. It's a "no win" for non-racism here. If we don't elect a black man, we're racist. If we do elect a black man ... well, we're still racist. I have to wonder.
In this election there were several states that decided the outcome by shifting their previous position. There were no states that shifted from previously Democrat to currently Republican. There were several that shifted from Republican in 2004 to Democrat in 2008. Is this an indicator of American perspective? The typical Republican position is a conservative one. By that I mean that Republicans typically wish to conserve original political conditions. The Democratic side is typically more liberal or "progressive." As indicated by Obama's stump speeches, they don't want conditions to remain the same; they want change. Does this shift by some eight previously Republican states mean that America is turning away from original ideals and moving to new ones? While the heartland of America, literally "middle" America, is still fundamentally Republican and basically conservative, are there larger numbers of Americans who are leaning toward more change? Is, for instance, capitalism falling into more disfavor than it was previously? Is class warfare becoming worse, where more people view "rich" as "bad" and more people think "I deserve more of what they have"? Is America headed toward a more isolationist perspective? It makes me wonder.
Then there's this whole "marriage" thing. The argument in Arizona was, "We already shot it down two years ago. Why are we doing it again?" The resounding answer, apparently, is, "We didn't vote on this two years ago. It was a different issue." It appears, in Arizona, that the majority of voters want to keep the traditional definition of marriage as the union of one man and one woman. Is that a political statement, a commentary on running a campaign, or does that say something about the views of Arizonans? But before you answer, what about California? I think the perception has been that Californians are, generally, more liberal. They're, well, more "pro-gay." Most of America (and a loud part of California, I think) assumed that they would terminate that divisive Prop 8 with prejudice. (Sorry ... trying to have fun with words there.) Now they've passed a constitutional amendment that affirms the longstanding and traditional definition of "marriage." And you can throw in Florida with something like a 60% to 40% margin. It makes me wonder (again). Does this say what I think it says? It appears to me that a majority of Arizonans (you know, those who "already voted on this") and a majority of Californians (you know, those who are "the most liberal") along with Florida now and others before have decided that marriage should actually be defined as a union of one man and one woman. Is that a commentary on the views of Americans? Does the majority of America still think that marriage is between a man and a woman, and "same-sex marriage" is not the same thing? And I'm wondering, along the same lines, about those who believe that we should define terms as the majority does. If America still views "marriage" in the longstanding and traditional way, does this change their opinion?
I'm not really wondering so much about politics here. I'm not wondering about who got elected and what it means to America. I am wondering about what the results of the voting say about American viewpoints. I suspect that they say more than we immediately recognize. I also suspect that they say more than we're willing to admit.