The "black vs. white Oscars" made me feel like it was the 1960s again. Even in Hollywood, it seems we're back to the whiny "you dissed me because I'm black" atmosphere that permeated the country forty-to-fifty years ago.
How ironic--and sad--that, far from being the crowning achievement in a long, courageous, bloody path to equality, Barack Obama's presidency has turned the nation back into a color-conscious, sniping, and resentful collection of malcontents.
I suppose that Chris Rock did the best he could as host of the awards show. But despite a few chuckles, it was painful to watch. I don't think I'm alone in my weariness of the word "racism." It's thrown around with such boring regularity that it's become virtually meaningless. That's the real injustice, because true racism is a powerfully destructive force, and the word's constant overuse robs it of any clarity today. Any perceived slight is deemed "racist"--the "all-white" Academy Awards being a prime example.
In my book, deliberately causing harm to people because of their color: that's racism. Leaving them off a list of Hollywood award nominees, for whatever reason? Is that really "racism"? After stars such as Denzel Washington, Halle Berry, Jamie Foxx, Forest Whitaker, Morgan Freeman, Stevie Wonder, Whoopi Goldberg, and Jennifer Hudson (complete list linked here) have taken home the golden prize? Not so much.
Do we now need Affirmative Action for the Academy Awards? Will the Motion Picture Academy be required to nominate people based on race, regardless of the quality of their work? If so, then why don't the African American performers just stick with the Black Entertainment Awards and leave the Oscars to the white folks?
In 2016, when the United States has established a distinguished history of diverse racial achievement at the highest levels of government, industry, academia, literature, and art, why are we even talking about "racism," let alone letting it dominate our culture? What is to be gained by this? Perhaps the more urgent question is: what is to be lost?