(The cast of “Big Little Lies” accepting their Golden Globe. Photo credit: Hollywood Foreign Press/NBC)
The Golden Globes took me on a journey this year. To be honest, I wish I hadn’t been along for a good 50 percent of that journey. But the parts that I stuck around for were worth it.
For instance, let’s take the theme of the night—TIME’S UP. With the Golden Globes red carpet and subsequent awards show, the prevalence of sexual assault and harassment against women in the workforce has been put in the spotlight on such a large scale that it seems virtually impossible for the industry to walk it back or look away. The TIME’S UP Legal Defense Fund, spearheaded by over 300 women in entertainment, is now part of the fabric of Hollywood and will only grow stronger year by year.
It goes without saying that the initiative was born from the sheer number of women in Hollywood who shared their heartbreaking stories of harassment and abuse at the hands of producers, directors, and other members of the male Hollywood elite. But what also helped the initiative take shape was a message of solidarity from the Alianza Nacional de Campesinas (the National Farmworker Women’s Alliance), an organization that combats the harassment female farmworkers face. As TIME’S UP’s website states, the fund partners “with leading advocates for equality and safety to improve laws, employment agreements, and corporate policies; help change the face of corporate boardrooms and the C-suite; and enable more women and men to access our legal system to hold wrongdoers accountable.” In short, the initiative hopes to protect all women against abuse and inequitable power structures.
The solidarity between the Alianza Nacional de Campesinas and TIME’S UP is why so many actresses brought WOC activists as their plus-ones on Sunday. The goal was to advocate for intersectional feminist politics and to uplift female voices and women-led organizations. The eight activists who joined Michelle Williams, Emma Watson, Susan Sarandon, Meryl Streep, Laura Dern, Shailene Woodley, Amy Poehler and Emma Stone were:
Tarana Burke, founder of the #MeToo movement
Marai Larasi, executive director of the UK Black feminist organization Imkaan
Rosa Clemente, organizer, journalist and political commentator
Ai-jen Poo, director of the National Domestic Workers Alliance
Mónica Ramírez, co-founder of the Alianza Nacional de Campesinas
Calina Lawrence, Suquamish singer and activist
Saru Jayaraman, co-founder and president of Restaurant Opportunities Centers United
Billie Jean King, legendary tennis star and activist
Their joint statement sheds more light on why they chose the Golden Globes red carpet as the avenue to steer the conversation from one of outrage to one of action.
“As longtime organizers, activists and advocates for racial and gender justice, it gives us enormous pride to stand with the members of the TIMES UP campaign who have stood up and spoken out in this groundbreaking historical moment. We have each dedicated our lives to doing work that supports the least visible, most marginalized women in our diverse contexts. We do this work as participants in movements that seek to affirm the dignity and humanity of every person.
“Too much of the recent press attention has been focused on perpetrators and does not adequately address the systematic nature of violence including the importance of race, ethnicity and economic status in sexual violence and other forms of violence against women. Our goal in attending the Golden Globes is to shift the focus back to survivors and on systemic, lasting solutions. Each of us will be highlighting legislative, community-level and interpersonal solutions that contribute to ending violence against women in all our communities. It is our hope that in doing so, we will also help to broaden conversations about the connection to power, privilege and other systemic inequalities.”
After reading more about these women and how they utilized the red carpet as their battleground, I feel like a butt for initially thinking that the actresses bringing these women along was an act of performative wokeness. Without any knowledge of the women’s goal, it certainly has all the appearances of a selfish move by the Hollywood elite to gain brownie points and good press. Without knowing anything about the event, you could easily think the women were being tokenized. It’s easy to believe the worst of Hollywood, even at times like these.
But in this case, the opposite is true; the women involved, women who do such important work, weren’t being used, which was my fear (if you go on Twitter, you can read my misgivings and see-sawing from point to point). I was extremely protective of how these women were being perceived by Hollywood, and I’m glad to feel I was protective for no reason. And whether or not you believe there’s still some tokenism or lack of agency at play, there is still the silver lining that the exposure introduced us viewers to just eight of the many women who do this hard work without much recognition. They do the work because it is their true calling. It’s only right that they become just as popular and recognizable as, if not more so than, the actresses who partnered with them.
Even as I muddled through my concerns while watching the red carpet, I was pleasantly surprised and heartened to hear how many actresses were ready to talk about issues affecting all women, including taking E! to task—while being interviewed by E!—for its pay gap.
Where the night fell apart was in how men were largely let off the hook when it came to speaking up for women’s rights. All they really had to do was wear a “TIME’S UP” pin and a black tuxedo and smile. The actual awards show didn’t help matters either, between snubbing Dee Rees and Greta Gerwig in the Best Director category (as Natalie Portman so poignantly said, only men were nominated), snubbing Mudbound as a Best Picture contender, blocking Get Out from its expected win by putting it in the Comedy/Musical category, and awarding Kirk Douglas, James Franco, and Gary Oldman, all of whom have checkered pasts and allegations of abuse, harassment or sexual assault.
But I was brought back by Oprah Winfrey’s rousing speech. I’ll be honest and say that Oprah had fallen off my radar in the past few years; I still watched OWN from time to time, and I still loved my memories of watching The Oprah Winfrey Show. But as for Oprah herself—I thought she’d gone extremely Hollywood. I thought she’d forgotten who she was before she became the New Age guru she is now. Sometimes, the rich begin to forget the hardships of others, and I’d sadly lumped Oprah in with that group, since it’s a luxury to be able to ponder life’s issues in a comfy chair in the woods.
But Oprah rightfully schooled me, and everyone else in the Golden Globes audience. She gave everyone an education on what they should prioritize in this fight for equality; it’s not about what we wear or don’t wear, and it’s not about how well we speak or how much money we have. What matters is whether we use the platforms we have, big or small, to speak out against bigotry, xenophobia, sexism, harassment and abuse. We need to always lift up those like Recy Taylor who never got the justice they deserved. We need to learn and re-learn our American history so we don’t go through life failing to give women like Rosa Parks, an NAACP investigator (not just the tired seamstress we’ve always been taught she was), their full due.
On a personal level, Oprah also reminded me why I got into this representation game in the first place. Too often, many of us lose our way and forget why we were called to do the things we do. I started blogging about representation in the media years ago after I realized there was a lot more I could say about film and TV than just who is cast in the new thing. There was an entire market not being addressed, and I felt I had the background and talent to address that market with intelligence and humility. However, the world of social media can make you believe that developing a cult of personality is more important than writing a meaningful post. It can make you think your work doesn’t matter because you might not be as loud or as brash or as excitedly opinionated as others. What Oprah did was inspire me the way she did when I was a child. I remembered why I write about film and TV—it’s because my voice is needed. It’s because all of our voices are needed, not just my own. We all should be able to voice our truths about our lives and experiences and lift each other up, finding commonalities in our stories and areas where we can deepen our learning. In a way, Oprah did what she’s always done, including when she holds her conversations in the woods—she asked us to show vulnerable, relatable humanity to each other.
With that said, it’s kinda ridiculous that reactions to her speech have now devolved into a shouting match on Twitter about whether or not she’s qualified to run for President. Sure, I’d prefer a politician run for President, but it’s not as if Oprah’s another Trump—she’s highly intelligent, she’s a humanitarian, and she understands what’s at stake in American politics and society. If Trump’s qualified to run and win, anyone’s qualified to run now. And if that means Oprah’s got a chance, then so be it. I mean, if there’s no one else running against Trump, who else are we going to vote for? There are bigger and more meaningful hills to die on than whether Oprah wins the Presidency. (By the way: I didn’t see this much outrage when Dwayne Johnson said he was mulling over a presidential candidacy.) The Twittersphere going H.A.M. over Oprah’s hypothetical candidacy has left a bad taste in my mouth for sure, and it’s indicative of how Twitter as a whole can miss the point of a poignant moment.
I’ll end with this: The Golden Globes were the worst and best of times. Some things happened that were deeply questionable, and other things happened that seemed sketchy at first but turned out to be fantastic. In the end, Oprah cut through the muck and proved to be the guiding light of the evening, and in hindsight, none of us should have been surprised at that outcome.♦
The wedding cake on display at Masterpiece Cakeshop.
AP Photo/Brennan Linsley
On Tuesday, Dec. 5, a visibly divided U.S. Supreme Court tackled the contentious issue of religious freedom when it heard oral arguments in “Masterpiece Cakeshop, Ltd. v. Colorado Civil Rights Commission.” The arguments appeared to split the court evenly between its four conservative and four liberal justices. Justice Anthony Kennedy, often the swing vote, seemed to side with the baker.
The case involves a Denver bakery owner who refused to make a wedding cake for a gay couple, citing his religious belief that marriage can be between only a man and woman. The couple sued, and a lower court ruled the baker violated Colorado’s public accommodations law. The statute forbids discrimination by businesses serving the public, including on the basis of sexual orientation.
In their appeal to the Supreme Court, the bakery’s lawyers have emphasized free speech issues by presenting the baker as an artist who has a right to choose how he expresses himself. But religious freedom remains central to the case. A key question is whether a business owner must provide services that conflict with his or her religious beliefs.
This divisive case highlights the vast difference between the rhetoric and the reality of religious freedom. The rhetoric holds it up as an ideal that promotes harmony and equality; history suggests it often leads instead to more conflict.
The rhetoric: Equality and goodwill
It is true that throughout U.S. history, Americans have idealized religious freedom and imagined that it brings harmony.
The First Amendment’s clauses guaranteeing religious free exercise and preventing establishment of an official church seemed to promise less discord to the Founding Fathers. In an 1802 letter, Thomas Jefferson, for example, wrote that “religion is a matter which lies solely between Man & his God.” As the nation’s third president, he argued that a “wall of separation between Church & State” would give all people equally the right to free conscience.
Later presidents echoed the view that religious freedom brings equality and unity by preventing government from favoring particular faiths.
Before his election in 1960, John F. Kennedy tried to ease fears about his Catholicism by affirming religious liberty. Kennedy believed this freedom kept one group from oppressing another. It formed the basis of a society, he declared, where people would “refrain from those attitudes of disdain and division which have so often marred their works in the past, and promote instead the American ideal of brotherhood.”
In the early 1990s, George H.W. Bush identified religious liberty as the basis for other rights. He credited it as a major reason for the vibrancy of American society.
The reality: Conflict and debate
Minority communities, ranging from Catholics to Mormons, have fought to have their traditions and customs recognized as religious. As I show in my work on pluralism, Americans have debated what constitutes a religious expression rather than a cultural practice. People have also argued whether religious expression can extend into political, social and business interactions.
These debates have required the intervention of the courts and have often ended up at the Supreme Court. Thus, a right intended to free Americans from government interference has instead necessitated the frequent involvement of a major government institution.
Further complicating matters, the Supreme Court has changed its position over time. Its evolving interpretations show how religious freedom debates create shifting categories of winners and losers.
To the courts
Like Masterpiece Cakeshop, one of the Supreme Court’s first religious liberty cases involved marriage. In 1878, a Mormon resident of the Utah territory sued the federal government after he was charged with bigamy. He argued that the law violated his religious liberty by criminalizing his polygamous marriage. The Supreme Court disagreed. In Reynolds v. United States, the court ruled that the First Amendment guaranteed only freedom of belief, not freedom of practice.
In the 20th century, the Supreme Court showed greater sympathy to religious liberty claims. In several cases – including one brought by Jehovah’s Witnesses challenging a statute requiring a permit for public evangelizing and another by an Amish community that objected to Wisconsin’s compulsory public school law – justices sided with those who claimed their freedom was violated.
That changed in 1990. The court ruled against two men who lost their jobs after using peyote, a cactus with hallucinogenic properties that has long been used in Native American religious practices. Because they were fired for drug use, the men were denied unemployment benefits. They claimed that, as members of a Native American church, they had used the drug for religious purposes.
Moving away from earlier decisions, justices ruled that religious belief was not a ground for refusing to obey laws “prohibiting conduct that the State is free to regulate.”
New century, new conflicts
The peyote case set the stage for Masterpiece Cakeshop. It was in response to that case that Congress passed the Religious Freedom Restoration Act (RFRA) of 1993, which requires that laws restricting religious expression serve a compelling need.
RFRA was central to the Supreme Court’s 2014 decision in Burwell v. Hobby Lobby. That contentious split ruling granted closely held companies the right to deny contraceptive benefits mandated by the Affordable Care Act on the grounds of protecting their owners’ religious liberty.
Similarly, in October 2017, the Trump administration invoked freedom of religion when it allowed all employers a religious exemption to the contraception coverage requirement in the Affordable Care Act.
Critics saw that policy change as an attack on women’s rights. Reaction to it on both sides again showed that government involvement in debates about religious freedom invariably produces winners and losers.
Given our polarized society and the division among the Supreme Court justices today, this pattern will continue, whatever the court decides.
This is an updated version of an article first published on Nov. 28, 2017.
Race-neutral affirmative action can help identify first-generation students like Blanca Diaz and LaQuintah Garrett.
AP Photo/Amy Anthony
In 2003, Justice Antonin Scalia predicted that the Supreme Court’s sanctioning of race-conscious affirmative action in higher education would spark future litigation for years to come. And right he was. From defeated claims of discrimination against the University of Texas at Austin to an ongoing lawsuit against Harvard, colleges continue to come under attack for considering race as a factor in admissions decisions.
The recent report of the Department of Justice’s possible investigation of “intentional race-based discrimination in college and university admissions” demonstrates that the assaults aren’t likely to end anytime soon.
As a professor of law and scholar dedicated to ensuring equal educational opportunities for students of color, I believe now is an important time to earnestly consider other methods for diversifying student bodies. Race-neutral alternatives could effectively consider such factors as socioeconomic status and educational background, while supplementing more traditional affirmative action.
‘Race-based’ vs. ‘race-conscious’
When thinking about affirmative action, it’s important to first define (and debunk) a few key terms, starting with “race-based” and “race-conscious” affirmative action.
“Race-based affirmative action” is a misnomer often used to describe some college admissions policies. “Race-based” implies that an admissions decision is made solely because of or based upon an applicant’s race or ethnicity, which could not be farther from the truth. A university’s decision to admit, deny or waitlist an applicant is based upon myriad criteria, ranging from standardized test scores to state of residency. Race is just one of many admissions factors a university may consider.
This approach is more appropriately termed “race-conscious.”
Schools that employ race-conscious admissions policies do so in order to achieve the educational, social and democratic benefits of a diverse student body.
As the Supreme Court held in Gratz v. Bollinger, race is not and cannot be the determining factor under a constitutional race-conscious plan. Therefore, when people claim that an African-American or Hispanic student was admitted because of race, they’re often not only inaccurate but also dismissive of the student’s other numerous attributes that played a role in the university’s decision.
Opponents of race-conscious affirmative action often assert that such policies are racist or disproportionately benefit privileged minority students from middle- and upper-class backgrounds.
For its part, the Supreme Court is also skeptical of using racial classifications in governmental decision-making. As a result, it has held that institutions of higher education must afford serious consideration to “workable race-neutral alternatives” before implementing a race-conscious policy.
Importantly, the court’s use of the term “race-neutral” does not mean “race-blind.” That is, universities are permitted to think about how alternative admissions criteria could help them achieve their diversity goals. Race-neutral criteria could include socioeconomic background, high school or undergraduate institution, or class rank. In other words, these are factors that may contribute to a school’s racial diversity, but applicants themselves are not considered based on race.
In some cases, it’s proven difficult for race-neutral admissions policies to achieve the same levels of racial diversity as those achieved through direct consideration of race. However, such measures have been useful in helping to diversify student bodies when used in conjunction with or in lieu of race-conscious affirmative action.
The viability of race-neutral alternatives
When coupled with the stark racial disparities that continue to plague some professions, the uncertain future of race-conscious affirmative action calls for a renewed focus on alternatives that look beyond race alone.
Law schools play a unique role in training our country’s next generation of leaders. It is, in fact, vital to the future of our democracy that we continue to provide students from historically underrepresented racial groups with access to legal education. And yet, the legal profession was recently determined to be “one of the least racially diverse professions in the nation.”
To help law schools improve their diversity, we examined the relationship between race and race-neutral identity factors in law school admissions. The project, which was funded in part by a grant from AccessLex Institute, surveyed over a thousand first-year law students at schools throughout the country and asked about various aspects of their identity, such as socioeconomic status and educational background.
Our findings indicated that African-American and Hispanic students were significantly more likely than both white and Asian/Pacific Islander students to have qualified for free or reduced-price lunch programs in elementary or secondary school, to have had a parent or guardian who received public assistance while the student was a dependent minor, and to have received a Pell Grant during their undergraduate studies – all race-neutral factors that schools could consider in admissions decisions.
How admissions could change
Based on the sample of participants in our study, it’s clear that privilege did not catapult all students of color to law school. Many of them had to overcome the structural inequalities of poverty, race and public education to embark on a legal career. Expanding opportunities for these and other minority students will benefit not only legal education and the legal profession, but also society more broadly.
Race-neutral admissions policies could help identify and create opportunities for these students.
To be clear, I do not advocate for the wholesale substitution of traditional race-conscious admissions measures with the factors we studied. Race-conscious policies continue to be the most effective means by which to create diverse student bodies.
However, we encourage law schools and other institutions of higher education to utilize these and other race-neutral admissions factors as a means of complying with the Supreme Court’s affirmative action mandates and testing the viability of policies that take such factors into account.
Doing so will help ensure that traditionally underrepresented students of color will continue to have access to colleges and universities that serve as gateways to career, financial and life opportunities.
Over the Thanksgiving holiday, you might have seen the high-budget trailer for Lifetime’s Griselda Blanco biopic, Cocaine Godmother. If not, here you go:
If you’re attuned to representation issues, you probably know what I’m going to point out as the problem. Catherine Zeta-Jones, a Welsh woman, is playing Blanco, a Colombian woman. Why is she, though?
There are plenty of Latina actresses who could have played this role, and in fact, one has been lobbying for it for a very long time–Jennifer Lopez. Lopez has been jonesing to play Blanco for years and has struck a deal with HBO to bring her own TV movie to life (when that movie is coming remains to be seen).
Surprisingly, this also isn’t the first time Zeta-Jones has been tapped to play Blanco; she was initially supposed to play the Queen of Cocaine in a biopic called The Godmother. According to W Magazine, Zeta-Jones won the role over…Jennifer Lopez. A source told The Sunday Times in 2016 that, despite Lopez’s hard lobbying for the role, she didn’t win out because “she doesn’t have the acting quality to pull it off.”
Today, neither woman is in that role–it now belongs to Oscar-nominated actress Catalina Sandino Moreno (Maria Full of Grace). But both women are gunning to have the last word on Blanco’s life, and right now, Zeta-Jones’ vanity project is in the lead.
This gets back to the main point of this article–why is a non-Latina actress playing a Latina figure? From where I’m sitting, it seems like another case of Hollywood (and maybe even Zeta-Jones herself) believing in casting white actors in non-white roles because they have an ethnic “look.” It’s another, subtler kind of whitewashing.
There’s a reason Zeta-Jones has been able to play Latina on more than one occasion–she played a Latina character in The Mask of Zorro opposite Antonio Banderas–and that’s because she’s a white woman with ethnically ambiguous looks. Casting-wise, Zeta-Jones fits the model Hollywood looks for when casting a stereotypical non-black “Latina” role; she’s, as Hollywood would describe her, “exotic” thanks to her olive skin and curvy features. But casting her also comes with the added bonus of whiteness, which lends “credibility” and “name recognition” to the role. In this way, Zeta-Jones can play both sides, having her cake and eating it, too.
But in the stills and trailer for Cocaine Godmother, you can see Zeta-Jones exaggerating her already ethnically ambiguous features to the point where it becomes character makeup. Her naturally olive skin is bronzed even further to bring it closer to Blanco’s, making it look like an unnatural tan. Her nose is contoured and highlighted to look even more bulbous in an effort to match Blanco’s nose in real life. The overall look is meant to make her seem less like a Welsh-English woman and more like a woman of color–the makeup treatment doesn’t want you to equate Zeta-Jones’ performance with brownface, but let’s face it: it’s brownface.
This is also not the first time a white actress has used ethnic ambiguity to her advantage. Shirley MacLaine, who has naturally hooded eyes, did it in My Geisha, a 1962 film that basically posits a white woman stealing a role from a Japanese woman as a comedy, and again in 1966’s Gambit, in which she plays opposite Michael Caine as “exotic Eurasian showgirl” Nicole Chang. Most recently, Floriana Lima, an Italian-American actress, used her looks to play the Latina Supergirl character Maggie Sawyer. Many more examples exist beyond these two.
Zeta-Jones is looking to have her cake and eat it too again with Cocaine Godmother. But this time, there’s a little bit of pushback.
Hollywood is continuing its whitewashing. Catherine Zeta-Jones will be playing, yet another Latina, Griselda Blanco for a Lifetime movie
— Goonica's Coming 4 U (@AshleyShyMiller) October 23, 2017
Lifetime had to find Catherine Zeta Jones to play Griselda Blanco c'mon now. 😂😂Get outta here. At least chose someone of Latin descent
— Angel (@LiLmissknowit) October 21, 2017
Lifetime Casts Non-Latina Catherine Zeta-Jones as Griselda Blanco Bio-Pic
Catherine Zeta-Jones whose career took… https://t.co/eavwj5J0MA
— LatinHeatMag (@LatinHeatMag) May 19, 2017
Why is Catherine Zeta Jones white ass playing Griselda Blanco? @lifetimetv you guys just don't learn🤦🏿
— AJACSMOM (@OVAIT) November 28, 2017
Why not cast a Latina actress to play a Latina role ?
— José Larrea (@JoseEditor) November 16, 2017
The noise around this film is only going to grow as we get closer to its 2018 TV premiere. We’ll see how the film handles the whitewashing discussion it will inevitably come up against.
When someone is sick or needs the help of a physician, who should decide what is appropriate – what blood tests and imaging studies to order, what medicines to prescribe, what surgeries to perform? Should it be the doctor, the patient or some combination of the two? Most people nowadays (even most physicians) support what is called ‘shared decision-making’, in which the doctor and patient (and often her family or friends) discuss the situation and come up with a joint plan. The doctor’s role is that of an experienced guide, whose medical knowledge, skill and expertise help to shape the conversation, and whose understanding of the patient’s priorities, values and goals steers the plan in a given direction to the satisfaction of all.
Unfortunately, in the real world, things don’t always work this way. Doctors and patients answer to a number of masters, both welcome and uninvited. Insurance companies or other third-party payers often intrude into the decision-making process, limiting the choices of services and products available: a sick patient often must wait for pre-authorisation for expensive diagnostic tests and procedures; pharmacy formularies restrict the kinds of drugs available for prescription, and so on. Furthermore, some doctors have personal interests in the interventions they recommend. Many surgeons make more money if they do more surgery, cardiologists earn more if they put in more cardiac stents and pacemakers, and drug companies have better profits if they sell drugs for chronic conditions that never get better and require lifelong medication (such as high cholesterol, hypertension and diabetes). Such practices contribute to the seemingly inexorable rise in healthcare costs (and a host of adverse outcomes) in the United States.
Yet controlling cost without sacrificing quality has been a daunting task. One strategy might be to pay more attention to what patients need and less to what they want, where the two don’t overlap. Another is to limit the excesses of doctors who prescribe because of conflicts of interest or as acts of ‘defensive medicine’ – in other words, to protect themselves from lawsuits rather than to aid the patient.
How does one go about rationing care? Will faceless bureaucrats be denying granny her medication or access to an intensive-care unit solely because she’s old, or saying that Billy can’t get his conditions treated because he is disabled? Indeed, dread of rationing – as well as a healthy dose of old-fashioned fear-mongering by crafty politicians – is what inspired the meme of ‘death panels’, an unfounded canard based upon a misinterpretation of a proposed federal rule for Medicare. Nevertheless, the concept of rationing is still of concern because it implies restriction of a resource that could be beneficial.
Therefore, rationing doesn’t apply to interventions that can’t help anyone at any time – for instance, antibacterial antibiotics that won’t work because the patient has a viral infection. A better example of true rationing is the allocation of organs – such as livers, hearts and lungs – for transplantation. Organ transplantation requires rationing because the supply never keeps up with demand. We also ration drugs that can suddenly become scarce (a distressingly common problem).
But there are other forms of rationing that are problematic, too. The most common one, intrinsic to the US healthcare system, involves limiting the kind and amount of healthcare one can obtain based on one’s financial situation. Poorer people get less and worse healthcare than wealthy people. While the most offensive aspects of this arrangement have been mitigated to some extent in those states that expanded Medicaid under the auspices of the Affordable Care Act, there are still alarming numbers of Americans who have limited access to effective medical care. This is one of the chief reasons why the US population as a whole doesn’t get as much bang per buck as citizens of many other nations, and this form of rationing is blatantly unfair.
But there is another form of rationing that is more insidious still: so-called bedside rationing, in which doctors decide, on an individual, per-patient basis, what should be available to each patient, regardless of the range of services that their insurance or finances might otherwise allow. The problem is that this is readily susceptible to prejudice and discrimination, both overt and hidden. It is well known that doctors, like pretty much everyone else, harbour so-called implicit biases that are readily revealed by the implicit-association test (available online).
This does not mean that physicians express overt sexism, racism or other forms of bigotry – rather, these unconscious beliefs about others can influence the kinds of treatments they offer. Thus, bedside rationing can violate one of the cardinal principles of fairness – that clinically similar situations be treated similarly. Doctors could offer one patient (say, a well-off white person) with unstable angina and blocked coronary arteries the standard of care with cardiac catheterisation and stents, while offering just medical therapy to an African-American patient with comparable disease. And there is ample evidence that such differential treatment occurs.
So how does one ‘choose wisely’ and escape the moral pitfalls of bedside rationing? It turns out that this is extraordinarily difficult to do, especially in a system such as ours, where physicians have so much discretionary power over which diagnostic and treatment interventions make it onto the ‘menu’ for each patient. This can readily lead to too much or too little being offered to patients for reasons that cannot be easily justified.
I think that the solution, at least in the US, might require a wholesale re-engineering of our healthcare system to minimise the financial incentives to overprescribe, and to protect or immunise against the biases that lead to inappropriate rationing at the bedside. The only way to reduce the frequency of these behaviours is to have a single-payer system that controls (to a certain extent) the availability of certain interventions, analogous to the way in which the organ-transplant system regulates who gets transplanted and under what circumstances.
Of course, unlike livers and hearts, what needs to be rationed in the US is money and what it can buy. We could save money through efficiencies of scale and by decreasing the waste and administrative costs that account for at least 25 per cent of what we now spend. Can we totally eliminate ‘bad’ rationing? No, of course not. But Americans should do all they can to avoid the moral tragedy of being the wealthiest nation on Earth that chooses dumbly, not wisely, about healthcare.
This article was originally published at Aeon and has been republished under Creative Commons.