How DACA affected the mental health of undocumented young adults

A rally in support of DACA outside the White House.
AP Photo/Jacquelyn Martin

Elizabeth Aranda, University of South Florida and Elizabeth Vaquera, George Washington University

“I am getting this wonderful education. I have a job. I fit in. At the same time, I feel at any moment that can change. I don’t think that most Americans live with that thought that anything can change [in] just one minute… My biggest fear is me getting deported or DACA being terminated and I go back to being here illegally.” –“Leticia”

“Leticia,” a pseudonym, is now 21. She came to the U.S. from Mexico at the age of eight. She is just one of the many undocumented young adults we have met in the course of our research.

With President Donald Trump’s rescission of an Obama-era executive action known as Deferred Action for Childhood Arrivals (DACA), Leticia’s worst fears seem to be coming true. It is now up to Congress to pass legislation that would grant “Dreamers” legal status. In the meantime, these youths’ dreams and aspirations are once again stalled, facing another deadline and six more months of uncertainty – and thus fear and anxiety.

Together, we have been researching the lives of immigrants for 26 years. Up until 2012, undocumented youth like Leticia found themselves with few options for making their aspirations a reality as they became adults.

This changed with DACA. The program granted certain undocumented youth a temporary reprieve from deportation, renewable every two years, along with identity documents such as driver’s licenses and Social Security cards. This gave recipients the ability to legally apply for jobs and for admission to institutions of higher education.

Since DACA took effect, youth like Leticia have been able to further their education and obtain jobs and health insurance, along with gaining many other rights. Our research demonstrates that DACA has enabled youth and young adults not just to work toward building their own futures, but also to find peace of mind – something that, until then, was unfamiliar to them.

Personal trauma and emotional well-being

Participants in our studies commonly discussed chronic feelings of sadness and worry. Their mental health was precarious prior to DACA. Most did not know they were undocumented until a caregiver told them, usually in late adolescence. Finding out about their undocumented status proved to be a source of personal trauma: it disrupted their dreams and eroded the trust they had placed in their families, friends and social institutions.

Some participants admitted that, prior to DACA, they had thought about suicide. Feeling hopeless because of their undocumented status, a few had harmed themselves or even attempted suicide. According to news reports, at least one young Dreamer ended his own life as a result of this anguish.

We found that one way that undocumented youth coped with feelings of isolation was to join immigrant organizations and to volunteer in immigrant advocacy activities. The social connections they developed in these groups fostered relationships that supported them in times of despair.

Then, DACA brought relief and improved their mental health. These youth shared with us that they were more motivated and happy after Obama’s executive order. As Kate, one of our participants, told us, DACA “has gone a long way to give me some sense of security and stability that I haven’t had in a very long time.” Even with DACA, these youth maintained their involvement in organizations to help “give back” to their communities.

Almost 800,000 youth trusted the government with their fingerprints and other personal information when they applied for DACA. In return, the two-year reprieve from deportation lifted the constant fear that had characterized their everyday lives. These mental health gains, along with the fruits of all of their hard work over the past five years, are now threatened.

The road ahead

These young adults are thoroughly vetted and are contributing, or are well on their way to contributing, in significant ways to their communities and the country. Alonso Guillen, to cite just one recent example, lost his life while rescuing victims of Hurricane Harvey. Many contribute to the U.S. economy: 5.5 percent of DACA recipients have started their own businesses, and 87 percent are employed.

With the demise of DACA, these youth may feel that the trust they placed in government has been betrayed. In our research, conducted before Donald Trump was a presidential candidate, we often heard participants express fear that DACA might be temporary – but back then it was hypothetical. One of our participants, “Mariposa,” said she was “on the list,” worried that the U.S. government would know exactly where to find her if DACA should end.

If our research and the history of Dreamers’ social activism tell us one thing, it is that these youth are resilient. The U.S. is the only place they consider home, and it is where they want to stay and contribute.

Our work shows that being part of organizations that support immigrants is crucial to promoting a sense of social and emotional well-being. These organizations, at least, may continue to provide spaces where youth can come together and feel like they belong. Meanwhile, Dreamers can only hope Congress can find a solution that will help them trust once again in America’s institutions.

Elizabeth Aranda, Professor of Sociology, University of South Florida and Elizabeth Vaquera, Director of Cisneros Hispanic Leadership Institute, George Washington University

This article was originally published on The Conversation. Read the original article.

How doctors’ bias leads to unfair and unsound medical triage

Photo by Piron Guillaume on Unsplash

Philip Rosoff

When someone is sick or needs the help of a physician, who should decide what is appropriate – what blood tests and imaging studies to order, what medicines to prescribe, what surgeries to perform? Should it be the doctor, the patient or some combination of the two? Most people nowadays (even most physicians) support what is called ‘shared decision-making’, in which the doctor and patient (and often her family or friends) discuss the situation and come up with a joint plan. The doctor’s role is that of experienced guide, whose medical knowledge, skill and expertise help to shape the conversation, and whose understanding of the patient’s priorities, values and goals steers the plan in a given direction to the satisfaction of all.

Unfortunately, in the real world, things don’t always work this way. Doctors and patients answer to a number of masters, both welcome and uninvited. Insurance companies and other third-party payers often intrude into the decision-making process, limiting the choices of services and products available: a sick patient often must wait for pre-authorisation for expensive diagnostic tests and procedures; pharmacy formularies restrict the kinds of drugs available for prescription, and so on. Furthermore, some doctors have personal interests in the interventions they recommend. Many surgeons make more money if they do more surgery, cardiologists earn more if they put in more cardiac stents and pacemakers, and drug companies have better profits if they sell drugs for chronic conditions that never get better and require lifelong medication (such as high cholesterol, hypertension and diabetes). Such practices contribute to the seemingly inexorable rise in healthcare costs (and a host of adverse outcomes) in the United States.

Yet controlling costs without sacrificing quality has been a daunting task. One strategy might be to pay more attention to what patients need, and less to what they want, where the two don’t overlap. Another is to limit the excesses of doctors who overprescribe because of conflicts of interest or as acts of ‘defensive medicine’ – in other words, to protect themselves from lawsuits, not to aid the patient.

Both strategies amount to forms of rationing, a word that alarms people. How does one go about rationing care? Will faceless bureaucrats deny granny her medication or access to an intensive-care unit solely because she’s old, or say that Billy can’t get his conditions treated because he is disabled? Indeed, dread of rationing – as well as a healthy dose of old-fashioned fear-mongering by crafty politicians – is what inspired the meme of ‘death panels’, an unfounded canard based upon a misinterpretation of a proposed federal rule for Medicare. Nevertheless, the concept of rationing remains a concern because it implies restricting a resource that could be beneficial.

Therefore, rationing doesn’t apply to interventions that can’t help anyone at any time – for instance, antibacterial antibiotics that won’t work because the patient has a viral infection. A better example of true rationing is the allocation of organs – such as livers, hearts and lungs – for transplantation. Organ transplantation requires rationing because the supply never keeps up with demand. We also ration drugs that can suddenly become scarce (a distressingly common problem).

But there are other forms of rationing that are problematic, too. The most common one, intrinsic to the US healthcare system, involves limiting the kind and amount of healthcare one can obtain based on one’s financial situation. Poorer people get less and worse healthcare than wealthy people. While the most offensive aspects of this arrangement have been mitigated to some extent in those states that expanded Medicaid under the auspices of the Affordable Care Act, there are still alarming numbers of Americans who have limited access to effective medical care. This is one of the chief reasons why the US population as a whole doesn’t get as much bang per buck as citizens of many other nations, and this form of rationing is blatantly unfair.

But there is another form of rationing that is more insidious still: so-called bedside rationing, in which doctors decide, on an individual, per-patient basis, what should be available to each patient, regardless of the range of services that their insurance or finances might otherwise allow. The problem is that such decisions are readily susceptible to prejudice and discrimination, both overt and hidden. It is well known that doctors, like pretty much everyone else, harbour implicit biases that are readily revealed by the implicit-association test (available online).

This does not mean that physicians express overt sexism, racism or other forms of bigotry – rather, these unconscious beliefs about others can influence the kinds of treatments they offer. Thus, bedside rationing can violate one of the cardinal principles of fairness – that clinically similar situations be treated similarly. Doctors could offer one patient (say, a well-off white person) with unstable angina and blocked coronary arteries the standard of care, cardiac catheterisation and stents, while offering only medical therapy to an African-American patient with comparable disease. And there is ample evidence that such differential treatment occurs.

So how does one ‘choose wisely’ and escape the moral pitfalls of bedside rationing? It turns out that this is extraordinarily difficult to do, especially in a system such as ours, where physicians have broad discretionary power over which diagnostic and treatment interventions appear on the ‘menu’ for each patient. This can readily lead to too much or too little being offered to patients for reasons that cannot be easily justified.

I think that the solution, at least in the US, might require a wholesale re-engineering of our healthcare system to minimise the financial incentives to overprescribe, and to protect or immunise against the biases that lead to inappropriate rationing at the bedside. The only way to reduce the frequency of these behaviours is to have a single-payer system that controls (to a certain extent) the availability of certain interventions, analogous to the way in which the organ-transplant system regulates who gets transplanted and under what circumstances.

Of course, unlike livers and hearts, what needs to be rationed in the US is money and what it can buy. We could save money through efficiencies of scale and by cutting the waste and administrative costs that account for at least 25 per cent of what we now spend. Can we totally eliminate ‘bad’ rationing? No, of course not. But Americans should do all they can to avoid the moral tragedy of being the wealthiest nation on Earth that chooses dumbly, not wisely, about healthcare.

Philip Rosoff is professor of paediatrics and director of the clinical ethics programme at Duke University Hospital in North Carolina. He is the author of Drawing the Line: Healthcare Rationing and the Cutoff Problem (2017).

This article was originally published at Aeon and has been republished under Creative Commons.

Gun violence in the US disproportionately kills black people and urban dwellers

A man changes a flag to half-staff near the First Baptist Church of Sutherland Springs.
AP Photo/Eric Gay

Molly Pahn, Boston University; Anita Knopov, Boston University, and Michael Siegel, Boston University

On Nov. 5, just 35 days after the deadly Las Vegas shooting, a man walked into a church in a small Texas town and murdered 26 people with an assault rifle. The coverage dominated the news.

But the day before, even more people – 43 – were shot to death in cities and towns around the country. And nobody really seemed to notice.

Shootings kill more than 36,000 Americans each year; every day, gun violence causes roughly 90 deaths and 200 injuries. Unlike terrorist acts, the everyday gun violence that affects our communities is accepted as a way of life.

Of all firearm homicides in high-income countries, 82 percent occur in the United States. An American is 25 times more likely to be fatally shot than a resident of other high-income nations.

As public health scholars who study firearm violence, we believe that our country is unique in its acceptance of gun violence. Although death by firearms in America is a public health crisis, it is a crisis that legislators accept as a societal norm. Some have suggested this acceptance persists because the predominant victims are black, not white, and our data bear out this striking disparity.

Urban and racial disparities

Within the United States, the odds of dying from firearm homicide are much higher for Americans who reside in cities. Twenty percent of all firearm homicides in the U.S. occur in the country’s 25 largest cities, even though they contain just over one-tenth of the U.S. population. Data from the Centers for Disease Control and Prevention show that of the 12,979 firearm homicides in 2015, 81 percent occurred in urban areas.

There is even more to the story: CDC data also show that within our nation’s cities, black Americans are, on average, eight times more likely to be killed by firearms than white Americans. The rate of death by gun homicide for black people exceeds that for white people in all 50 states, but the magnitude of this disparity varies tremendously. In 2015, a black person living in Wisconsin was 26 times more likely to be fatally shot than a white person in that state, while a black person in Arizona was “only” 3.2 times more likely than a white person to be killed by a gun. The combination of being black and living in an urban area is even more deadly: in 2015, the black homicide rate in urban areas of Missouri was higher than the total death rate from any cause in New York state.

These differences across states occur primarily because the gap between levels of disadvantage among white and black Americans differs sharply by state. For example, Wisconsin – the state with the highest disparity between black and white firearm homicide rates – has the second-highest gap of any state between black and white incarceration rates, and the second-highest gap between black and white unemployment rates. Racial disparities in advantage translate into racial disparities in firearm violence victimization.

Americans are 128 times more likely to be killed in everyday gun violence than by any act of international terrorism. And a black person living in an urban area is almost 500 times more likely to be killed by everyday gun violence than by terrorism. From a public health perspective, efforts to combat firearm violence need to be every bit as strong as those to fight terrorism.

The first step in treating the epidemic of firearm violence is declaring that the everyday gun violence that is devastating the nation is unacceptable. Mass shootings and terrorist attacks should not be the only incidents of violence that awaken Americans to the threats to our freedom and spur politicians to action.

Molly Pahn, Research Manager, Boston University; Anita Knopov, Research fellow, Boston University, and Michael Siegel, Professor of Community Health Sciences, Boston University

This article was originally published on The Conversation. Read the original article.