In the 1980s, France went through a heroin epidemic in which hundreds of thousands became addicted. Mohamed Mechmache, a community activist, described the scene in the poor banlieues back then: “To begin with, they would disappear to shoot up. But after a bit we’d see them all over the place, in the stairwells and halls, the bike shed, up on the roof with the washing lines. We used to collect the syringes on the football pitch before starting to play,” he told The Guardian in 2014.
The rate of overdose deaths was rising 10 percent a year, yet treatment was mostly limited to counseling at special substance-abuse clinics.
In 1995, France changed its policy so that any doctor could prescribe buprenorphine, without special licensing or training. Buprenorphine, a first-line treatment for opioid addiction, is a medication that reduces cravings for opioids without itself being addictive.
With the change in policy, the majority of buprenorphine prescribers in France became primary-care doctors, rather than addiction specialists or psychiatrists. Suddenly, about 10 times as many addicted patients began receiving medication-assisted treatment, and half the country’s heroin users were being treated. Within four years, overdose deaths had declined by 79 percent.
Of course, France has a socialized medical system in which many users don’t have to worry about cost, and the country also developed a syringe-exchange program around the same time. Some of the users did sell or inject the buprenorphine (as opposed to taking it orally, as indicated), though these practices...
VIEQUES, P.R.—As the cry of a rooster heralded the dawn, Joe Garcia, 41, pulled a vial of insulin from the fridge. He filled a syringe and wrapped it in aluminum foil in preparation for the long day ahead.
“I tell him that from here to there, that’ll spoil,” said his mother, Martina Collazo de Jesus, 63, watching the preparations under the fluorescent bulb lighting the family kitchen.
It is a gamble that Garcia, who has both diabetes and kidney failure, has taken since Hurricane Maria slammed this island, just east of Puerto Rico’s mainland. More than six months after the storm, Garcia and 13 other Vieques residents must still board a plane three days a week for kidney dialysis on the main island.
Hurricane Maria totaled Vieques’s hospital, which housed the island’s only dialysis clinic.
That set off an ongoing crisis for patients with kidney failure like Garcia—who cannot survive without dialysis and for whom the thrice-weekly round trip to a dialysis center in Humacao on Puerto Rico’s main island, including treatment, takes at least 12 hours.
When seriously ill patients like Garcia will again be able to access their lifesaving treatments in Vieques remains uncertain, as federal and local officials and nonprofit groups debate strategy and finances. No one knows when the hospital will be rebuilt, either. And the government and nonprofit organizations continue to punt the responsibility of paying for the flights.
“This is really hard,” Garcia said, as he prepared for his long...
In 2016, I became the lucky parent of a newborn who slept horribly. Of course, this meant that my wife and I slept horribly, too. We rested in small snatches and were constantly irritable. We were a mess.
As a result, I became consumed with the idea of minimizing my need for sleep as much as possible. I had always required less sleep than my wife, but I thought that if I could just find some clever solution, some trick or tool, I might be cured of this time suck forever. I wanted to hack my need for sleep.
Eventually, I researched polyphasic sleep, a trend among the kind of people who quantify every aspect of their nutrient intake. Taken to its extreme, the practice promised the magic I was looking for: Simply sleep for 20 minutes or so every few hours, and eventually you’ll only need two or three hours of sleep a day. Sleep would be conquered! In its place, productive bliss.
I never ended up attempting polyphasic sleep. Its daunting requirements seemed destined to interfere with any semblance of normal family life. But the siren song of the quick, easy fix through a simple behavioral change or chemical consumed continues to appeal to me, as it does to many others: In Silicon Valley, this subset of biohacking is as strong as ever. Often coupled with its pharmacological sibling of nootropics (chemicals for cognitive enhancement), this trend of attempting to reengineer and overclock one’s physiology promises to make your...
Updated on April 12 at 4:55 p.m. ET
Vertis Boyce got the call from her transplant surgeon last July. We have a kidney for you, Jeffrey Veale explained on the phone, but it has an unusual backstory. The kidney was first transplanted two years ago from a 17-year-old girl into a man in his early 20s, who had just died unexpectedly in a car accident. Boyce would be its second recipient. Did she want it?
Boyce had by then been on dialysis for nine-and-a-half years and on the transplant list for nearly as long. “I thought, I’m 69 years old. When could I get a second chance? I really thought I wouldn’t get a kidney,” she recalls. So she said yes. Soon, she was on a plane from Las Vegas to Ronald Reagan UCLA Medical Center, where Veale performed the transplant.
Boyce took the chance because she did not want to be one of the 13 people who die waiting for a kidney transplant every day. The kidney-transplant list in the United States has 100,000 people, of whom only 17,000 will get transplants each year. In the face of this cold, brutal math, doctors have tried a variety of ways to expand the pool of available organs—accepting organs from older donors as well as from donors who suffered cardiac death rather than brain death. Reusing previously transplanted organs, however, is rarely considered. “It’s just dogma,” says Veale. “It’s almost like taboo to retransplant a kidney.”
I remember the bag from my childhood. Transparent and oblong, just large enough to fit a handful of papers, a few essentials, and a plastic brain.
My 93-year-old grandmother, Marjorie Pearlson, once loved this bag, filling it with conversation starters. She was a woman who could talk to any stranger and pull an organ replica out of her purse with a straight face. Growing up, I would witness this scene at the supermarket, in a post office, out at dinner between salad and the main course. It was brilliant performance art. She was passionate about her decision to donate her brain to science, following in the footsteps of her mother, who survived multiple brain surgeries, and her older brother. She spoke about the decision—visual aid in tow—matter-of-factly to anyone.
At her 90th-birthday tea party, we sat around my grandparents’ dining table with two of her childhood friends and my mother. My grandmother flipped through a handmade scrapbook of photographs, effortlessly recalling the first and last names of grade-school classmates, one of whom is my 92-year-old grandfather. Today, it’s far rarer to see her smile, much less get a glimpse of her past or quirky personality. A shield of advanced dementia has limited her mobility and dissolved her memories.
There’s an Indian death euphemism I’ve been dwelling on since the last time I saw my grandmother: “To be no more.” But even after she dies, my grandmother’s brain will, in a way, live on, joining the thousands of Americans who donate each year....
The advice column as we know it today started with a deception. In The Athenian Mercury, a London magazine that ran from 1690 to 1697, the Athenian Society—supposedly a group of 30-some experts across many fields—answered anonymous reader questions. They replied to all sorts of queries, as Jessica Weisberg recounts in her new book Asking for a Friend: “Why alcohol killed erections and made people slur, why horse excrement was square, if people born with missing body parts were also missing part of their soul, and if the sun was made of fire.”
In actuality, the Athenian Society was just a handful of men—a publisher named John Dunton, his two brothers-in-law, and a man who “they were 50 percent sure was a doctor,” Weisberg says.
But dubious expertise has never stopped anyone from giving advice. And since the days of the Mercury, people have continued to gobble up guidance from wherever it is on offer. Americans, especially, are enamored with advice, Weisberg writes, whether that comes in the classic form of a column like Dear Abby, from a self-help book like Dale Carnegie’s How to Win Friends and Influence People, or from the anonymous masses on Quora or Reddit.
Weisberg takes a wide sample of advice-givers across time and profiles each of them in depth: from Benjamin Franklin to Miss Manners to Joan Quigley, the astrologer who advised Nancy Reagan. She considers the book a work of “emotional history.” Advice both shapes and reflects the society it exists in; she believes...
We’ve known for some time now that Americans are increasingly dying younger, but the scale and nature of the problem have been a little murky. There was speculation that the downturn in American life expectancy was all thanks to “deaths of despair,” but some experts have said that might not be the full story, and that obesity and tobacco are still major factors in American mortality.
A new study out today in the Journal of the American Medical Association drills down into which states are showing increases in deaths among the young, and why. In doing so, it reveals a profound disparity among the states when it comes to both life expectancy and disability.
Most startlingly, since 1990, 21 states have seen an increase in the death rate among people aged 20 to 55. In five states—Kentucky, Oklahoma, New Mexico, West Virginia, and Wyoming—the probability of early death among young adults rose by more than 10 percent in that time frame. Meanwhile, in New York and California, young and middle-aged people became much less likely to die over the same period. The authors note that opioids, alcoholism, suicide, and kidney disease—which can be brought on by diabetes and alcoholism—were the main factors leading to the increases in early deaths.
In 2016, the 10 states with the highest probability of premature death among 20- to 55-year-olds were West Virginia, Mississippi, Alabama, Oklahoma, Kentucky, Arkansas, New Mexico, Louisiana, Tennessee, and South Carolina.
Meanwhile, the 10 states with the lowest...
I went to medical school, at least in part, to get to know death and perhaps to make my peace with it. So did many of my doctor friends, as I would find out. One day—usually when you’re young, though sometimes later—the thought hits you: You really are going to die. That moment is shocking, frightening, terrible. You try to pretend it hasn’t happened (it’s only a thought, after all), and you go about your business, worrying about this or that, until the day you put your hand to your neck—in the shower, say—and … What is that? Those hard lumps that you know, at first touch, should not be there? But there they are, and they mean death. Your death, and you can’t pretend anymore.
I never wanted to be surprised that way, and I thought that if I became a doctor and saw a lot of death, I might get used to it; it wouldn’t surprise me, and I could learn to live with it. My strategy worked pretty well. Over the decades, from all my patients, I learned that I would be well until I got sick and that although I could do some things to delay the inevitable a bit, whatever control I had was limited. I learned that I had to live as if I would die tomorrow and at the same time as if I would live forever. Meanwhile, I watched as what had been called “medical care”—that is, treating the sick—
Much as the role of the addictive multibillion-dollar painkiller OxyContin in the opioid crisis has stirred controversy and rancor nationwide, so it has divided members of the wealthy and philanthropic Sackler family, some of whom own the company that makes the drug.
In recent months, as protesters have begun pressuring the Metropolitan Museum of Art in New York and other cultural institutions to spurn donations from the Sacklers, one branch of the family has moved aggressively to distance itself from OxyContin and its manufacturer, Purdue Pharma. The widow and one daughter of Arthur Sackler, who owned a related Purdue company with his two brothers, maintain that none of his heirs have profited from sales of the drug. The daughter, Elizabeth Sackler, told The New York Times in January that Purdue Pharma’s involvement in the opioid epidemic was “morally abhorrent to me.”
Arthur died eight years before OxyContin hit the marketplace. His widow, Jillian Sackler, and Elizabeth, who is Jillian’s stepdaughter, are represented by separate public-relations firms and have won clarifications and corrections from media outlets that suggested sales of the potent opioid enriched Arthur Sackler or his family.
But an obscure court document sheds a different light on family history—and on the campaign by Arthur’s relatives to preserve their image and legacy. It shows that the Purdue family of companies made a nearly $20 million payment to the estate of Arthur Sackler in 1997—two years after OxyContin was approved, and just as the pill was becoming a big...
The biggest surprise about Tuesday’s shooting at YouTube wasn’t the fact that there was a shooting. Americans are horribly used to the ritual of these events by now: the sick feeling of waiting for the body count, the time it takes for biographical information to trickle out and a motive to be set forth, the think pieces advocating for fewer guns or more guns, excoriating white male rage or toxic masculinity. But one part of the script was upended in Tuesday’s shooting: The person holding the gun was a woman.
“Mass murder is typically a profoundly male act,” write the criminology professors Eric Madfis and Jeffrey W. Cohen in a paper published in Violence and Gender. The statistics leave no room for doubt: Women are far less likely to commit any sort of murder, much less mass murder. According to Extreme Killing: Understanding Serial and Mass Murder, 93.4 percent of mass killers are male, as are 88.3 percent of homicide offenders in general. And then someone like Nasim Najafi Aghdam, the YouTube shooter, comes along.
To be fair, Aghdam does not technically qualify as a mass murderer. She wounded three people before apparently killing herself, while a mass murderer is defined by the FBI as someone who kills four or more in a single incident, usually in a single location. She was an “active shooter,” which the Department of Homeland Security defines as “an individual actively engaged in killing or attempting to kill people in a...
Over a single weekend in March, an unprecedented disaster hit fertility clinics—twice.
First came the news that the University Hospitals Fertility Center in Ohio lost more than 4,000 eggs and embryos in a malfunctioning cryogenic tank. Then, in an unrelated incident, Pacific Fertility Center in California reported that liquid-nitrogen levels had fallen too low in a tank holding “several thousand” eggs and embryos, affecting an unconfirmed number of them.
In-vitro fertilization can be a draining process—financially, physically, emotionally. And for some families, these embryos had been their last chance to have biological children. Dozens of lawsuits have since been filed; parents and would-be parents spoke of the children they will never have, of the siblings their children will never know. At times, they spoke not just of “embryos” but of “babies.”
“How many babies are at risk right now while we sit, while we talk?” asked Wendy Penniman, who lost embryos in the malfunctioning Ohio tank, on the Today show.
On Friday, as first reported by Courthouse News, Penniman’s lawyer, Bruce Taubman, filed a complaint asking the court to consider an embryo a person. The filing—which comes in addition to a class-action lawsuit already filed March 12 on behalf of Penniman and her husband, Rick—asks for a declaratory judgment that “the life of a person begins at the moment of conception” and “the legal status of an embryo is that of a person.”
Taubman says that he filed the complaint to “unclog the logjam” of...
DACA, or Deferred Action for Childhood Arrivals, is the Obama-era policy that allows 1.3 million undocumented immigrants who came to the United States as children to stay and work here legally. Those who meet the criteria are protected from deportation for a period of two years, which can be renewed.
The Trump administration plunged this program into a state of uncertainty last September. First, it announced the end of DACA, saying the program wouldn’t be accepting new applicants and that everyone would be kicked out of the program starting March 5 of this year. However, a series of temporary court rulings earlier this year blocked the program’s termination, allowing DACA recipients to continue to apply for renewals to their status, just as they had under Obama.
President Trump then said he wanted to reach a more permanent deal with Congress to protect the Dreamers, as DACA recipients are called, from deportation in exchange for funding for the border wall. But then, a few days ago, Trump took to Twitter vowing to fight DACA supporters:
Border Patrol Agents are not allowed to properly do their job at the Border because of ridiculous liberal (Democrat) laws like Catch & Release. Getting more dangerous. “Caravans” coming. Republicans must go to Nuclear Option to pass tough laws NOW. NO MORE DACA DEAL!
— Donald J. Trump (@realDonaldTrump) April 1, 2018
He seemed to reveal a lack of knowledge of the program, writing:
Austin, Texas, recently experienced 19 days of terror at the hands of an unknown figure, as hundreds of law-enforcement officers crisscrossed the Texas capital in a race to track down a shadow. We now know the “who”: The bombings are suspected to have been perpetrated by a 23-year-old, homegrown, unemployed community-college dropout named Mark Anthony Conditt. Investigators probably know the makeup of the mechanical switches he used to detonate his seven explosive devices, ones filled with smokeless powder along with nails to enhance their shrapnel effect. The Bureau of Alcohol, Tobacco, Firearms, and Explosives (ATF) and the FBI likely rebuilt each device to study it. What we don’t know is the composition of the switch in the bomber’s head that, once flipped, allowed him to move forward with his assault on the sense of safety and security of the city of Austin. It’s the “why” we don’t understand.
Conditt, who would say in regard to these bombings, “I wish I were sorry, but I am not,” did not seem markedly different from other men and women his age in his community. Conditt once identified himself as politically conservative, with some making much of his six-year-old statements against abortion and gay marriage. Others countered that he was against sex offenders being labeled for life, a position perhaps associated with a more liberal base. So what drove him to murder? These were not spontaneous acts. These bombings were a planned, methodical series of decisions that he could have stopped...
Nine million veterans will soon be under the care of the emergency-physician rear admiral Ronny Jackson, pending confirmation of his appointment, announced Wednesday, to lead the U.S. Department of Veterans Affairs.
This seemingly mundane appointment—a doctor and naval officer with years of experience as White House physician under both Trump and Obama—is of great consequence. It comes at a time when the VA is in need of a politically savvy expert on health-care administration, budgeting, and resource allocation, as the system is on the brink of major changes that bear on national security. The system has proven to require a leader who can thread multiple bureaucratic needles with his or her eyes closed. Jackson does not clearly fit this bill.
The VA is the second-largest federal department, overseeing 1,243 health-care facilities, including 170 hospitals, which tend to be a ghostly network of dim, mid-century structures that bear the scars of serving as constant political battlefields. They tend to have bad food, no marble, and bizarre gift shops that I’ve seen sell knives and cured meats. Yet VA hospitals make the glitz of five-star-style academic medical centers look like waste: The system punches above its weight in the quality and safety of the care it delivers compared with most of the private health-care industry.
While it is crucial to have experienced veterans and physicians in the upper echelons of a system like this, the work is mostly about politics and economics. Jackson is not an expert in policy, and he lacks...
At the turn of the 20th century, prominent physicians who were trying to understand where mental illness comes from seized on a new theory: autointoxication. Intestinal microbes, these doctors suggested, are actually dangerous to their human hosts. They have a way of inducing “fatigue, melancholia, and the neuroses,” as a historical article in the journal Gut Pathogens recounts.
“The control of man’s diet is readily accomplished, but mastery over his intestinal bacterial flora is not,” wrote a doctor named Bond Stow in the Medical Record Journal of Medicine and Surgery in 1914. “The innumerable examples of autointoxication that one sees in his daily walks in life is proof thereof ... malaise, total lack of ambition so that every effort in life is a burden, mental depression often bordering upon melancholia.”
Stow went on to say that “a battle royal must be fought” with these intestinal germs.
Another physician, Daniel R. Brower of Rush Medical College, suspected that the increasing rates of melancholia—depression—in Western society might be the result of changing dietary habits and the resulting toxins dwelling in the gut.
Of course, like most medical ideas at the time, this one was not quite right. (And the proposed cures—removing part of the colon or eating rotten meat—seem worse than the disease.) Your gut doesn’t contain “toxins” that are poisonous so much as it hosts a diverse colony of bacteria called the “microbiome.” But these doctors were right about one thing: What we eat does affect how we feel, and gut microbes...
DURBAN, South Africa—Ronald Louw was a human-rights lawyer and professor at the University of KwaZulu-Natal, the South African province that’s one of the most HIV-affected regions of the world, so he must have known about the dangers of the virus. In April 2005, he was taking care of his mother, who had been diagnosed with cancer, when he noticed he had a cough that would not go away. He went to a doctor, who treated him with antibiotics.
Four weeks later, he got even worse, fighting a fever, night sweats, and disorientation, as his friend and fellow activist Zackie Achmat recounted later in a journal article. It was only then that Louw finally went in for an HIV test. He was positive.
A month later, doctors told him his persistent cough was actually tuberculosis—one of the leading causes of death for people with HIV. Three days later, Louw was dead at the age of 46.
“Smart, educated, and surrounded by friends who understand HIV/AIDS, yet even Louw failed to get tested early,” Achmat wrote later in an op-ed. “He died because he did not get tested early. And, when he discovered his HIV status, his lungs and immune system were destroyed.”
Louw’s case provides a stark example of one reason why the HIV epidemic in South Africa remains the largest in the world, even though the country provides antiretroviral treatment to anyone for free. ARVs, which slow down damage to the immune system, are now able to help HIV-positive...
Maternity Desert, a new documentary from The Atlantic, follows Amber Pierre, a 24-year-old African-American woman living in southeast D.C. Pierre is pregnant with her second child. After two previous miscarriages, she is navigating a high-risk pregnancy that, combined with her Medicaid coverage, requires she visit a hospital every two weeks to be seen by an Ob-Gyn.
Following the 2017 closures of Providence Hospital and United Medical Center, Pierre must travel to Medstar Washington Hospital Center to receive prenatal care—a trip that can take over an hour on public transportation. Pierre says long wait times and frequent rescheduling have cost her a waitressing job.*
Pierre lives in Anacostia, the area of D.C. most affected by the closures. In these neighborhoods, the population is 93 percent black and 32 percent below the poverty line. Across the U.S., black mothers are three times more likely to die from pregnancy-related complications than white mothers. In the nation’s capital, where the maternal mortality rate is already twice as high as the national average, two recent hospital closures have the potential to make this disparity much worse.
“Every black woman who makes it and has a full-term baby—it’s just like, ‘You made it!’” says Aza Nedhari, founder of the Washington, D.C., perinatal-support organization Mamatoto Village.
*This article has been updated to clarify how the long waits have inconvenienced Pierre.
“It starts in the back of my neck,” Javier Palejko told me over Skype. “It’s like I have a muscle there and I just make it work.”
The “it” in this case was goose bumps, which Palejko, a 34-year-old tech worker in Argentina, says he can control at will. Like most unexceptional people—by which I mean, people whose goose bumps appear only when we’re cold or feeling intense emotions—I could not even begin to imagine how to control goose bumps. I inquired: Could he do it, like, right now?
“Let’s try,” he said, angling the webcam toward his forearm. “Do you see it?” And sure enough, within two seconds, the hair follicles on his arm had become bumps, visible even on a grainy Skype video. “I thought everyone can do that,” Palejko said.
Everyone cannot do it. But Palejko is not alone, either. He is among dozens of people that James Heathers, a postdoctoral researcher at Northeastern University, identified during and after a recent study on the phenomenon. Heathers posted a preprint—which has not yet been peer reviewed—describing 32 people who can control their goose bumps, and he’s been contacted by several others since. Many of them, like Palejko, had thought this ability was perfectly ordinary for most of their lives. Palejko told me his brother can do it, too.
But this is not how the human nervous system usually works. Scientists think goose bumps are a reflex left over from our hairy ancestors, whose...
LITTLETON, Colo.—Evan Todd, then a sophomore at Columbine High School, was in the library on the day 19 years ago when Eric Harris appeared in the doorway, wielding a shotgun. Harris fired in his direction. Debris, shrapnel, and buckshot hit Todd’s lower back; he fell to the ground and ducked behind a copy machine. Harris fired several more shots toward Todd’s head, splintering a desk and driving wood chips into Todd’s left eye.
Todd listened for several more minutes as Harris and Dylan Klebold murdered their classmates, taunting them as they screamed. Todd prayed silently: “God, let me live.”
Then Klebold pulled back a chair and found Todd hiding underneath a table.
He put a gun to Todd’s head. “Why shouldn’t I kill you?” he asked.
“I’ve been good to you,” Todd said.
Klebold looked at Harris. “You can kill him if you want,” Klebold told his teenage co-conspirator.
No one knows why—indeed, no one knows the “why” behind such violence—but that’s when Harris and Klebold left the library. Todd got to live.
Thirteen people did not, though. Today, that’s why Todd supports allowing teachers to have guns in schools. Teachers shouldn’t be required to be armed, he says, but if they already have a concealed-weapons permit, and they’re already comfortable using a gun, why not let them have it with them in school, the place they are most of the day, and the place where these attacks happen over and over again?
Today, Todd is a stocky, bearded manager of construction projects, and describes himself as...
In the first year of his administration, Donald Trump has repeatedly filled important scientific positions with candidates who seem to be either unqualified for the roles or diametrically opposed to the very purpose of those roles. Scott Pruitt was chosen to lead the Environmental Protection Agency after having repeatedly sued it. Rick Perry became Secretary of Energy, heading a department that he formerly wanted to eliminate and that he couldn’t remember the name of. Sam Clovis, a now-withdrawn nominee for chief scientist at the United States Department of Agriculture, had no scientific background. Brenda Fitzgerald seemed a reasonable choice to direct the Centers for Disease Control and Prevention (CDC) but was forced to resign after Politico reported that she had bought shares in a tobacco company shortly after taking up her post.
Given this parade of foxes in henhouses, it should have been a moment of joyous relief when Robert Redfield was named the new director of the Centers for Disease Control and Prevention on Wednesday. A leading virologist, Redfield has spent more than 30 years researching HIV and other infectious diseases. He served in the U.S. Army Medical Corps for 20 years and later cofounded the Institute of Human Virology at the University of Maryland School of Medicine, where he now acts as associate director. He has overseen a clinical program that treats more than 5,000 patients in the Baltimore-Washington area, and has experience treating people in sub-Saharan Africa. His supporters speak of him...