Tag archive: Suicide

The Terrible Costs of a Phone-Based Childhood (The Atlantic)

theatlantic.com

The environment in which kids grow up today is hostile to human development.

By Jonathan Haidt

Photographs by Maggie Shannon

MARCH 13, 2024




Something went suddenly and horribly wrong for adolescents in the early 2010s. By now you’ve likely seen the statistics: Rates of depression and anxiety in the United States—fairly stable in the 2000s—rose by more than 50 percent in many studies from 2010 to 2019. The suicide rate rose 48 percent for adolescents ages 10 to 19. For girls ages 10 to 14, it rose 131 percent.

The problem was not limited to the U.S.: Similar patterns emerged around the same time in Canada, the U.K., Australia, New Zealand, the Nordic countries, and beyond. By a variety of measures and in a variety of countries, the members of Generation Z (born in and after 1996) are suffering from anxiety, depression, self-harm, and related disorders at levels higher than any other generation for which we have data.

The decline in mental health is just one of many signs that something went awry. Loneliness and friendlessness among American teens began to surge around 2012. Academic achievement went down, too. According to “The Nation’s Report Card,” scores in reading and math began to decline for U.S. students after 2012, reversing decades of slow but generally steady increase. PISA, the major international measure of educational trends, shows that declines in math, reading, and science happened globally, also beginning in the early 2010s.

As the oldest members of Gen Z reach their late 20s, their troubles are carrying over into adulthood. Young adults are dating less, having less sex, and showing less interest in ever having children than prior generations. They are more likely to live with their parents. They were less likely to get jobs as teens, and managers say they are harder to work with. Many of these trends began with earlier generations, but most of them accelerated with Gen Z.

Surveys show that members of Gen Z are shyer and more risk averse than previous generations, too, and risk aversion may make them less ambitious. In an interview last May, OpenAI co-founder Sam Altman and Stripe co-founder Patrick Collison noted that, for the first time since the 1970s, none of Silicon Valley’s preeminent entrepreneurs are under 30. “Something has really gone wrong,” Altman said. In a famously young industry, he was baffled by the sudden absence of great founders in their 20s.

Generations are not monolithic, of course. Many young people are flourishing. Taken as a whole, however, Gen Z is in poor mental health and is lagging behind previous generations on many important metrics. And if a generation is doing poorly––if it is more anxious and depressed and is starting families, careers, and important companies at a substantially lower rate than previous generations––then the sociological and economic consequences will be profound for the entire society.

Chart: Number of emergency-department visits for nonfatal self-harm per 100,000 children (source: Centers for Disease Control and Prevention)

What happened in the early 2010s that altered adolescent development and worsened mental health? Theories abound, but the fact that similar trends are found in many countries worldwide means that events and trends that are specific to the United States cannot be the main story.

I think the answer can be stated simply, although the underlying psychology is complex: Those were the years when adolescents in rich countries traded in their flip phones for smartphones and moved much more of their social lives online—particularly onto social-media platforms designed for virality and addiction. Once young people began carrying the entire internet in their pockets, available to them day and night, it altered their daily experiences and developmental pathways across the board. Friendship, dating, sexuality, exercise, sleep, academics, politics, family dynamics, identity—all were affected. Life changed rapidly for younger children, too, as they began to get access to their parents’ smartphones and, later, got their own iPads, laptops, and even smartphones during elementary school.


As a social psychologist who has long studied social and moral development, I have been involved in debates about the effects of digital technology for years. Typically, the scientific questions have been framed somewhat narrowly, to make them easier to address with data. For example, do adolescents who consume more social media have higher levels of depression? Does using a smartphone just before bedtime interfere with sleep? The answer to these questions is usually found to be yes, although the size of the relationship is often statistically small, which has led some researchers to conclude that these new technologies are not responsible for the gigantic increases in mental illness that began in the early 2010s.

But before we can evaluate the evidence on any one potential avenue of harm, we need to step back and ask a broader question: What is childhood––including adolescence––and how did it change when smartphones moved to the center of it? If we take a more holistic view of what childhood is and what young children, tweens, and teens need to do to mature into competent adults, the picture becomes much clearer. Smartphone-based life, it turns out, alters or interferes with a great number of developmental processes.

The intrusion of smartphones and social media is not the only change that has deformed childhood. There’s an important backstory, beginning as long ago as the 1980s, when we started systematically depriving children and adolescents of freedom, unsupervised play, responsibility, and opportunities for risk taking, all of which promote competence, maturity, and mental health. But the change in childhood accelerated in the early 2010s, when an already independence-deprived generation was lured into a new virtual universe that seemed safe to parents but in fact is more dangerous, in many respects, than the physical world.

My claim is that the new phone-based childhood that took shape roughly 12 years ago is making young people sick and blocking their progress to flourishing in adulthood. We need a dramatic cultural correction, and we need it now.

1. The Decline of Play and Independence

Human brains are extraordinarily large compared with those of other primates, and human childhoods are extraordinarily long, too, to give those large brains time to wire up within a particular culture. A child’s brain is already 90 percent of its adult size by about age 6. The next 10 or 15 years are about learning norms and mastering skills—physical, analytical, creative, and social. As children and adolescents seek out experiences and practice a wide variety of behaviors, the synapses and neurons that are used frequently are retained while those that are used less often disappear. Neurons that fire together wire together, as brain researchers say.

Brain development is sometimes said to be “experience-expectant,” because specific parts of the brain show increased plasticity during periods of life when an animal’s brain can “expect” to have certain kinds of experiences. You can see this with baby geese, who will imprint on whatever mother-sized object moves in their vicinity just after they hatch. You can see it with human children, who are able to learn languages quickly and take on the local accent, but only through early puberty; after that, it’s hard to learn a language and sound like a native speaker. There is also some evidence of a sensitive period for cultural learning more generally. Japanese children who spent a few years in California in the 1970s came to feel “American” in their identity and ways of interacting only if they attended American schools for a few years between ages 9 and 15. If they left before age 9, there was no lasting impact. If they didn’t arrive until they were 15, it was too late; they didn’t come to feel American.

Human childhood is an extended cultural apprenticeship with different tasks at different ages all the way through puberty. Once we see it this way, we can identify factors that promote or impede the right kinds of learning at each age. For children of all ages, one of the most powerful drivers of learning is the strong motivation to play. Play is the work of childhood, and all young mammals have the same job: to wire up their brains by playing vigorously and often, practicing the moves and skills they’ll need as adults. Kittens will play-pounce on anything that looks like a mouse tail. Human children will play games such as tag and sharks and minnows, which let them practice both their predator skills and their escaping-from-predator skills. Adolescents will play sports with greater intensity, and will incorporate playfulness into their social interactions—flirting, teasing, and developing inside jokes that bond friends together. Hundreds of studies on young rats, monkeys, and humans show that young mammals want to play, need to play, and end up socially, cognitively, and emotionally impaired when they are deprived of play.

One crucial aspect of play is physical risk taking. Children and adolescents must take risks and fail—often—in environments in which failure is not very costly. This is how they extend their abilities, overcome their fears, learn to estimate risk, and learn to cooperate in order to take on larger challenges later. The ever-present possibility of getting hurt while running around, exploring, play-fighting, or getting into a real conflict with another group adds an element of thrill, and thrilling play appears to be the most effective kind for overcoming childhood anxieties and building social, emotional, and physical competence. The desire for risk and thrill increases in the teen years, when failure might carry more serious consequences. Children of all ages need to choose the risk they are ready for at a given moment. Young people who are deprived of opportunities for risk taking and independent exploration will, on average, develop into more anxious and risk-averse adults.

Human childhood and adolescence evolved outdoors, in a physical world full of dangers and opportunities. Its central activities––play, exploration, and intense socializing––were largely unsupervised by adults, allowing children to make their own choices, resolve their own conflicts, and take care of one another. Shared adventures and shared adversity bound young people together into strong friendship clusters within which they mastered the social dynamics of small groups, which prepared them to master bigger challenges and larger groups later on.

And then we changed childhood.

The changes started slowly in the late 1970s and ’80s, before the arrival of the internet, as many parents in the U.S. grew fearful that their children would be harmed or abducted if left unsupervised. Such crimes have always been extremely rare, but they loomed larger in parents’ minds thanks in part to rising levels of street crime combined with the arrival of cable TV, which enabled round-the-clock coverage of missing-children cases. A general decline in social capital––the degree to which people knew and trusted their neighbors and institutions––exacerbated parental fears. Meanwhile, rising competition for college admissions encouraged more intensive forms of parenting. In the 1990s, American parents began pulling their children indoors or insisting that afternoons be spent in adult-run enrichment activities. Free play, independent exploration, and teen-hangout time declined.

In recent decades, seeing unchaperoned children outdoors has become so novel that when one is spotted in the wild, some adults feel it is their duty to call the police. In 2015, the Pew Research Center found that parents, on average, believed that children should be at least 10 years old to play unsupervised in front of their house, and that kids should be 14 before being allowed to go unsupervised to a public park. Most of these same parents had enjoyed joyous and unsupervised outdoor play by the age of 7 or 8.

But overprotection is only part of the story. The transition away from a more independent childhood was facilitated by steady improvements in digital technology, which made it easier and more inviting for young people to spend a lot more time at home, indoors, and alone in their rooms. Eventually, tech companies got access to children 24/7. They developed exciting virtual activities, engineered for “engagement,” that are nothing like the real-world experiences young brains evolved to expect.


2. The Virtual World Arrives in Two Waves

The internet, which now dominates the lives of young people, arrived in two waves of linked technologies. The first one did little harm to Millennials. The second one swallowed Gen Z whole.

The first wave came ashore in the 1990s with the arrival of dial-up internet access, which made personal computers good for something beyond word processing and basic games. By 2003, 55 percent of American households had a computer with (slow) internet access. Rates of adolescent depression, loneliness, and other measures of poor mental health did not rise in this first wave. If anything, they went down a bit. Millennial teens (born 1981 through 1995), who were the first to go through puberty with access to the internet, were psychologically healthier and happier, on average, than their older siblings or parents in Generation X (born 1965 through 1980).

The second wave began to rise in the 2000s, though its full force didn’t hit until the early 2010s. It began rather innocently with the introduction of social-media platforms that helped people connect with their friends. Posting and sharing content became much easier with sites such as Friendster (launched in 2002), Myspace (2003), and Facebook (2004).

Teens embraced social media soon after it came out, but the time they could spend on these sites was limited in those early years because the sites could only be accessed from a computer, often the family computer in the living room. Young people couldn’t access social media (and the rest of the internet) from the school bus, during class time, or while hanging out with friends outdoors. Many teens in the early-to-mid-2000s had cellphones, but these were basic phones (many of them flip phones) that had no internet access. Typing on them was difficult––they had only number keys. Basic phones were tools that helped Millennials meet up with one another in person or talk with each other one-on-one. I have seen no evidence to suggest that basic cellphones harmed the mental health of Millennials.

It was not until the introduction of the iPhone (2007), the App Store (2008), and high-speed internet (which reached 50 percent of American homes in 2007)—and the corresponding pivot to mobile made by many providers of social media, video games, and porn—that it became possible for adolescents to spend nearly every waking moment online. The extraordinary synergy among these innovations was what powered the second technological wave. In 2011, only 23 percent of teens had a smartphone. By 2015, that number had risen to 73 percent, and a quarter of teens said they were online “almost constantly.” Their younger siblings in elementary school didn’t usually have their own smartphones, but after its release in 2010, the iPad quickly became a staple of young children’s daily lives. It was in this brief period, from 2010 to 2015, that childhood in America (and many other countries) was rewired into a form that was more sedentary, solitary, virtual, and incompatible with healthy human development.

3. Techno-optimism and the Birth of the Phone-Based Childhood

The phone-based childhood created by that second wave—including not just smartphones themselves, but all manner of internet-connected devices, such as tablets, laptops, video-game consoles, and smartwatches—arrived near the end of a period of enormous optimism about digital technology. The internet came into our lives in the mid-1990s, soon after the fall of the Soviet Union. By the end of that decade, it was widely thought that the web would be an ally of democracy and a slayer of tyrants. When people are connected to each other, and to all the information in the world, how could any dictator keep them down?

In the 2000s, Silicon Valley and its world-changing inventions were a source of pride and excitement in America. Smart and ambitious young people around the world wanted to move to the West Coast to be part of the digital revolution. Tech-company founders such as Steve Jobs and Sergey Brin were lauded as gods, or at least as modern Prometheans, bringing humans godlike powers. The Arab Spring bloomed in 2011 with the help of decentralized social platforms, including Twitter and Facebook. When pundits and entrepreneurs talked about the power of social media to transform society, it didn’t sound like a dark prophecy.

You have to put yourself back in this heady time to understand why adults acquiesced so readily to the rapid transformation of childhood. Many parents had concerns, even then, about what their children were doing online, especially because of the internet’s ability to put children in contact with strangers. But there was also a lot of excitement about the upsides of this new digital world. If computers and the internet were the vanguards of progress, and if young people––widely referred to as “digital natives”––were going to live their lives entwined with these technologies, then why not give them a head start? I remember how exciting it was to see my 2-year-old son master the touch-and-swipe interface of my first iPhone in 2008. I thought I could see his neurons being woven together faster as a result of the stimulation it brought to his brain, compared to the passivity of watching television or the slowness of building a block tower. I thought I could see his future job prospects improving.

Touchscreen devices were also a godsend for harried parents. Many of us discovered that we could have peace at a restaurant, on a long car trip, or at home while making dinner or replying to emails if we just gave our children what they most wanted: our smartphones and tablets. We saw that everyone else was doing it and figured it must be okay.

It was the same for older children, desperate to join their friends on social-media platforms, where the minimum age to open an account was set by law to 13, even though no research had been done to establish the safety of these products for minors. Because the platforms did nothing (and still do nothing) to verify the stated age of new-account applicants, any 10-year-old could open multiple accounts without parental permission or knowledge, and many did. Facebook and later Instagram became places where many sixth and seventh graders were hanging out and socializing. If parents did find out about these accounts, it was too late. Nobody wanted their child to be isolated and alone, so parents rarely forced their children to shut down their accounts.

We had no idea what we were doing.

4. The High Cost of a Phone-Based Childhood

In Walden, his 1854 reflection on simple living, Henry David Thoreau wrote, “The cost of a thing is the amount of … life which is required to be exchanged for it, immediately or in the long run.” It’s an elegant formulation of what economists would later call the opportunity cost of any choice—all of the things you can no longer do with your money and time once you’ve committed them to something else. So it’s important that we grasp just how much of a young person’s day is now taken up by their devices.

The numbers are hard to believe. The most recent Gallup data show that American teens spend about five hours a day just on social-media platforms (including watching videos on TikTok and YouTube). Add in all the other phone- and screen-based activities, and the number rises to somewhere between seven and nine hours a day, on average. The numbers are even higher in single-parent and low-income families, and among Black, Hispanic, and Native American families.

These very high numbers do not include time spent in front of screens for school or homework, nor do they include all the time adolescents spend paying only partial attention to events in the real world while thinking about what they’re missing on social media or waiting for their phones to ping. Pew reports that in 2022, one-third of teens said they were on one of the major social-media sites “almost constantly,” and nearly half said the same of the internet in general. For these heavy users, nearly every waking hour is an hour absorbed, in full or in part, by their devices.


In Thoreau’s terms, how much of life is exchanged for all this screen time? Arguably, most of it. Everything else in an adolescent’s day must get squeezed down or eliminated entirely to make room for the vast amount of content that is consumed, and for the hundreds of “friends,” “followers,” and other network connections that must be serviced with texts, posts, comments, likes, snaps, and direct messages. I recently surveyed my students at NYU, and most of them reported that the very first thing they do when they open their eyes in the morning is check their texts, direct messages, and social-media feeds. It’s also the last thing they do before they close their eyes at night. And it’s a lot of what they do in between.

The amount of time that adolescents spend sleeping declined in the early 2010s, and many studies tie sleep loss directly to the use of devices around bedtime, particularly when they’re used to scroll through social media. Exercise declined, too, which is unfortunate because exercise, like sleep, improves both mental and physical health. Book reading has been declining for decades, pushed aside by digital alternatives, but the decline, like so much else, sped up in the early 2010s. With passive entertainment always available, adolescent minds likely wander less than they used to; contemplation and imagination might be placed on the list of things winnowed down or crowded out.

But perhaps the most devastating cost of the new phone-based childhood was the collapse of time spent interacting with other people face-to-face. A study of how Americans spend their time found that, before 2010, young people (ages 15 to 24) reported spending far more time with their friends (about two hours a day, on average, not counting time together at school) than did older people (who spent just 30 to 60 minutes with friends). Time with friends began decreasing for young people in the 2000s, but the drop accelerated in the 2010s, while it barely changed for older people. By 2019, young people’s time with friends had dropped to just 67 minutes a day. It turns out that Gen Z had been socially distancing for many years and had mostly completed the project by the time COVID-19 struck.

You might question the importance of this decline. After all, isn’t much of this online time spent interacting with friends through texting, social media, and multiplayer video games? Isn’t that just as good?

Some of it surely is, and virtual interactions offer unique benefits too, especially for young people who are geographically or socially isolated. But in general, the virtual world lacks many of the features that make human interactions in the real world nutritious, as we might say, for physical, social, and emotional development. In particular, real-world relationships and social interactions are characterized by four features—typical for hundreds of thousands of years—that online interactions either distort or erase.

First, real-world interactions are embodied, meaning that we use our hands and facial expressions to communicate, and we learn to respond to the body language of others. Virtual interactions, in contrast, mostly rely on language alone. No matter how many emojis are offered as compensation, the elimination of communication channels for which we have eons of evolutionary programming is likely to produce adults who are less comfortable and less skilled at interacting in person.

Second, real-world interactions are synchronous; they happen at the same time. As a result, we learn subtle cues about timing and conversational turn taking. Synchronous interactions make us feel closer to the other person because that’s what getting “in sync” does. Texts, posts, and many other virtual interactions lack synchrony. There is less real laughter, more room for misinterpretation, and more stress after a comment that gets no immediate response.

Third, real-world interactions primarily involve one-to-one communication, or sometimes one-to-several. But many virtual communications are broadcast to a potentially huge audience. Online, each person can engage in dozens of asynchronous interactions in parallel, which interferes with the depth achieved in all of them. The sender’s motivations are different, too: With a large audience, one’s reputation is always on the line; an error or poor performance can damage social standing with large numbers of peers. These communications thus tend to be more performative and anxiety-inducing than one-to-one conversations.

Finally, real-world interactions usually take place within communities that have a high bar for entry and exit, so people are strongly motivated to invest in relationships and repair rifts when they happen. But in many virtual networks, people can easily block others or quit when they are displeased. Relationships within such networks are usually more disposable.

These unsatisfying and anxiety-producing features of life online should be recognizable to most adults. Online interactions can bring out antisocial behavior that people would never display in their offline communities. But if life online takes a toll on adults, just imagine what it does to adolescents in the early years of puberty, when their “experience expectant” brains are rewiring based on feedback from their social interactions.

Kids going through puberty online are likely to experience far more social comparison, self-consciousness, public shaming, and chronic anxiety than adolescents in previous generations, which could potentially set developing brains into a habitual state of defensiveness. The brain contains systems that are specialized for approach (when opportunities beckon) and withdrawal (when threats appear or seem likely). People can be in what we might call “discover mode” or “defend mode” at any moment, but generally not both. The two systems together form a mechanism for quickly adapting to changing conditions, like a thermostat that can activate either a heating system or a cooling system as the temperature fluctuates. Some people’s internal thermostats are generally set to discover mode, and they flip into defend mode only when clear threats arise. These people tend to see the world as full of opportunities. They are happier and less anxious. Other people’s internal thermostats are generally set to defend mode, and they flip into discover mode only when they feel unusually safe. They tend to see the world as full of threats and are more prone to anxiety and depressive disorders.

Chart: Percentage of U.S. college freshmen reporting various kinds of disabilities and disorders (source: Higher Education Research Institute)

A simple way to understand the differences between Gen Z and previous generations is that people born in and after 1996 have internal thermostats that were shifted toward defend mode. This is why life on college campuses changed so suddenly when Gen Z arrived, beginning around 2014. Students began requesting “safe spaces” and trigger warnings. They were highly sensitive to “microaggressions” and sometimes claimed that words were “violence.” These trends mystified those of us in older generations at the time, but in hindsight, it all makes sense. Gen Z students found words, ideas, and ambiguous social encounters more threatening than had previous generations of students because we had fundamentally altered their psychological development.

5. So Many Harms

The debate around adolescents’ use of smartphones and social media typically revolves around mental health, and understandably so. But the harms that have resulted from transforming childhood so suddenly and heedlessly go far beyond mental health. I’ve touched on some of them—social awkwardness, reduced self-confidence, and a more sedentary childhood. Here are three additional harms.

Fragmented Attention, Disrupted Learning

Staying on task while sitting at a computer is hard enough for an adult with a fully developed prefrontal cortex. It is far more difficult for adolescents sitting in front of a laptop, trying to do homework. They are probably less intrinsically motivated to stay on task. They’re certainly less able, given their undeveloped prefrontal cortex, and hence it’s easy for any company with an app to lure them away with an offer of social validation or entertainment. Their phones are pinging constantly—one study found that the typical adolescent now gets 237 notifications a day, roughly 15 every waking hour. Sustained attention is essential for doing almost anything big, creative, or valuable, yet young people find their attention chopped up into little bits by notifications offering the possibility of high-pleasure, low-effort digital experiences.

It even happens in the classroom. Studies confirm that when students have access to their phones during class time, they use them, especially for texting and checking social media, and their grades and learning suffer. This might explain why benchmark test scores began to decline in the U.S. and around the world in the early 2010s—well before the pandemic hit.

Addiction and Social Withdrawal

The neural basis of behavioral addiction to social media or video games is not exactly the same as chemical addiction to cocaine or opioids. Nonetheless, they all involve abnormally heavy and sustained activation of dopamine neurons and reward pathways. Over time, the brain adapts to these high levels of dopamine; when the child is not engaged in digital activity, their brain doesn’t have enough dopamine, and the child experiences withdrawal symptoms. These generally include anxiety, insomnia, and intense irritability. Kids with these kinds of behavioral addictions often become surly and aggressive, and withdraw from their families into their bedrooms and devices.

Social-media and gaming platforms were designed to hook users. How successful are they? How many kids suffer from digital addictions?

The main addiction risks for boys seem to be video games and porn. “Internet gaming disorder,” which was added to the main diagnostic manual of psychiatry in 2013 as a condition for further study, describes “significant impairment or distress” in several aspects of life, along with many hallmarks of addiction, including an inability to reduce usage despite attempts to do so. Estimates for the prevalence of IGD range from 7 to 15 percent among adolescent boys and young men. As for porn, a nationally representative survey of American adults published in 2019 found that 7 percent of American men agreed or strongly agreed with the statement “I am addicted to pornography”—and the rates were higher for the youngest men.

Girls have much lower rates of addiction to video games and porn, but they use social media more intensely than boys do. A study of teens in 29 nations found that between 5 and 15 percent of adolescents engage in what is called “problematic social media use,” which includes symptoms such as preoccupation, withdrawal symptoms, neglect of other areas of life, and lying to parents and friends about time spent on social media. That study did not break down results by gender, but many others have found that rates of “problematic use” are higher for girls.

I don’t want to overstate the risks: Most teens do not become addicted to their phones and video games. But across multiple studies and across genders, rates of problematic use come out in the ballpark of 5 to 15 percent. Is there any other consumer product that parents would let their children use relatively freely if they knew that something like one in 10 kids would end up with a pattern of habitual and compulsive use that disrupted various domains of life and looked a lot like an addiction?

The Decay of Wisdom and the Loss of Meaning

During that crucial sensitive period for cultural learning, from roughly ages 9 through 15, we should be especially thoughtful about who is socializing our children for adulthood. Instead, that’s when most kids get their first smartphone and sign themselves up (with or without parental permission) to consume rivers of content from random strangers. Much of that content is produced by other adolescents, in blocks of a few minutes or a few seconds.

This rerouting of enculturating content has created a generation that is largely cut off from older generations and, to some extent, from the accumulated wisdom of humankind, including knowledge about how to live a flourishing life. Adolescents spend less time steeped in their local or national culture. They are coming of age in a confusing, placeless, ahistorical maelstrom of 30-second stories curated by algorithms designed to mesmerize them. Without solid knowledge of the past and the filtering of good ideas from bad––a process that plays out over many generations––young people will be more prone to believe whatever terrible ideas become popular around them, which might explain why videos showing young people reacting positively to Osama bin Laden’s thoughts about America were trending on TikTok last fall.

All this is made worse by the fact that so much of digital public life is an unending supply of micro dramas about somebody somewhere in our country of 340 million people who did something that can fuel an outrage cycle, only to be pushed aside by the next. It doesn’t add up to anything and leaves behind only a distorted sense of human nature and affairs.

When our public life becomes fragmented, ephemeral, and incomprehensible, it is a recipe for anomie, or normlessness. The great French sociologist Émile Durkheim showed long ago that a society that fails to bind its people together with some shared sense of sacredness and common respect for rules and norms is not a society of great individual freedom; it is, rather, a place where disoriented individuals have difficulty setting goals and exerting themselves to achieve them. Durkheim argued that anomie was a major driver of suicide rates in European countries. Modern scholars continue to draw on his work to understand suicide rates today.

Chart: Percentage of U.S. high-school seniors who agreed with the statement “Life often seems meaningless.” (Source: Monitoring the Future)

Durkheim’s observations are crucial for understanding what happened in the early 2010s. A long-running survey of American teens found that, from 1990 to 2010, high-school seniors became slightly less likely to agree with statements such as “Life often seems meaningless.” But as soon as they adopted a phone-based life and many began to live in the whirlpool of social media, where no stability can be found, every measure of despair increased. From 2010 to 2019, the number who agreed that their lives felt “meaningless” increased by about 70 percent, to more than one in five.

6. Young People Don’t Like Their Phone-Based Lives

How can I be confident that the epidemic of adolescent mental illness was kicked off by the arrival of the phone-based childhood? Skeptics point to other events as possible culprits, including the 2008 global financial crisis, global warming, the 2012 Sandy Hook school shooting and the subsequent active-shooter drills, rising academic pressures, and the opioid epidemic. But while these events might have been contributing factors in some countries, none can explain both the timing and international scope of the disaster.

An additional source of evidence comes from Gen Z itself. With all the talk of regulating social media, raising age limits, and getting phones out of schools, you might expect to find many members of Gen Z writing and speaking out in opposition. I’ve looked for such arguments and found hardly any. In contrast, many young adults tell stories of devastation.

Freya India, a 24-year-old British essayist who writes about girls, explains how social-media sites carry girls off to unhealthy places: “It seems like your child is simply watching some makeup tutorials, following some mental health influencers, or experimenting with their identity. But let me tell you: they are on a conveyor belt to someplace bad. Whatever insecurity or vulnerability they are struggling with, they will be pushed further and further into it.” She continues:

Gen Z were the guinea pigs in this uncontrolled global social experiment. We were the first to have our vulnerabilities and insecurities fed into a machine that magnified and refracted them back at us, all the time, before we had any sense of who we were. We didn’t just grow up with algorithms. They raised us. They rearranged our faces. Shaped our identities. Convinced us we were sick.

Rikki Schlott, a 23-year-old American journalist and co-author of The Canceling of the American Mind, writes,

The day-to-day life of a typical teen or tween today would be unrecognizable to someone who came of age before the smartphone arrived. Zoomers are spending an average of 9 hours daily in this screen-time doom loop—desperate to forget the gaping holes they’re bleeding out of, even if just for … 9 hours a day. Uncomfortable silence could be time to ponder why they’re so miserable in the first place. Drowning it out with algorithmic white noise is far easier.

A 27-year-old man who spent his adolescent years addicted (his word) to video games and pornography sent me this reflection on what that did to him:

I missed out on a lot of stuff in life—a lot of socialization. I feel the effects now: meeting new people, talking to people. I feel that my interactions are not as smooth and fluid as I want. My knowledge of the world (geography, politics, etc.) is lacking. I didn’t spend time having conversations or learning about sports. I often feel like a hollow operating system.

Or consider what Facebook found in a research project involving focus groups of young people, revealed in 2021 by the whistleblower Frances Haugen: “Teens blame Instagram for increases in the rates of anxiety and depression among teens,” an internal document said. “This reaction was unprompted and consistent across all groups.”

How can it be that an entire generation is hooked on consumer products that so few praise and so many ultimately regret using? Because smartphones and especially social media have put members of Gen Z and their parents into a series of collective-action traps. Once you understand the dynamics of these traps, the escape routes become clear.


7. Collective-Action Problems

Social-media companies such as Meta, TikTok, and Snap are often compared to tobacco companies, but that’s not really fair to the tobacco industry. It’s true that companies in both industries marketed harmful products to children and tweaked their products for maximum customer retention (that is, addiction), but there’s a big difference: Teens could and did choose, in large numbers, not to smoke. Even at the peak of teen cigarette use, in 1997, nearly two-thirds of high-school students did not smoke.

Social media, in contrast, applies a lot more pressure on nonusers, at a much younger age and in a more insidious way. Once a few students in any middle school lie about their age and open accounts at age 11 or 12, they start posting photos and comments about themselves and other students. Drama ensues. The pressure on everyone else to join becomes intense. Even a girl who knows, consciously, that Instagram can foster beauty obsession, anxiety, and eating disorders might sooner take those risks than accept the seeming certainty of being out of the loop, clueless, and excluded. And indeed, if she resists while most of her classmates do not, she might, in fact, be marginalized, which puts her at risk for anxiety and depression, though via a different pathway than the one taken by those who use social media heavily. In this way, social media accomplishes a remarkable feat: It even harms adolescents who do not use it.

A recent study led by the University of Chicago economist Leonardo Bursztyn captured the dynamics of the social-media trap precisely. The researchers recruited more than 1,000 college students and asked them how much they’d need to be paid to deactivate their accounts on either Instagram or TikTok for four weeks. That’s a standard economist’s question to try to compute the net value of a product to society. On average, students said they’d need to be paid roughly $50 ($59 for TikTok, $47 for Instagram) to deactivate whichever platform they were asked about. Then the experimenters told the students that they were going to try to get most of the others in their school to deactivate that same platform, offering to pay them to do so as well, and asked, Now how much would you have to be paid to deactivate, if most others did so? The answer, on average, was less than zero. In each case, most students were willing to pay to have that happen.

Social media is all about network effects. Most students are only on it because everyone else is too. Most of them would prefer that nobody be on these platforms. Later in the study, students were asked directly, “Would you prefer to live in a world without Instagram [or TikTok]?” A majority of students said yes––58 percent for each app.

This is the textbook definition of what social scientists call a collective-action problem. It’s what happens when a group would be better off if everyone in the group took a particular action, but each actor is deterred from acting, because unless the others do the same, the personal cost outweighs the benefit. Fishermen considering limiting their catch to avoid wiping out the local fish population are caught in this same kind of trap. If no one else does it too, they just lose profit.
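To make the structure of the trap concrete, here is a minimal sketch in Python. The payoff numbers and the `utility` function are hypothetical assumptions, chosen only to reproduce the qualitative pattern the Bursztyn study reports (students demand roughly $50 to deactivate alone, but would pay for everyone to deactivate together); they are not figures or a model from the study itself.

```python
# Toy model of the social-media collective-action trap.
# All dollar values are hypothetical; they are chosen only to mirror the
# qualitative pattern in the study: a student demands payment to leave the
# platform alone, yet prefers a world in which everyone leaves.

def utility(on_platform: bool, peers_on: bool) -> float:
    """One student's payoff (dollars per four weeks) from being on or off
    the platform, depending on whether their peers are on it."""
    if peers_on:
        # Peers post and gossip there, so leaving alone means missing out.
        return 50.0 if on_platform else 0.0
    # With peers gone, the platform's pull vanishes and its costs
    # (time sink, social comparison) dominate.
    return -10.0 if on_platform else 20.0

# Willingness to accept = utility of staying minus utility of leaving.
solo = utility(True, peers_on=True) - utility(False, peers_on=True)
together = utility(True, peers_on=False) - utility(False, peers_on=False)

print(f"Deactivating alone:    demands ${solo:.0f} in compensation")
print(f"Deactivating together: would pay ${-together:.0f} to make it happen")
```

Run as written, the sketch yields a positive willingness-to-accept for solo deactivation and a negative one for collective deactivation, which is the signature of a collective-action trap: each individual is deterred from acting even though everyone prefers the coordinated outcome.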

Cigarettes trapped individual smokers with a biological addiction. Social media has trapped an entire generation in a collective-action problem. Early app developers deliberately and knowingly exploited the psychological weaknesses and insecurities of young people to pressure them to consume a product that, upon reflection, many wish they could use less, or not at all.

8. Four Norms to Break Four Traps

Young people and their parents are stuck in at least four collective-action traps. Each is hard to escape for an individual family, but escape becomes much easier if families, schools, and communities coordinate and act together. Here are four norms that would roll back the phone-based childhood. I believe that any community that adopts all four will see substantial improvements in youth mental health within two years.

No smartphones before high school 

The trap here is that each child thinks they need a smartphone because “everyone else” has one, and many parents give in because they don’t want their child to feel excluded. But if no one else had a smartphone—or even if, say, only half of the child’s sixth-grade class had one—parents would feel more comfortable providing a basic flip phone (or no phone at all). Delaying round-the-clock internet access until ninth grade (around age 14) as a national or community norm would help to protect adolescents during the very vulnerable first few years of puberty. According to a 2022 British study, these are the years when social-media use is most correlated with poor mental health. Family policies about tablets, laptops, and video-game consoles should be aligned with smartphone restrictions to prevent overuse of other screen activities.

No social media before 16

The trap here, as with smartphones, is that each adolescent feels a strong need to open accounts on TikTok, Instagram, Snapchat, and other platforms primarily because that’s where most of their peers are posting and gossiping. But if the majority of adolescents were not on these accounts until they were 16, families and adolescents could more easily resist the pressure to sign up. The delay would not mean that kids younger than 16 could never watch videos on TikTok or YouTube—only that they could not open accounts, give away their data, post their own content, and let algorithms get to know them and their preferences.

Phone-free schools

Most schools claim that they ban phones, but this usually just means that students aren’t supposed to take their phone out of their pocket during class. Research shows that most students do use their phones during class time. They also use them during lunchtime, free periods, and breaks between classes––times when students could and should be interacting with their classmates face-to-face. The only way to get students’ minds off their phones during the school day is to require all students to put their phones (and other devices that can send or receive texts) into a phone locker or locked pouch at the start of the day. Schools that have gone phone-free always seem to report that it has improved the culture, making students more attentive in class and more interactive with one another. Published studies back them up.

More independence, free play, and responsibility in the real world

Many parents are afraid to give their children the level of independence and responsibility they themselves enjoyed when they were young, even though rates of homicide, drunk driving, and other physical threats to children are way down in recent decades. Part of the fear comes from the fact that parents look at each other to determine what is normal and therefore safe, and they see few examples of families acting as if a 9-year-old can be trusted to walk to a store without a chaperone. But if many parents started sending their children out to play or run errands, then the norms of what is safe and accepted would change quickly. So would ideas about what constitutes “good parenting.” And if more parents trusted their children with more responsibility––for example, by asking their kids to do more to help out, or to care for others––then the pervasive sense of uselessness now found in surveys of high-school students might begin to dissipate.

It would be a mistake to overlook this fourth norm. If parents don’t replace screen time with real-world experiences involving friends and independent activity, then banning devices will feel like deprivation, not the opening up of a world of opportunities.

The main reason the phone-based childhood is so harmful is that it pushes aside everything else. Smartphones are experience blockers. Our ultimate goal should not be to remove screens entirely, nor should it be to return childhood to exactly the way it was in 1960. Rather, it should be to create a version of childhood and adolescence that keeps young people anchored in the real world while flourishing in the digital age.

9. What Are We Waiting For?

An essential function of government is to solve collective-action problems. Congress could solve or help solve the ones I’ve highlighted—for instance, by raising the age of “internet adulthood” to 16 and requiring tech companies to keep underage children off their sites.

In recent decades, however, Congress has not been good at addressing public concerns when the solutions would displease a powerful and deep-pocketed industry. Governors and state legislators have been much more effective, and their successes might let us evaluate how well various reforms work. But the bottom line is that to change norms, we’re going to need to do most of the work ourselves, in neighborhood groups, schools, and other communities.

There are now hundreds of organizations––most of them started by mothers who saw what smartphones had done to their children––that are working to roll back the phone-based childhood or promote a more independent, real-world childhood. (I have assembled a list of many of them.) One that I co-founded, at LetGrow.org, suggests a variety of simple programs for parents or schools, such as play club (schools keep the playground open at least one day a week before or after school, and kids sign up for phone-free, mixed-age, unstructured play as a regular weekly activity) and the Let Grow Experience (a series of homework assignments in which students––with their parents’ consent––choose something to do on their own that they’ve never done before, such as walk the dog, climb a tree, walk to a store, or cook dinner).

Even without the help of organizations, parents could break their families out of collective-action traps if they coordinated with the parents of their children’s friends. Together they could create common smartphone rules and organize unsupervised play sessions or encourage hangouts at a home, park, or shopping mall.


Parents are fed up with what childhood has become. Many are tired of having daily arguments about technologies that were designed to grab hold of their children’s attention and not let go. But the phone-based childhood is not inevitable.

The four norms I have proposed cost almost nothing to implement, they cause no clear harm to anyone, and while they could be supported by new legislation, they can be instilled even without it. We can begin implementing all of them right away, this year, especially in communities with good cooperation between schools and parents. A single memo from a principal asking parents to delay smartphones and social media, in support of the school’s effort to improve mental health by going phone free, would catalyze collective action and reset the community’s norms.

We didn’t know what we were doing in the early 2010s. Now we do. It’s time to end the phone-based childhood.


This article is adapted from Jonathan Haidt’s forthcoming book, The Anxious Generation: How the Great Rewiring of Childhood Is Causing an Epidemic of Mental Illness.

Banzo: depression and suicide among the enslaved were commonplace (Aventuras na História)

aventurasnahistoria.uol.com.br

Renato Pinto Venâncio, September 13, 2020

This is how the old slave ships operated – Getty Images

Little discussed in books: the enslaved grew despondent, stopped speaking, and, above all, stopped eating

“Yesterday the black man Dionysio, slave of D. Olimpya Theodora de Souza, who lives in the same house, was found hanged with a baraço [a cord of linen thread] inside a trapdoor at the house at rua da Alfândega, no. 376, upper floor. The unfortunate man, doubtless wishing to hasten his death, had made small cuts on his arm with a pair of scissors…”

This shocking notice, published in the Jornal do Commercio in Rio de Janeiro on June 22, 1872, reveals a little-known facet of slavery: slaves took their own lives. And the rate of “voluntary deaths” among them was two to three times higher than among free men.

Slave suicides also differed in other respects. The most notable was that the act was attributed to banzo. The meaning of that word is still debated today. The most widely accepted view is that it has a remote African origin, equivalent to “to think” or “to brood.” For a long time, the term also designated an illness.

In 1799, for example, Luiz António de Oliveira Mendes presented to the Royal Academy of Sciences of Lisbon a study of “the acute and chronic diseases that most frequently afflict blacks newly taken from Africa.” Banzo was among them.

The symptoms? The enslaved grew despondent, stopped speaking and, above all, stopped eating, even when offered, the physician notes, “the best foods, both of our own manner and custom and those of their country…,” dying shortly thereafter.

In the 19th century, with the development of the first psychological theories, the behavior of slaves suffering from banzo came to be recognized as a mental disorder. In 1844, Joaquim Manoel de Macedo, in a medical thesis titled Considerações Sobre a Nostalgia, stated: “[…] we are convinced that the frightening mortality observed among us in the Africans, especially the newly arrived, as well as the number of suicides counted among them, owes a good part of its debt to nostalgia […]”

Little by little, the association between nostalgia and banzo became popular. In Joaquim de Macedo Soares’s 1875 Dicionário Brasileiro da Língua Portuguesa, one finds the following definition: “banzar: to be pensive over some matter; sad without knowing why; to suffer from the Englishmen’s spleen; sadness and apathy at once; to suffer from nostalgia, like the blacks of the Coast when they came here, and even after being here.”

Today the word “nostalgia,” widespread in literature, is a synonym for “saudade,” a feeling. Thinking of it as a disease is a very different matter. That label, like banzo itself, probably covered a vast range of psychological or psychiatric problems, from depression to schizophrenia, or conditions brought on by malnutrition and contagious diseases.

There is no shortage of examples linking suicide and mental illness. The aforementioned Jornal do Commercio records voluntary deaths associated with delirium: “Valentim, slave of Faria & Miranda, established at rua dos Lázaros no. 26, had for days suffered from a violent fever and was being treated by Dr. Antonio Rodrigues de Oliveira. The day before yesterday [May 20, 1872], at 9 o’clock at night, apparently during a more severe fit, Valentim wounded himself with a blow to the neck.”

At other times, madness was explicitly acknowledged: “Yesterday [March 8, 1872], at 1 o’clock in the afternoon, the African black woman Justina, aged 50, slave of Narciso da Silva Galharno, killed herself by hanging. The 2nd Police Deputy took note of the fact and ordered a forensic examination. It is reported that the woman suffered from mental alienation.”

Like all testimonies from the past, the texts above must be read with a critical eye: a recorded suicide may conceal a murder committed by a master. That fact does not diminish banzo as one of the tragic expressions of the madness common to the millions of people victimized by the slave trade.

On the other hand, the publicity given to this suffering in the newspapers must have contributed to the formation of abolitionist sentiment in imperial society. For this reason, banzo can be understood as an unintentional form of political protest, an early example of nonviolent struggle.


**Professor of History and co-author of the book Ancestrais: Uma Introdução à História da África Atlântica (2003).

++ The Coluna section does not necessarily represent the opinion of the Aventuras na História site.

Should the Japanese give nuclear power another chance? (Science Daily)

Date: October 23, 2014

Source: ResearchSEA

Summary: On September 9, 2014, the Japan Times reported an increasing number of suicides coming from the survivors of the March 2011 disaster. In Minami Soma Hospital, which is located 23 km away from the power plant, the number of patients experiencing stress has also increased since the disaster. What’s more, many of the survivors are now jobless and therefore facing an uncertain future.



This is not the first time that nuclear power has victimized the Japanese people. In 1945, atomic bombs exploded in Hiroshima and Nagasaki, creating massive fears about nuclear power among the Japanese population. It took 20 years for the public to erase the trauma of these events. It was then, in the mid-1960s, that the Fukushima Daiichi Nuclear Power Plant was built.

According to Tetsuo Sawada, an assistant professor in the Laboratory of Nuclear Reactors at Tokyo University, it took a lot of effort to assure people that nuclear power was safe and beneficial. The first step was legal: in 1955, the Japanese government passed a law decreeing that nuclear power could be used only for peaceful purposes.

“But that law was not enough to persuade people to accept the establishment of nuclear power,” said Prof. Sawada.

He explained that the economy plays an important role in public acceptance of nuclear power. Through the establishment of nuclear power plants, more jobs were created, which boosted the economy of the Fukushima region at that time.

“Before the Fukushima disaster, we could find many pro-nuclear people in the area of nuclear power plants since it gave them money,” said Prof. Sawada.

Now, more than forty years later, the public’s former confidence has given way to fear of nuclear power and distrust of the government.

According to a study conducted by Noriko Iwai from the Japanese General Social Survey Research Center, the Fukushima nuclear accident has heightened people’s perception of disaster risks, fears of nuclear accident, and recognition of pollution, and has changed public opinion on nuclear energy policy.

“Distance from nuclear plants and the perception of earthquake risk interactively correlate with opinions on nuclear issues: among people whose evaluation of earthquake risk is low, those who live nearer to the plants are more likely to object to the abolishment of nuclear plants,” said Iwai.

This finding is in line with the perception of Sokyu Genyu, chief priest of Fukujuji temple in Miharu Town, Fukushima Prefecture. As a member of the Reconstruction Design Council in Response to the Great East Japan Earthquake, he argued that both the Fukushima Daiichi and Daini nuclear power plants should be shut down, in keeping with the objections of 80% of Fukushima residents.

However, the Japanese government, local scientists, and international authorities have declared Fukushima safe: radiation levels are below 1 mSv/y, a figure that, according to them, is nothing to worry about. But the public does not believe the numbers.

Genyu does not say that these numbers are scientifically false. Rather, he argues that the problem lies in the realm of social psychology: despite the announcements of low radiation levels, the Japanese people remain afraid of radiation.

“It is reasonable for local residents in Fukushima to speak out very emotionally. Within three months of the disaster, six people had committed suicide. They were homeless and jobless,” said Genyu.

It is heartbreaking to know that victims of the Fukushima Daiichi nuclear accident died not from radiation but from depression. Besides the rising number of suicides, the number of patients suffering from cerebrovascular disease (strokes) has also risen: at Minami-Soma Hospital, the number of stroke patients more than doubled after the disaster.

Local doctors and scientists are now actively educating students in Fukushima, trying to convince them that the radiation will not harm their health.

Dr. Masaharu Tsubokura, a practicing doctor at Minami-Soma Hospital, has been telling students that Fukushima is safe. Sadly, their responses are mostly negative and apathetic.

“I think the Fukushima disaster is not about nuclear radiation but is rather a matter of public trust in the technology,” said Dr. Tsubokura.

Dr. Tsubokura has given dosimeters, devices that measure radiation exposure, to children living in Minami-Soma city. But even this was not enough to dispel people’s fears.

In 2012, Ryogo Hayano, a physics professor at the University of Tokyo, joined Dr. Tsubokura at Minami-Soma Hospital and developed BABYSCAN, a whole-body scanner designed to measure radiation in small children and to allay the fears of Fukushima parents.

“BABYSCAN is unnecessary but necessary. It is unnecessary because we know that the radiation is low. But it is necessary to assure parents that their children are going to be okay,” said Prof. Hayano.

After witnessing the fears of the Fukushima people, Prof. Hayano thinks that nuclear power is no longer appropriate for Japan. He believes that the government should shut down nuclear power plants.

“As a scientist, I know that nuclear power is safe and cheap. But looking at the public’s fear in Fukushima, I think it should be phased out,” said Prof. Hayano.

But does the government take the public into account when it comes to politics?

Only three years have passed since the disaster, and Prime Minister Shinzo Abe is already keen to revive the country’s nuclear power plants. The operations of more than 50 nuclear power plants in Japan have been suspended since the Daiichi meltdown.

Last month, Japan’s Nuclear Regulation Authority approved the reopening of a power plant in Sendai for 2015.

Suicide rate among indigenous people in Mato Grosso do Sul is the highest in 28 years (Combate Racismo Ambiental)

23/05/2014 14:06

By Carolina Fasolo, from Brasília (DF), for Cimi

On April 3, as dawn broke over a Guarani-Kaiowá village in the south of the state of Mato Grosso do Sul, a mother of three opened her front door and froze at the sight of the frail body of her youngest girl, hanging from a sheet tied to a tree with a knot that looked firm. The day before, the girl had turned 13.

“Her mother said she came home from school very sad, complaining that her head hurt,” recounts Otoniel, a Guarani-Kaiowá leader. “After everyone had gone to sleep, she tied the sheet to the tree and killed herself. A 12-year-old cousin of hers had hanged himself a week earlier. And a few days after she died, another adolescent, 16 years old, also killed himself in the same village. I went there to find out what was happening.”

The three hangings in less than two weeks are part of a statistic that took on historic dimensions in 2013: 73 suicides were recorded among the indigenous people of Mato Grosso do Sul. According to the records of the Conselho Indigenista Missionário (Cimi), it is the highest number in 28 years. The data, compiled by the Distrito Sanitário Especial Indígena (DSEI/MS), appear in the Report on Violence Against the Indigenous Peoples in Brazil, to be released by Cimi in June.

Of the 73 indigenous people who died, 72 were Guarani-Kaiowá, most of them between 15 and 30 years old. Otoniel believes that so many young people kill themselves because they see no prospects. “They have no future, no respect, no work, and no land to plant and live on. They choose to die because, in truth, they are already dead inside.”

Federal prosecutor Marco Antônio Delfino de Almeida, of the Ministério Público Federal (MPF) in Dourados (MS), explains that job opportunities for indigenous people are practically restricted to degrading menial work, such as cutting sugarcane. “We have indigenous schools, but the educational model was not built for the community; there is only an ‘indigenous shell,’ which does nothing to bring young people into the productive economy,” he adds.

“Discrimination and ethnic hatred, conduct encouraged even by the media, greatly exacerbate the problem of the suicides. Indigenous people are portrayed as hindrances, impediments, obstacles to development. It is as if the media were sending the message ‘If you want to do well, get the Indian out of your way,’” the prosecutor stresses.

13 years, 684 suicides

From 1986 to 1997, 244 suicide deaths were recorded among the Guarani-Kaiowá of Mato Grosso do Sul, a number that practically tripled in the last decade: from 2000 to 2013 there were 684 cases.

“The current living conditions of these indigenous people, which spill over into staggering statistics of violence, have their origin in a historical process,” explains Marco Antonio Delfino. “What happened was a brutal transfer, by the federal government, of indigenous territories to non-Indians.”

The transfer was carried out mainly through the then Serviço de Proteção ao Índio (SPI, the Indian Protection Service), which between 1915 and 1928 demarcated eight small reservations in the south of the state, to which different indigenous peoples were forced to migrate. “The demarcated reservations served as a gigantic depot of labor to be used according to economic interests. The whole process of indigenous confinement had as its purpose their use as labor for the agricultural projects implanted in the country, from yerba mate cultivation up to, more recently, sugarcane,” the prosecutor adds.

Compulsory confinement, superimposing distinct communities with their own political and religious dynamics, sharpened conflict within the reservations and profoundly altered the indigenous peoples’ forms of social, economic, and cultural organization, resulting in alarming rates of overcrowding, poverty, and violence in these spaces.

Described by Brazil’s deputy prosecutor general, Deborah Duprat, as “the greatest known tragedy in indigenous affairs anywhere in the world,” the Reserva Indígena de Dourados is one of the starkest examples of this historical process. Wedged inside the municipality’s urban perimeter, the Reserve is now home to more than 13,000 indigenous people on 3,600 hectares of land. It has the highest population density of any traditional community in the country, and it is where 18 of the state’s 73 suicides occurred in 2013.

“Today we face an extremely acute shortage of public policies. Since 2009 there have been discussions about establishing an Indigenous Psychosocial Care Center in Dourados, but so far no concrete step has been taken toward building it,” says Marco Antonio Delfino. “The impression one gets is that people have lost control of the monster they created, which is these reservations. So it all becomes a game of passing the buck, always with palliative solutions. We need to acknowledge and repair the mistakes that were made before there can be effective solutions. The first step is to demarcate the territories usurped from the indigenous people,” the prosecutor concludes.

Suicide Risk Linked to Rates of Gun Ownership, Political Conservatism (Science Daily)

Apr. 4, 2013 — Residents of states with the highest rates of gun ownership and political conservatism are at greater risk of suicide than those in states with less gun ownership and less politically conservative leanings, according to a study by University of California, Riverside sociology professor Augustine J. Kposowa.

UCR study links risk of suicide with rate of gun ownership and political conservatism at the state level. (Credit: Image courtesy of University of California, Riverside)

The study, “Association of suicide rates, gun ownership, conservatism and individual suicide risk,” was published online in the journal Social Psychiatry & Psychiatric Epidemiology in February.

Suicide was the 11th leading cause of death for all ages in the United States in 2007, the most recent year for which complete mortality data was available at the time of the study. It was the seventh leading cause of death for males and the 15th leading cause of death for females. Firearms are the most commonly used method of suicide by males and poisoning the most common among females.

Kposowa, who has studied suicide and its causes for two decades, analyzed mortality data from the U.S. Multiple Cause of Death Files for 2000 through 2004 and combined individual-level data with state-level information. Firearm ownership, conservatism (measured by percentage voting for former President George W. Bush in the 2000 election), suicide rate, church adherence, and the immigration rate were measured at the state level. He analyzed data relating to 131,636 individual suicides, which were then compared to deaths from natural causes (excluding homicides and accidents).

“Many studies show that of all suicide methods, firearms have the highest case fatality, implying that an individual who selects this technique has a very low chance of survival,” Kposowa said. Guns are simply the most efficient method of suicide, he added.

With few exceptions, states with the highest rates of gun ownership — for example, Alaska, Montana, Wyoming, Idaho, Alabama, and West Virginia — also tended to have the highest suicide rates. These states were also carried overwhelmingly by George Bush in the 2000 presidential election.

The study also found that:

  • The odds of committing suicide were 2.9 times higher among men than women
  • Non-Hispanic whites were nearly four times as likely to kill themselves as Non-Hispanic African Americans
  • The odds of suicide among Hispanics were 2.3 times higher than the odds among Non-Hispanic African Americans
  • Divorced and separated individuals were 38 percent more likely to kill themselves than those who were married
  • A higher percentage of church-goers at the state level reduced individual suicide risk.

“Church adherence may promote church attendance, which exposes an individual to religious beliefs, for example, about an afterlife. Suicide is proscribed in the three monotheistic religions: Judaism, Christianity and Islam,” Kposowa noted in explaining the finding that church membership at the state level reduces individual risk of suicide. “In states with a higher percentage of the population that belong to a church, it is plausible that religious views and doctrine about suicide are well-known through sacred texts, theology or sermons, and adherents may be less likely to commit suicide.”

Kposowa is the first to use a nationally representative sample to examine the effect of firearm availability on the odds of suicide. Previous studies associating firearm availability with suicide were limited to one or two counties. His study also demonstrates that individual behavior is influenced not only by personal characteristics but also by social-structural, or contextual, attributes: what happens at the state level can influence the personal actions of those living within that state.

The sociologist said that although policies seriously regulating firearm ownership would reduce individual suicides, such policies are likely to fail, not because they would not work, but because many Americans remain opposed to meaningful gun control, arguing that they have a constitutional right to bear arms.

“Even modest efforts to reform gun laws are typically met with vehement opposition. There are also millions of Americans who continue to believe that keeping a gun at home protects them against intruders, even though research shows that when a gun is used in the home, it is often against household members in the commission of homicides or suicides,” Kposowa said.

“Adding to the widespread misinformation about guns is that powerful pro-gun lobby groups, especially the National Rifle Association, seem to have a stranglehold on legislators and U.S. policy, and a politician who calls for gun control may be targeted for removal from office in a future election by a gun lobby,” he added.

Although total suicide rates in the U.S. are not much higher than in other Western countries, Kposowa warned that without changes in gun-ownership policies “the United States is poised to remain a very armed and potentially dangerous nation for its inhabitants for years to come.”

Journal Reference:

  1. Augustine J. Kposowa. Association of suicide rates, gun ownership, conservatism and individual suicide risk. Social Psychiatry and Psychiatric Epidemiology, 2013; DOI: 10.1007/s00127-013-0664-4

Bureaucracy and invisible violence (Canal Ibase)

Renzo Taddei – Columnist for Canal Ibase

August 2, 2012

Last week's cover story in Time magazine calls attention to striking figures on suicide among U.S. military personnel. Since 2004, more American service members have died by suicide than have been killed in combat in Afghanistan. On average, one active-duty American soldier takes his own life every day. Among veterans, a suicide occurs every 80 minutes. Between 2004 and 2008, the suicide rate among military personnel grew by 80%; in 2012 alone it has already grown 18%. Suicide has overtaken traffic accidents as the leading cause of death of military personnel outside combat situations.

Photo: Matthew C. Moeller (Flickr)

The American Army, understandably worried, is trying to identify the causes of the problem, so far without success. The problem is far from obvious, however. A third of the soldiers who killed themselves never went to Afghanistan or Iraq; 43% deployed only once; only 8.5% deployed three times or more. And most of them were married. In other words, not all of the suicides can be traced to battlefield trauma.

As one would expect, the military bureaucracy looks for a bureaucratic diagnosis, so that the solution can be bureaucratic and there is no need to dig very deep into the question. The American Army does not have enough psychiatrists and social-service professionals. Many soldiers kill themselves during the long wait for a psychiatric appointment; others do so after being prescribed sleeping pills and officially diagnosed as “not a danger to themselves or others.” Military culture stigmatizes displays of weakness, so many avoid seeking help in time. Widows accuse the Army of negligence; military officers say the soldiers kill themselves over marital problems.

While I was reflecting on the subject, a book came my way: Days of Destruction, Days of Revolt, by the American journalist Chris Hedges. The book describes the situation of some of the poorest cities in the United States and concludes that their poverty has no connection to the idea of underdevelopment but rather to what might be called counter-development: these are cities destroyed by capitalist exploitation.

One of those cities, Camden, in the state of New Jersey, is an old acquaintance of mine: during my doctorate in the United States I worked as a photographer to supplement my income, and I visited Camden several times. The explicit signs of the place’s decay always struck me: people living in ruined buildings, public facilities falling apart, drug dealing in broad daylight. Now I learn that it is nothing less than the city with the lowest per capita income in the country.

Chris Hedges calls such cities capitalism’s sacrifice zones. That is, so that capitalist exploitation can proceed unimpeded, capital moves from one place to another as soon as resources or opportunities run out, leaving behind ghost towns, unemployment, and depression. The logic of this pattern of exploitation has been well known at least since Marx. What Hedges does, with the help of the graphic artist and fellow journalist Joe Sacco, is give new visibility to a problem that the official bureaucracy and the media make a point of not seeing.

What is the relationship between the military suicides and urban poverty in the United States? I realized that there is, in fact, a fundamental analogy between the two cases: in both, for the system to function (and we are talking about a different system in each case), someone has to be sacrificed, and that sacrifice and its sacrificial victims must remain invisible to the majority of the population. The United States’ effort to maintain its military hegemony systematically produces the deaths of an immense number of people, among Americans and their supposed enemies alike. And so that profitability stays high, forests, cities, and jobs are destroyed, also systematically. One of the expressions used in the social sciences to describe this state of affairs is structural violence.

The invisibility of these things is indispensable: only then can well-intentioned people of good faith take part in the perverse system without seeing its perversity. That is why, for example, the administration of George H. W. Bush arranged a pact with the American press not to publish photos of the coffins of soldiers killed in combat in the first Gulf War. The pact remained in force for almost twenty years, until it was undone by Obama in 2009.

But the most common, and most effective, way of producing the forms of structural violence that invisibly reproduce inequality is bureaucracy. This is so, as David Graeber reminds us, because it is the function of bureaucracy to ignore the minutiae of everyday life and reduce everything to mechanical formulas and statistics. That allows us to focus our energies on a smaller number of variables and thereby accomplish grand and incredible things, for good and for ill.

The role bureaucracy plays in producing the invisibility that keeps structural violence running can be illustrated by the use of statistics in public policy. One of today's most important official programs supporting the rural population of the Brazilian Northeast, Garantia Safra, under which small farmers buy insurance and are compensated if their harvest fails, systematically excludes farmers through bureaucratic myopia. For the farmers of a municipality to receive compensation, the program's rules require that 50% of the harvest of the entire municipality be lost. Yet one need only look at the size and contours of Brazilian municipalities to conclude that there is no necessary relationship between municipal boundaries and meteorological phenomena. Some municipalities are so vast that they contain dramatic climatic variation within their borders. In such cases it is common for many farmers with heavy losses to receive no compensation at all, if other parts of the municipality lost less. Why must the municipality be the unit of reference here? Because there is a municipal bureaucratic apparatus to manage the program, and there are no official bureaucratic levels at any smaller scale. In other words, the system is stupid even if no one in it is, and the farmers suffer the consequences.

Correspondingly, national and state indices of unemployment, GDP growth, and GDP per capita are the central units of reference of today's public policies, even though they are averages that take no account of the extreme situations where socioeconomic vulnerability actually exists. It is as if the saying that a rope always breaks at its weakest point were systematically ignored. The vulnerability of any system (a machine, for example) is defined by its most fragile component. Any engineer knows this; in fact, the idea is so obvious that everyone knows it. This is where bureaucracy comes in: in this context, it matters little what people know or do not know, because they will be unable to identify how bureaucracy produces inconsistencies and structural violence unless they are directly affected. In this way, cities like Camden stay systematically off the radar, camouflaged by state- or national-level statistics.

All of this is related to another news item from last week: Brazil's position in the UN debates on regulating the global arms trade. Despite evidence that weapons manufactured in Brazil were, and continue to be, sold to governments with histories of human-rights violations, Brazil set itself squarely against regulation and against creating mechanisms that would make this market transparent. The justification, as could hardly be otherwise, is bureaucratic: disseminating information about military capability “could expose the resources and the capacity of countries [...] to sustain a prolonged conflict.” Putting this forward as an argument that takes precedence over the need to protect human rights is a scandal. Behind this flimsy excuse lies the intention to protect Brazil's lucrative arms industry. What makes the whole story harder to stomach is the fact that Dilma was herself a victim of torture during the period when Brazil was run by the military bureaucracy. How can the same president who created the Comissão da Verdade (Truth Commission) be complicit with an industry and a market stained with blood?

This episode shows that, in ethical terms, there is less difference between the United States and Brazil than Brazilians like to believe. To protect capitalism, no longer on a field of ideological struggle, as in the Cold War era, but in the form of the real, specific private interests of American companies, the United States has become a danger not only to vulnerable non-aligned nations but to itself, as the suicide epidemic among its military personnel reveals. In the same way, and for the same reasons, that is, on its march toward consolidation as an imperialist power, Brazil worries about its political dead while strategically pretending not to see that, to fatten its GDP and keep its arms industry prosperous, an immense number of lives, in Africa, in the Middle East, in southern Pará, and on the hillsides of Rio, are being sacrificed.

Renzo Taddei is a professor at the School of Communication of the Universidade Federal do Rio de Janeiro. He holds a doctorate in anthropology from Columbia University in New York. His work focuses on the social studies of science and technology.

The War on Suicide? (Time)

Monday, July 23, 2012

By NANCY GIBBS; MARK THOMPSON

Leslie McCaddon sensed that the enemy had returned when she overheard her husband on the phone with their 8-year-old daughter. “Do me a favor,” he told the little girl. “Give your mommy a hug and tell her that I love her.”

She knew for certain when she got his message a few minutes later. “This is the hardest e-mail I’ve ever written,” Dr. Michael McCaddon wrote. “Please always tell my children how much I love them, and most importantly, never, ever let them find out how I died … I love you. Mike”

She grabbed a phone, sounded every alarm, but by the time his co-workers found his body hanging in the hospital call room, it was too late.

Leslie knew her husband, an Army doctor, had battled depression for years. For Rebecca Morrison, the news came more suddenly. The wife of an AH-64 Apache helicopter pilot, she was just beginning to reckon with her husband Ian’s stress and strain. Rebecca urged Ian to see the flight surgeon, call the Pentagon’s crisis hotline. He did–and waited on the line for more than 45 minutes. His final text to his wife: “STILL on hold.” Rebecca found him that night in their bedroom. He had shot himself in the neck.

Grand Prairie, TX. Rebecca Morrison with some of her husband Ian’s belongings in her parents’ home. Ian, an AH-64 Apache helicopter pilot in the U.S. Army, committed suicide on March 21, 2012. Ian chose ‘Ike’ for Rebecca. Peter van Agtmael/Magnum for TIME.

Both Army captains died on March 21, a continent apart. The next day, and the next day, and the next, more soldiers would die by their own hand, one every day on average, about as many as are dying on the battlefield. These are active-duty personnel, still under the military’s control and protection. Among all veterans, a suicide occurs every 80 minutes, round the clock.

Have suicides spiked because of the strain of fighting two wars? Morrison flew 70 missions in Iraq over nine months but never engaged the enemy directly. McCaddon was an ob-gyn resident at an Army hospital in Hawaii who had never been to Iraq or Afghanistan. Do the pride and protocols of a warrior culture keep service members from seeking therapy? In the three days before he died, Morrison went looking for help six times, all in vain. When Leslie McCaddon alerted commanders about her husband’s anguish, it was dismissed as the result of a lovers’ quarrel; she, not the Army, was the problem.

This is the ultimate asymmetrical war, and the Pentagon is losing. “This issue–suicides–is perhaps the most frustrating challenge that I’ve come across since becoming Secretary of Defense,” Leon Panetta said June 22. The U.S. military seldom meets an enemy it cannot target, cannot crush, cannot put a fence around or drive a tank across. But it has not been able to defeat or contain the epidemic of suicides among its troops, even as the wars wind down and the evidence mounts that the problem has become dire. While veterans account for about 10% of all U.S. adults, they account for 20% of U.S. suicides. Well trained, highly disciplined, bonded to their comrades, soldiers used to be less likely than civilians to kill themselves–but not anymore.

More U.S. military personnel have died by suicide since the war in Afghanistan began than have died fighting there. The rate jumped 80% from 2004 to 2008, and while it leveled off in 2010 and 2011, it has soared 18% this year. Suicide has passed road accidents as the leading noncombat cause of death among U.S. troops. While it’s hard to come by historical data on military suicides–the Army has been keeping suicide statistics only since the early 1980s–there’s no denying that the current numbers constitute a crisis.

The specific triggers for suicide are unique to each service member. The stresses layered on by war–the frequent deployments, the often brutal choices, the loss of comrades, the family separation–play a role. So do battle injuries, especially traumatic brain injury and posttraumatic stress disorder (PTSD). And the constant presence of pain and death can lessen one’s fear of them.

But combat trauma alone can’t account for the trend. Nearly a third of the suicides from 2005 to 2010 were among troops who had never deployed; 43% had deployed only once. Only 8.5% had deployed three or four times. Enlisted service members are more likely to kill themselves than officers, and 18-to-24-year-olds more likely than older troops. Two-thirds do it by gunshot; 1 in 5 hangs himself. And it’s almost always him: nearly 95% of cases are male. A majority are married.

No program, outreach or initiative has worked against the surge in Army suicides, and no one knows why nothing works. The Pentagon allocates about $2 billion–nearly 4% of its $53 billion annual medical bill–to mental health. That simply isn’t enough money, says Peter Chiarelli, who recently retired as the Army’s second in command. And those who seek help are often treated too briefly.

Army officials declined to discuss specific cases. But Kim Ruocco directs suicide-prevention programs at the nonprofit Tragedy Assistance Program for Survivors, or TAPS. She knows what Leslie McCaddon and Rebecca Morrison have endured; her husband, Marine Major John Ruocco, an AH-1 Cobra helicopter-gunship pilot, hanged himself in 2005. These were highly valued, well-educated officers with families, with futures, with few visible wounds or scars; whatever one imagines might be driving the military suicide rate, it defies easy explanation. “I was with them within hours of the deaths,” Ruocco says of the two new Army widows. “I experienced it through their eyes.” Their stories, she says, are true. And they are telling them now, they say, because someone has to start asking the right questions.

The Bomb Grunt

Michael McCaddon was an Army brat born into a uniquely edgy corner of the service: his father served in an ordnance-disposal unit, and after his parents divorced, his mother married another bomb-squad member. McCaddon entered the family business, enlisting at 17. “When I joined the Army I was 5’10” and weighed 129 lbs,” he blogged years later. “I had a great body … for a girl.” But basic training made him stronger and tougher; he pushed to get the top scores on physical-fitness tests; he took up skydiving, snorkeling, hiking. If you plan to specialize in a field in which a single mistake can cost you and your comrades their lives, it helps to have high standards. “Ever since I was new to the Army, I made it my personal goal to do as well as I can,” he recalled. “I thought of it as kind of a representation of my being, my honor, who I was.”

The Army trained him to take apart bombs. He and his team were among the first on the scene of the 1995 Oklahoma City bombing, combing the ruins for any other devices, and he traveled occasionally to help the Secret Service protect then First Lady Hillary Clinton. He met Leslie in 1994 during a break in her college psychology studies. They started dating, sometimes across continents–he did two tours in Bosnia. During a Stateside break in January 2001, he married Leslie in Rancho Santa Fe, Calif. They had three children in four years, and McCaddon, by then an active-duty officer, moved with his family to Vilseck, Germany, where he helped run an Army dental office.

He was still ambitious–two of Leslie’s pregnancies had been difficult, so he decided to apply to the military’s medical school and specialize in obstetrics. But then, while he was back in Washington for his interview, came a living nightmare: his oldest son, who was 3, was diagnosed with leukemia. Just before entering med school, McCaddon prepared for his son’s chemotherapy by shaving his head in solidarity so the little boy wouldn’t feel so strange. McCaddon may not have been a warrior, but he was a fighter. “I became known as a hard-charger,” he wrote. “I was given difficult tasks, and moved through the ranks quickly.” He pushed people who didn’t give 100%; he pushed himself.

The Apache Pilot

Ian Morrison was born at Camp Lejeune in North Carolina, son of a Marine. An honor student at Thomas McKean High School in Wilmington, Del., he sang in the chorus, ran cross-country and was a co-captain of the swimming team before heading to West Point. He had a wicked sense of humor and a sweet soul; he met Rebecca on a Christian singles website in 2006 and spent three months charming her over the phone. One night he gave her his credit-card information. “Buy me a ticket, because I’m going to come see you,” he told her before flying to Houston. “The minute I picked him up,” she recalls, “we later said we both knew it was the real deal.” He proposed at West Point when she flew in for his graduation.

Morrison spent the next two years at Fort Rucker in Alabama, learning to fly the two-seat, 165-m.p.h. Apache helicopter, the Army’s most lethal aircraft. He and his roommate, fellow West Pointer Sean McBride, divided their time among training, Walmart, church, Seinfeld and video games, fueled by macaroni and cheese with chopped-up hot dogs. Morrison and Rebecca were married two days after Christmas 2008 near Dallas. The Army assigned him to an aviation unit at Fort Hood, so they bought a three-bedroom house on an acre of land just outside the town of Copperas Cove, Texas. They supported six African children through World Vision and were planning to have some kids of their own. “We had named our kids,” Rebecca says.

Morrison was surprised when the Army ordered him to Iraq on short notice late in 2010. Like all young Army officers, he saluted and began packing.

Triggers and Traps

One theory of suicide holds that people who feel useful, who feel as if they belong and serve a larger cause, are less likely to kill themselves. That would explain why active-duty troops historically had lower suicide rates than civilians. But now experts who study the patterns wonder whether prolonged service during wartime may weaken that protective function.

Service members who have bonded with their units, sharing important duties, can have trouble once they are at a post back home, away from the routines and rituals that arise in a close-knit company. The isolation often increases once troops leave active duty or National Guardsmen and reservists return to their parallel lives. The military frequently cites relationship issues as a predecessor to suicides; that irritates survivors to no end. “I’m not as quick to blame the Army as the Army is to blame me,” Leslie McCaddon says. “The message I get from the Army is that our marital problems caused Mike to kill himself. But they never ask why there were marriage problems to begin with.”

As McCaddon made his way through med school in Maryland, he encountered ghosts from his past. He was reaching the age at which his biological father had died by suicide, which statistically increased his own risk. But he wasn’t scared by it, Leslie says; he told associates about it. What did bother him was that he was gaining weight, the physical-training tests were getting harder for him, and the course work was challenging to juggle with a young family. He hid the strain, “but inside it is killing me,” he blogged. He called Leslie a hero “for not kicking me out of the house on the several times I’ve given her reason.” And he told her he sometimes thought of suicide.

“But he would tell everyone else that he was fine,” Leslie says. “He was afraid they’d kick him out of medical school if he was really honest about how depressed he was.” McCaddon sought counseling from a retired Army psychiatrist and seemed to be turning a corner in May 2010, when he graduated and got his first choice for a residency, at Tripler Army Medical Center in Honolulu.

“He loved being a soldier,” Leslie said, “and he was going to do everything he could to protect that relationship.”

Leslie had relationships to protect as well. He was increasingly hard on her at home; he was also hard on the kids and on himself. “He was always an amazing father–he loved his children–but he started lashing out at them,” Leslie recalls. “He wasn’t getting enough sleep, and he was under a lot of stress.” Leslie began exploring options but very, very carefully; she had a bomb-disposal problem as well. “When I was reaching out for help, people were saying, Be careful how you phrase this, because it could affect your husband’s career,” she says. “That was terrifying to me. It made me think that by advocating for him I’d be making things worse.”

The Pilot’s Pain

Captain Morrison headed to Iraq in early 2011. Once there, he and Rebecca Skyped nearly every day between his flight assignments. When he took R&R leave in early September, they visited family in Dallas, then San Antonio, and caught concerts by Def Leppard and Heart.

There were no signs of trouble. “He was so mentally stable–he worked out every day, we ate good food, and we always had good communication,” his wife says. “Most people would say he was kind of quiet, but with me he was loud and obnoxious and open.”

Morrison never engaged the enemy in direct combat; still, some 70 missions over Iraq took their toll. His base was routinely mortared. After one mission, he and several other pilots were walking back to their hangar when a rocket shot right past them and almost hit him; he and his comrades ran and dived into a bunker, he told Rebecca once he was safely home. He impressed his commander–“Excellent performance!” his superior raved in a formal review of the man his buddies called Captain Brad Pitt. “Unlimited potential … continue to place in position of greater responsibility.”

It was not the war that turned out to be hard; it was the peace. Morrison returned to Fort Hood late last year and spent his month off with Rebecca riding their horses, attending church and working out. He seemed unnerved by slack time at home. “He said it was really easy to fall into a routine in Iraq–they got up at the exact same time, they ate, they worked out, they flew forever and then they came back, and he’d talk to me, and then they did it all over again,” Rebecca says. “When he came back to Texas, it was really difficult for him to adjust.”

Morrison was due to be reassigned, so he and his wife needed to sell their house, but it just sat on the market. His anxiety grew; he was restless, unable to sleep, and they thought he might be suffering from PTSD. The couple agreed that he should see a doctor. Military wives, especially those studying mental health, have heard the stories, know the risks, learn the questions: Is their spouse drinking more, driving recklessly, withdrawing from friends, feeling trapped? Be direct, they are told. “I looked him right in the face and asked, ‘Do you feel like you want to hurt or kill yourself?'” Rebecca recalls. “He looked me right in the face and said, ‘Absolutely not–no way–I don’t feel like that at all. All I want to do is figure out how to stop this anxiety.'”

The Stigma

When troops return from deployment, they are required to do self-assessments of their experience: Did they see people killed during their tour? Did they feel they had been at risk of dying? Were they interested in getting counseling for stress or alcohol use or other issues? But a 2008 study found that when soldiers answer questions anonymously, they are two to four times as likely to report depression or suicidal thoughts. Independent investigations have turned up reports of soldiers being told by commanders to airbrush their answers or else risk their careers. A report by the Center for a New American Security cited commanders who refuse to grant a military burial after a suicide for fear that doing so would “endorse or glamorize” it.

The U.S. Department of Veterans Affairs (VA) and all the services have launched resiliency-training programs and emergency hotlines, offering slogans like “Never leave a Marine behind” and “Never let your buddy fight alone” that try to speak the language of the unit. Last year the Pentagon released a video game meant to allow soldiers to explore the causes and symptoms of PTSD from the privacy of their homes. “We want people to feel like they are encouraged to get help,” says Jackie Garrick, who runs the new Defense Suicide Prevention Office. “There are a myriad of ways you can access help and support if you need it.”

But faith in that commitment was shaken this year when Army Major General Dana Pittard, commander of the 1st Armored Division at Fort Bliss, Texas, complained on his official blog that he was “personally fed up” with “absolutely selfish” troops who kill themselves, leaving him and others to “clean up their mess. Be an adult, act like an adult, and deal with your real-life problems like the rest of us,” he continued. He later said he wanted to “retract” what he called his “hurtful statement,” but he didn’t apologize for what he said. Many soldiers and family members believe Pittard’s attitude is salted throughout the U.S. military.

Just a Lovers’ Quarrel

In August 2010, Leslie went to McCaddon’s commanding officer at the hospital. She didn’t tell Michael. “It was the scariest thing I’ve ever done,” she says. She recalls sitting in the commander’s office, haltingly laying out her concerns–McCaddon’s history of depression, his struggle to meet his high standards while doing right by his family. She was hoping that maybe the commander would order him into counseling and defuse the stigma somehow: he’d just be following orders. She watched the officer, a female colonel, detonate before her eyes. “No one at the medical school told me he had a history of depression, of being suicidal,” Leslie recalls her shouting. “I have a right to know this. He’s one of my residents. Why didn’t anyone tell me?” The commander was furious–not at Leslie, exactly, but at finding herself not in command of the facts.

The colonel called several colleagues into the room and then summoned McCaddon as well. Leslie registered the shock and fear on his face when he saw his wife sitting with his bosses. “I was shaking,” she says. “I told him I continued to be concerned that his depression was affecting our family and that I was really concerned for his safety but also for the well-being of our children and myself.”

The commander encouraged McCaddon to get help but wouldn’t order him to do it. He left the room, livid, and Leslie burst into tears. “Honey, don’t worry,” Leslie remembers the commander saying. “My first marriage was a wreck too.”

Can’t you make him get some help? Leslie pleaded again, but the colonel pushed back. McCaddon was doing fine at work, with no signs of a problem. “‘Leslie, I know this is going to be hard to hear, but this just doesn’t sound like an Army issue to me,'” McCaddon’s wife recalls the colonel saying. “‘It sounds like a family issue to me.'” Leslie felt her blood run cold. “No one was going to believe me so long as things were going fine at work.”

McCaddon did try to see an Army psychiatrist, but a month or more could pass without his finding the time. “I’d say, ‘He’s in the Army,'” Leslie recalls telling the doctor, “‘and you make him do everything else, so you should be able to make him go to mental-health counseling.'” But McCaddon was not about to detour from rounds to lie on the couch. He barely ate while on his shift. “Everybody here is under stress,” he stormed at Leslie. “I can’t just walk out for an hour a week–I’m not going to leave them when we’re already short-staffed.”

The marriage was cracking. Back in Massachusetts, Leslie’s mother was not well. Leslie and the kids moved home so she could take care of her. She and Michael talked about divorce.

The Waiting Room

Early on Monday, March 19, Ian Morrison showed up at a Fort Hood health clinic, where he sat waiting in his uniform, with his aviation badge, for three hours. Finally someone saw him. “‘I’m sorry you had to wait all this time,'” Rebecca says he was told. “‘But we can’t see you. We can’t prescribe you anything.'” He had to see the doctor assigned to his unit. When Morrison arrived at the flight surgeon’s office, he told Rebecca, the doctor was upset that Morrison hadn’t shown up at the regular daily sick call a couple of hours earlier.

“He told me this guy was so dismissive and rude to him. ‘You need to follow procedure. You should have been here hours ago,'” Rebecca says. “Ian wanted to tell the doctor he was anxious, depressed and couldn’t sleep, but this guy shut him down.” Morrison acknowledged only his sleeplessness, leading the doctor to give him 10 sleeping pills with orders to return the next week. He’d be grounded for the time being.

But that didn’t seem to affect his mood. Morrison toasted his wife’s success on a big exam that day–she was close to earning her master’s in psychology–by cooking a steak dinner and drawing a bubble bath for her that night. “He was dancing around and playing music and celebrating for me,” she remembers. “He seemed really hopeful.” He took a pill before bed but told Rebecca in the morning that he hadn’t slept.

On Tuesday, March 20, Morrison tried to enroll in an Army sleep study but was told he couldn’t join for a month. “Well, I’ll just keep taking Ambien and then go see the flight surgeon,” he told the woman involved with the study. She asked if he felt like hurting himself. “No, ma’am, you don’t have to worry about me at all,” he said. “I would never do that.” That day, Morrison typed an entry in his journal: “These are the things I know that I can’t change: whether or not the house sells, the state of the economy, and the world … these are things that I know to be true: I’m going to be alive tomorrow, I will continue to breathe and get through this, and God is sovereign over my life.”

Rebecca awoke the next morning to find her husband doing yoga. “I’m self-medicating,” he told her. She knew what that meant. “You couldn’t sleep again, huh?” Rebecca asked.

“No,” Morrison said. “I’m going back to the doctor today.” Given the lack of success with the medication, she told him that was probably a good idea. She left the house, heading for the elementary school on post where she taught second grade.

A System Overwhelmed

The Army reported in January that there was no way to tell how well its suicide-prevention programs were working, but it estimated that without such interventions, the number of suicides could have been four times as high. Since 2009, the Pentagon’s ranks of mental-health professionals have grown by 35%, nearing 10,000. But there is a national shortage of such personnel, which means the Army is competing with the VA and other services–not to mention the civilian world–to hire the people it needs. The Army has only 80% of the psychiatrists and 88% of the social workers and behavioral-health nurses recommended by the VA. Frequent moves from post to post mean that soldiers change therapists often, if they can find one, and mental-health records are not always transferred.

Military mental-health professionals complain that the Army seemed to have put its suicide-prevention efforts on the back burner after Chiarelli, a suicide fighter, left the service in January. “My husband did not want to die,” Rebecca says. “Ian tried to get help–six times in all … Think about all the guys who don’t even try to get help because of the stigma. Ian was so past the stigma, he didn’t care. He just wanted to be healthy.”

The Breaking Point

On March 15, McCaddon gave a medical presentation that got rave reviews. Then he called Massachusetts to speak to his children and sent Leslie that last e-mail. He regretted his failures as a husband, as a father. Don’t tell the children how I died, he begged her. “Know that I love you and my biggest regret in life will always be failing to cherish that, and instead forsaking it.” Leslie read the e-mail in horror. “In the back of my mind, I’m saying to myself, He’s at work–he’s safe,” she recalls. “It never occurred to me that he would do what he did at work.” But she immediately dialed the hospital’s delivery center. She had just received a suicide note from her husband, she told the doctor who answered, and they needed to find him immediately. The hospital staff fanned out.

“They’ve sent people to the roof, the basement, to your house. We’re looking everywhere,” a midwife told Leslie in a call minutes later. As they talked, Leslie suddenly heard people screaming and crying in the background. Then she heard them call a Code Blue. They had found him hanging from a noose in a call room. It had been less than 30 minutes since McCaddon had sent his final e-mail to his wife. Among the voices Leslie thought she recognized was that of McCaddon’s commander, whose words came rushing back. “Does it seem like a family issue to her now?” Leslie remembers thinking. “Because it looks like it happened on her watch.”

It took 15 minutes for the first responders to bring back a heartbeat. By then he had been without oxygen for too long. Leslie flew to Hawaii, and Captain McCaddon was taken off life support late Tuesday, March 20. He was pronounced dead early the next day.

That same day, Wednesday, March 21, Morrison saw a different Army doctor, who in a single 20-minute session diagnosed him with clinical depression. He got prescriptions for an antidepressant and a med to treat anxiety but hadn’t taken either when he called his wife. Rebecca encouraged him to stop by the resiliency center on post to see if he might get some mental-health counseling there. Just before noon, Morrison texted Rebecca, saying he was “Hopeful :)” about it. She wanted to know what they told him. “Will have to come back,” he responded. “Wait is about 2 hrs.” He needed to get back to his office.

Rebecca was still concerned. At about 4 p.m., she urged her husband to call a military hotline that boasted, “Immediate help 24/7–contact a consultant now.” He promised he would. “I said, ‘Perfect. Call them, and I’ll talk to you later,'” Rebecca says. “He was like, ‘O.K., bye.'”

That was the last time she ever talked to him. Their final communication was one more text about 45 minutes later. “STILL on hold,” he wrote to her. Rebecca responded moments later: “Can’t say you’re not trying.”

Morrison called Rebecca at 7:04 p.m., according to her cell phone, but she was leading a group-therapy session and missed it. He didn’t leave a message.

Two and a half hours later, she returned home from her grad-school counseling class. She threw her books down when she entered the living room and called his name. No answer. She saw his boots by the door; the mail was there, so she knew he had to be home. “I walked into our bedroom, and he was lying on the floor with his head on a pillow, on my side of the bed.” He was still in his uniform.

Rebecca stammers, talking softly and slowly through her sobs. “He had shot himself in the neck,” she says. “There was no note or anything. He was fully dressed, and I ran over to him and checked his pulse … and he had no pulse. I just ran out of the house screaming, ‘Call 911!’ and ran to the neighbors.”

The Next Mission

At a suicide-prevention conference in June, Panetta laid down a charge: “We’ve got to do everything we can to make sure that the system itself is working to help soldiers. Not to hide this issue, not to make the wrong judgments about this issue, but to face facts and deal with the problems up front and make sure that we provide the right diagnosis and that we follow up on that kind of diagnosis.”

But what makes preventing suicide so confounding is that even therapy often fails. “Over 50% of the soldiers who committed suicide in the four years that I was vice [chief] had seen a behavioral-health specialist,” recalls Chiarelli. “It was a common thing to hear about someone who had committed suicide who went in to see a behavioral-health specialist and was dead within 24, 48 or 72 hours–and to hear he had a diagnosis that said, ‘This individual is no danger to himself or anyone else.’ That’s when I realized that something’s the matter.”

There’s the horrific human cost, and there is a literal cost as well. The educations of McCaddon and Morrison cost taxpayers a sum approaching $2 million. “If the Army can’t be reached through the emotional side of it–that I lost my husband–well, they lost a $400,000 West Point education and God knows how much in flight school,” Rebecca says. (The Army says Morrison’s pilot training cost $700,000.) Adds Leslie: “They’d invested hundreds of thousands of dollars into this asset. At the very least, why didn’t they protect their asset?”

Captain McCaddon was buried with full military honors on April 3 in Gloucester, Mass. A pair of officers traveled from Hawaii for the service and presented his family with the Army Commendation Medal “for his selfless and excellent service.” Leslie and their three children also received the U.S. flag that had been draped over his casket and three spent shells fired by the honor guard. They visited his grave on Father’s Day to leave flowers, and each child left a card. After two years of chemotherapy, their oldest child’s leukemia remains in remission.

Captain Morrison was buried in central Texas on March 31. The Army had awarded him several decorations, including the Iraq Campaign Medal with Campaign Star. There were military honors graveside, and a bugler played taps. At his widow’s request, there was no rifle volley fired.