Justin McNaull grew up in a hurry. By the time he was 23, McNaull had graduated from college, married and gone to work for his local police force in Virginia. But McNaull, now 36, still bristles at the memory of something he wasn’t allowed to do at 23: go down to the airport counter and rent a car. “I’d been involved in police pursuits at more than 100 mph,” he says, “and yet they still wouldn’t rent me a car.”
To many young people, rental-car restrictions are more than an annoyance. They’re a confusing contradiction in what society expects of them. After all, states trust people to drive at a much younger age: Most states issue driver’s licenses to 16-year-olds. Yet nearly a decade must pass before those same drivers can earn the trust of Hertz or Avis.
By the time adolescents become adults, they are accustomed to such inconsistent treatment. Practically from puberty, young people are bombarded with mixed signals about the scope of their rights and the depth of their responsibilities. And most of those mixed signals come from the laws of state and local governments. In most respects, people are considered adults at 18. That’s when they can vote and enter into legal contracts—including the purchase, if not rental, of a car. But a 20-year-old Marine, just back from patrolling the streets of Baghdad, would have to turn 21 before he could join a local police force in most cities in the United States. A 20-year-old college junior, far more educated than the average American, cannot buy alcohol or enter a casino. In 10 states, a single 20-year-old cannot legally have sex with a 17-year-old. But in nearly every state, a 16-year-old can marry—if he has his parents’ permission. (A handful of states allow girls to marry before boys.)
The most glaring examples lie within the criminal justice system. A spike in juvenile violence two decades ago spurred state legislators to adopt the mantra “adult time for adult crimes.” Consequently, in most states, a 10-year-old charged with murder can be tried as an adult. Slightly older teens can be tried in adult courts for virtually every other crime. Even when states wait until 18 to treat criminals as adults, they don’t like to wait long. Until recently, inmates at youth detention facilities in New Mexico were woken up just one minute after midnight on their 18th birthdays, in order to be moved to adult prisons.
Recently, many of these lines drawn between adolescence and maturity have been called into question. For example, the presidents of 135 universities are campaigning to reopen the debate over the drinking age of 21. They note that binge drinking on campus is rampant despite the stricture, and argue that if students were given the right to drink at an earlier age, they might handle it more responsibly. Another argument is a reprise of the one that came up 40 years ago when servicemen came home from Vietnam. Then, the complaint was that soldiers were old enough to die but not to vote. (The 26th Amendment took care of that problem by lowering the voting age to 18.) Today, military personnel returning from Iraq and Afghanistan are left to question why they can fight America’s wars but still can’t patronize its bars.
Meanwhile, legislatures and courts are hearing a very different argument from a group of people who haven’t traditionally testified before them: neuroscientists. Using advanced brain-scanning technology, scientists are getting a better view than ever before of how the human brain develops. And what they’ve found is that in most people, the prefrontal cortex and its links to other regions of the brain are not fully formed until age 25—much later than anyone had realized. These areas are the seat of “executive decision making”—the parts of the brain that allow people to think through the likely consequences of an action, weigh the risks and benefits and stop themselves from acting on impulse. In other words, the stuff that makes you a mature person.
To state and local lawmakers and judges, the brain research can come as a revelation: Maybe the car-rental companies were right all along. What to do about this is another matter. In America, “adulthood” already has its familiar compass points, 18 and 21. But what is the age of responsibility? And what if that age—the point when citizens are responsible enough to earn all of the rights a democracy confers upon its people—bears no resemblance to the ages already enshrined in law? Finding the answers to those questions is a more complicated task than simply choosing a milestone birthday. “There’s been a growing recognition that most of our earlier law in how we treat adolescents and young adults was chaotic and not tied to any empirical rationale,” says Brian Wilcox, a psychologist at the University of Nebraska. “When many of these laws were established, there really wasn’t research on which they could be based.”
The age at which children are considered mature is rooted in a mix of culture, convenience and historical precedent. Aristotle wrote of 21 as the age when a person would have completed three 7-year stages of youth development. During the Middle Ages, legend has it that 21 was considered the age of adulthood because that’s when men were capable of wearing a full suit of armor. Arbitrary as such reasoning may sound to modern Americans, 21 stuck as a threshold age through the 19th century and into the 20th. Until they turned 21, young people owed their parents either their labor or their wages, whether that meant working on the family farm or operating a machine in an urban factory and handing over their pay.
But during the Progressive Era, reform efforts and adolescent research began to change notions about growing up. States, and eventually the federal government, enacted child-labor laws, keeping kids from working and ultimately making their attendance in high school compulsory. Such laws were opposed by business groups, which hated to let go of the cheap labor, and supported by unions, which didn’t like the cheaper competition.
Through the middle of the 20th century, the onset of adulthood seemed to come earlier and earlier. War was partly responsible for that, as 18-year-olds went off to fight in World War II, followed by the wars in Korea and Vietnam. On the home front, manufacturing jobs didn’t require a high-school diploma. It was thus common for 18-year-olds to support themselves and start their own families. And the rise of youth culture in the 1950s and 60s turned the teen years into their own distinctive stage of development—and consumer spending. There was a new sense that reaching the end of this life phase was a rite of passage in and of itself.
Nowadays, teens face more cultural pressure than ever to grow up fast, in certain ways. Recent controversies over whether 16-year-old pop star Miley Cyrus has sexualized her image are the latest symptom of that. Yet there’s a strong pull in exactly the opposite direction, too. Many more 18-year-olds are choosing college over work now than a generation or two ago. They live independently at school for part of the year but under their parents’ roofs for the rest. People are getting married later than they used to, and many have become slower about starting their own careers. Even before the current recession, plenty of college grads and dropouts had “boomeranged” back to Mom and Dad’s house. Sociologists now talk of “extended adolescence” and “delayed adulthood.”
That means that the window of time during which teens and young adults “grow up” is opening wider. This partly explains why state and local governments are so haphazard when it comes to young people: The law, and the people who write and interpret it, are just as befuddled about how to handle this situation as any anxious parent. Mostly, they have responded by cracking down. The number of laws passed each year regulating the behavior of people under 18 has more than tripled since the 1950s. Curfews are now common. Recently, states have banned minors from purchasing items such as nitrous-oxide inhalants and fruit-flavored mini-cigars. Various jurisdictions have restricted “sexting”—sending lewd photos via cell phones. And 20 states ban only those under 18 from talking on cell phones while driving, despite evidence that the behavior (even using a hands-free device) is treacherous among drivers of all ages.
So there is a bit of hypocrisy, too, in the way governments define the age of responsibility. While nearly every state recently has put new limits on teen drivers, no state has begun restricting—or even testing—elderly drivers, some of whom may, like teens, lack mastery of their vehicles. Franklin Zimring, a UC Berkeley law professor, suggests that it’s easier to block youngsters from obtaining rights than it is to take away rights to which adults have grown accustomed. That’s because states aren’t really denying young people rights, Zimring says. They’re asking them to wait.
As John McCardell sees it, the wait can be counterproductive. McCardell is the former president of Middlebury College in Vermont. He’s also the leader of the group of college presidents calling for a national debate about the drinking age. Technically, states hold the power to set their own drinking ages. But since the mid-1980s, Congress has all but required the age to be set at 21. If states were to set it any lower, they would forfeit 10 percent of their federal highway funds.
McCardell points to surveys showing that upwards of 90 percent of young people have had drinks or gotten drunk before turning 21. Those numbers only confirm what everyone knows—that binge drinking is out of control on college campuses. Of the current drinking age, McCardell says, “it’s pretty hard to argue on the most basic terms that it’s been at all successful, given the number who continue to consume.”
McCardell believes that the current laws not only are ineffective and unenforceable but are in fact leading students to drink more heavily in illicit and unsafe circumstances. The problem, he says, is that underage kids don’t actually consider themselves underage. McCardell believes this is a direct consequence of the mixed messages states send teenagers about responsibility. “We have a law that is out of step with social and cultural reality,” he says. “In the eyes of a culture and a polity that understands in the most general way that 18 is the age of adulthood, the most glaring exception is the prohibition on alcohol, and that is why we’ve had such a difficult time enforcing this law.”
A half-dozen states have taken McCardell up on the challenge of at least debating the idea of lowering the drinking age. But McCardell is the first to admit that none of them will ever pass legislation as long as a big chunk of their highway dollars is at risk. In fact, if there’s any trend among states, it’s to crack down further on drinking by those under age 21. States have created new keg-registration requirements, stepped up enforcement of carding at convenience stores and passed “social host” laws that impose liability on adults who serve alcohol to teens at parties.
Some supporters of holding the drinking age steady acknowledge that 21, when it comes right down to it, is an arbitrary age. Twenty-five might be better, if unrealistic. But they argue that enforcement is a problem at any age, and lowering the legal limit to 18 would only mean pushing the drinking problem further down to 16- and 17-year-olds. Alexander Wagenaar, a health policy professor at the University of Florida, goes further. He believes that lowering the drinking age would be disastrous. After states set the age at 21, he says, teen highway deaths immediately dropped by 15 to 20 percent. “The people who are advocating going down to 18,” says Wagenaar, “should acknowledge that they’re willing to risk an extra thousand deaths per year and double that number of injuries.”
The debate about drinking hinges on the question of whether the age of responsibility has been set too high. But in the juvenile justice world, a parallel debate has been going on about whether the age of responsibility has been set too low.
In the early 20th century, every state created stand-alone legal systems for handling juveniles, defined as those under 18. Advocates of that era described the state as “a sheltering wise parent” that would shield a child from the rigors of criminal law. By the 1980s, however, the idea that rehabilitating such offenders should be the main goal of the system had lost credibility. Due to a spike in juvenile homicides involving handguns—and concerns that young “superpredators” presented an extreme and growing danger to society—legislators passed countless laws that made it easier to try minors as adults. This was true not only for serious matters such as murder and drug crimes but also for minor infractions and misdemeanors. Some plea bargains are available to teens only if they agree to adult handling. Specific numbers are hard to come by, but on any given day, an estimated 10,000 minors are housed in adult facilities.
Now, states are just starting to rethink the wisdom of sending 13-year-olds to spend hard time among older, more experienced criminals. According to the federal Centers for Disease Control and Prevention, youths who had previously been tried as adults are 34 percent more likely to commit a crime again than those who went through the juvenile justice system. Not only do young offenders treated as adults reoffend sooner and more frequently, they’re also more likely to go on to commit violent crimes.
On this matter, states are finding, nothing is more persuasive than crime data. Despite all the media attention given years ago to superpredators, the vast majority of youth crimes involve property theft and drugs and seldom involve murder. And while there are still roughly 250,000 juveniles tried each year, the rate of crime for this cohort, as measured by arrests, has gone down in each of the past 15 years.
Tough policies toward juveniles remain prevalent, but a few states have begun loosening up. In 2005, Illinois ended its policy of automatically transferring juvenile misdemeanor cases to adult courts, leaving the decision up to judges. A follow-up study found a dramatic drop in the number of cases referred to adult court, suggesting that most of the old automatic transfers had not involved serious crimes.
As of January 1, Connecticut will end its policy of treating all offenders 16 and up as adults. A similar proposal in North Carolina stalled this summer. While the latest research and crime statistics have opened up room for a fresh debate about juvenile justice, that space could evaporate at any time. There’s no telling when a high-profile teen crime may catch the attention of cable news. “If we have another crime wave for whatever reason,” says Shay Bilchik, of the Center for Juvenile Justice Reform, “it will be very difficult to resist going back to lock ’em up.”
It’s precisely because policy toward teens can be so random and emotionally charged that some people find the discoveries about brain development reassuring. The brain scans are putting hard science behind what anyone who has raised an adolescent knows—that young people simply aren’t always capable of making good decisions.
Increasingly, this scientific evidence is being introduced in regard to juvenile justice. In 2005, the U.S. Supreme Court struck down the juvenile death penalty after receiving stacks of briefs summarizing the latest adolescent brain research. The justices will surely get an update on the science this fall when they hear a pair of cases from Florida meant to determine whether sentencing juveniles to life without parole constitutes cruel and unusual punishment. Scientists now regularly appear before legislative committees, showing pictures that make clear the developmental differences between a 16-year-old brain and that of a 25-year-old. The scans show, in the words of Temple University psychologist Laurence Steinberg, that juveniles may be “less guilty by reason of adolescence.”
But while brain research is “sexy,” Steinberg says, it hasn’t necessarily persuaded legislators that they need to change laws regarding crime and punishment. Nor has it fundamentally changed the way policy makers view the age of responsibility in terms of when young people can drink, smoke or drive. The conclusion that 25 might be the most scientifically defensible age for any of those things is simply a nonstarter politically. Texas state Representative Jerry Madden says he’s sympathetic to the argument that “the brain isn’t fully developed until 25, and that’s when people should be allowed to do certain things.” But he says he suggested to a brain scientist who once made that case to him that “she could carry that bill—I wasn’t going to.”
Even scientists are cautious about leaning too hard on the neurobiology. Research linking brain structure to actual human behavior is still limited. And neuroscientists are clear about the fact that different parts of the brain mature along different timetables. In other words, executive thinking may not reach its peak until 25, but most people are capable of performing many adult functions adequately at an earlier age—probably between 16 and 21. “We’re very early in the curve of finding out how the brain research should be interpreted,” says Ronald Dahl, a professor of pediatrics and psychiatry at the University of Pittsburgh.
The fact that every person is different and develops at his own pace doesn’t make the creation of policy any easier. Parents can guide their children, let them learn from their mistakes when they need to and bail them out when they have to. But laws are less sympathetic. Laws must draw lines, in order to be fair and comprehensible. And there will never be enough brain scans to go around to draw those lines as accurately as we might like.
What those laws can do, however, is acknowledge that growing up is a process, not a birthday. And in at least one major policy area—the driving age—states are finding ways to recognize this by introducing youngsters to increasing levels of responsibility, rather than foisting it upon them all at once.
The driving age is more rooted in practical experience than the arbitrary conventions that define the drinking age and most other adult responsibilities. Early in the 20th century, there essentially was no regulation. As soon as someone’s feet could reach the pedals, he or she was free to drive. Driving tests didn’t come into widespread practice until the 1940s. And until recently, many states, particularly in agricultural areas, gave licenses to kids who passed the test when they turned 14. South Dakota will still grant a driver’s license to a person as young as 14 years, 3 months. On the other end of the spectrum, New Jersey is the only state that makes teenagers wait as late as their 17th birthdays.
Of the rights and rites of adulthood, driving holds a special place. On one hand, in a country with meager access to public transit, being able to operate a car is tantamount to mobility. Learning to drive is as essential to taking a first job as it is to going out on a first date—or at least doing those things without being chauffeured around by parents. On the other hand, driving is by far the most likely way that a young person will kill himself or others. According to the CDC, 4,500 Americans between 16 and 19 die from motor vehicle crashes annually, while another 400,000 are injured seriously enough to require emergency treatment. Obviously, driving is a responsibility that must be given to young people with great care.
The new approach that has taken hold among the states is called “graduated driver licensing,” or GDL. The idea is to license kids to start driving at a certain age, but on a probationary basis. They might have to put in more hours driving with their parents or with professional instructors. They might not be allowed to drive at night. Or they might not be permitted to drive in the company of friends—peer pressure is often a factor when young drivers make bad decisions behind the wheel. GDLs have been implemented in some form in every state except North Dakota.
One reason GDLs have become popular with state lawmakers is that they represent the middle ground in a highly emotional debate. Following a horrific car crash in his district, Illinois state Representative John D’Amico introduced legislation to raise the driving age in his state from 16 to 18. But D’Amico, who is from Chicago, quickly found out that the rural roots of early driving run deep. “I couldn’t get Southern Illinois to agree to it,” he says. Instead, D’Amico proposed a GDL. The law that passed in 2007 tightened nighttime driving curfews for 16- and 17-year-olds and required new drivers to wait a full year before they can carry more than one non-relative.
The impact was immediate. In 2007 in Illinois, 155 teens between the ages of 16 and 19 died in automobile crashes. In 2008, that number fell to 92. Those results track with findings on GDLs nationally. According to a Johns Hopkins University study, states with strong GDL laws have cut accidents among young drivers by 40 percent, with injuries down 38 percent.
Would the roads be even safer if the driving age were 25? Probably. But the GDL approach at least recognizes that young drivers are at their most dangerous in their first six months on the road. GDLs give adolescents time to practice, with less risk to themselves and other drivers. Their brains may not always make the best judgments about how fast to drive at night or in the rain. But that’s somewhat compensated for by the experience they’re getting behind the wheel. “The science says that what you want to do with kids is what parents and grandparents know,” says Dahl. “If you give them freedom and they can handle it, then they get a little bit more.”
That’s what Justin McNaull thinks, too. Having complained about not being able to rent a car as a 23-year-old cop, McNaull now works for AAA, where he lobbies for restrictions on young drivers. The key, he says, is finding the right balance between safety and responsibility. “We could maximize safety by raising the driving age to 25, but that’s not practical,” he says. “We know we’re not going to push the driving age anywhere near 25 and none of us is trying to.”
Could a GDL-type approach work in other policy areas? McCardell, the former university president, believes it could. He favors subjecting young people to testing to receive drinking permits that could be revoked if they in some way abuse the privilege. The idea needs some work: How, exactly, would states go about designing a drinking test? But he has a point. A right such as drinking could be made more contingent on one’s ability to handle it responsibly and less a function of merely reaching a milestone age.
Robert Epstein, a psychologist and author of Teen 2.0, says states could learn something from the way they regulate pharmacists and masseurs. Just as those groups are licensed based on the competence requirements of their professions, adolescents could accrue rights based on somehow proving they’re up to the task. Teens would do pretty well under such a system, he argues. He’s just completed a study of more than 30,000 people showing that 30 percent of American teens are more competent than the median adult in a variety of areas. “If we’re trying to decide what rights and privileges to extend,” Epstein says, “we have to look at individual competence.”
Which brings us back to the problem of there not being enough brain scans to go around. States are never going to spend the time and money needed to test individuals on their ability to drink or understand legal contracts. Should government really decide when an individual is ready to have sex? And there’s certainly a long and sordid history that argues against the idea of testing people on their competence to vote. Franklin Zimring, the UC Berkeley law professor, suggests that the GDL approach may be uniquely suited to the particular skills and risks of driving.
It would be useful, however, for states to think more broadly when it comes to the age of responsibility. States have been acting in ever-more-punitive ways toward teens. Yet the point of laws regulating the behavior of young people should not be to restrict them. It’s to begin educating them in the ways of responsible adulthood. What’s important, after all, is not passing a test or meeting an arbitrary age requirement, but learning lessons and applying them to real life.