Abstract
This article proposes a framework for theory and research on risk-taking that is informed by developmental neuroscience. Two fundamental questions motivate this review. First, why does risk-taking increase between childhood and adolescence? Second, why does risk-taking decline between adolescence and adulthood? Risk-taking increases between childhood and adolescence as a result of changes around the time of puberty in the brain’s socio-emotional system leading to increased reward-seeking, especially in the presence of peers, fueled mainly by a dramatic remodeling of the brain’s dopaminergic system. Risk-taking declines between adolescence and adulthood because of changes in the brain’s cognitive control system – changes which improve individuals’ capacity for self-regulation. These changes occur across adolescence and young adulthood and are seen in structural and functional changes within the prefrontal cortex and its connections to other brain regions. The differing timetables of these changes make mid-adolescence a time of heightened vulnerability to risky and reckless behavior.
Introduction
Adolescent Risk-Taking as a Public Health Problem
It is widely agreed among experts in the study of adolescent health and development that the greatest threats to the well-being of young people in industrialized societies come from preventable and often self-inflicted causes, including automobile and other accidents (which together account for nearly half of all fatalities among American youth), violence, drug and alcohol use, and sexual risk-taking (Blum & Nelson-Mmari, 2004; Williams et al., 2002). Thus, while considerable progress has been made in the prevention and treatment of disease and chronic illness among this age group, similar gains have not been made with respect to reducing the morbidity and mortality that result from risky and reckless behavior (Hein, 1988). Although rates of certain types of adolescent risk-taking, such as driving under the influence of alcohol or having unprotected sex, have dropped, the prevalence of risky behavior among teenagers remains high, and there has been no decline in adolescents’ risk behavior in several years (Centers for Disease Control and Prevention, 2006).

It is also the case that adolescents engage in more risky behavior than adults, although the magnitude of age differences in risk-taking varies as a function of the specific risk in question and the age of the “adolescents” and “adults” used as comparison groups; rates of risk-taking are high among 18- to 21-year-olds, for instance, some of whom may be classified as adolescents and some as adults. Nonetheless, as a general rule, adolescents and young adults are more likely than adults over 25 to binge drink, smoke cigarettes, have casual sex partners, engage in violent and other criminal behavior, and have fatal or serious automobile accidents, the majority of which are caused by risky driving or driving under the influence of alcohol. Because many forms of risk behavior initiated in adolescence elevate the risk for the behavior in adulthood (e.g., drug use), and because some forms of risk-taking by adolescents put individuals of other ages at risk (e.g., reckless driving, criminal behavior), public health experts agree that reducing the rate of risk-taking by young people would make a substantial improvement in the overall well-being of the population (Steinberg, 2004).
False Leads in the Prevention and Study of Adolescent Risk-Taking
The primary approach to reducing adolescent risk-taking has been through educational programs, most of them school-based. There is reason to be highly skeptical about the effectiveness of this effort, however. According to AddHealth data (Bearman, Jones, & Udry, 1997), virtually all American adolescents have received some form of educational intervention designed to reduce smoking, drinking, drug use, and unprotected sex, but the most recent report of findings from the Youth Risk Behavior Survey, conducted by the Centers for Disease Control and Prevention, indicates that more than one-third of high school students did not use a condom either the first time or even the last time they had sexual intercourse, and that during the year prior to the survey, nearly 30% of adolescents rode in a car driven by someone who had been drinking, more than 25% reported multiple episodes of binge drinking, and nearly 25% were regular cigarette smokers (Centers for Disease Control and Prevention, 2006).

Although it is true, of course, that the situation might be even worse were it not for these educational efforts, most systematic research on health education indicates that even the best programs are far more successful at changing individuals’ knowledge than at altering their behavior (Steinberg, 2004, 2007). Indeed, well over a billion dollars each year are spent educating adolescents about the dangers of smoking, drinking, drug use, unprotected sex, and reckless driving – all with surprisingly little impact. Most taxpayers would be surprised – perhaps shocked – to learn that vast expenditures of public dollars are invested in health, sex, and driver education programs that either do not work, such as D.A.R.E. (Ennett, Tobler, Ringwalt, & Flewelling, 1994), abstinence education (Trenholm, Devaney, Fortson, Quay, Wheeler, & Clark, 2007), or driver training (National Research Council, 2007), or are at best of unproven or unstudied effectiveness (Steinberg, 2007).

The high rate of risky behavior among adolescents relative to adults, despite massive, ongoing, and costly efforts to educate teenagers about its potentially harmful consequences, has been the focus of much theorizing and empirical research by developmental scientists for at least 25 years. Most of this work has been informative, but in an unexpected way. In general, where investigators have looked to find differences between adolescents and adults that would explain the more frequent risky behavior of youth, they have come up empty-handed. Among the widely-held beliefs about adolescent risk-taking that have not been supported empirically are (a) that adolescents are irrational or deficient in their information processing, or that they reason about risk in fundamentally different ways than adults; (b) that adolescents do not perceive risks where adults do, or are more likely to believe that they are invulnerable; and (c) that adolescents are less risk-averse than adults.
None of these assertions is correct: The logical reasoning and basic information-processing abilities of 16-year-olds are comparable to those of adults; adolescents are no worse than adults at perceiving risk or estimating their vulnerability to it (and, like adults, overestimate the dangerousness associated with various risky behaviors); and increasing the salience of the risks associated with making a poor or potentially dangerous decision has comparable effects on adolescents and adults (Millstein & Halpern-Felsher, 2002; Reyna & Farley, 2006; Steinberg & Cauffman, 1996; see also Rivers, Reyna, & Mills, this issue). Indeed, most studies find few, if any, age differences in individuals’ evaluations of the risks inherent in a wide range of dangerous behaviors (e.g., driving while drunk, having unprotected sex), in their judgments about the seriousness of the consequences that might result from risky behavior, or in the ways that they evaluate the relative costs and benefits of these activities (Beyth-Marom et al., 1993). In sum, adolescents’ greater involvement than adults in risk-taking does not stem from ignorance, irrationality, delusions of invulnerability, or faulty calculations (Reyna & Farley, 2006).

The fact that adolescents are knowledgeable, logical, reality-based, and accurate in the ways in which they think about risky activity – or, at least, as knowledgeable, logical, reality-based, and accurate as their elders – but engage in higher rates of risky behavior than adults raises important considerations for both scientists and practitioners. For the former, this observation pushes us to think differently about the factors that may contribute to age differences in risky behavior and to ask what it is that changes between adolescence and adulthood that might account for these differences. For the latter, it helps explain why educational interventions have been so limited in their success, suggests that providing adolescents with information and decision-making skills may be a misguided strategy, and argues that we need a new approach to public health interventions aimed at reducing adolescent risk-taking if it is adolescents’ actual behavior that we wish to change. These sets of scientific and practical considerations form the basis for this article. In it, I argue that the factors that lead adolescents to engage in risky activity are social and emotional, not cognitive; that the field’s emerging understanding of brain development in adolescence suggests that immaturity in these realms may have a strong maturational and perhaps unalterable basis; and that efforts to prevent or minimize adolescent risk-taking should therefore focus on changing the context in which risky activity takes place rather than mainly attempting, as current practice does, to change what adolescents know and the ways they think.
Advances in the Developmental Neuroscience of Adolescence
The last decade has been one of enormous and sustained interest in patterns of brain development during adolescence and young adulthood. Enabled by the growing accessibility and declining cost of structural and functional Magnetic Resonance Imaging (MRI) and other imaging techniques, such as Diffusion Tensor Imaging (DTI), an expanding network of scientists has begun to map out the course of changes in brain structure between childhood and adulthood, describe age differences in brain activity during this period of development, and, to a more modest degree, link findings on the changing morphology and functioning of the brain to age differences in behavior. Although it is wise to heed the cautions of those who have raised concerns about “brain overclaim” (Morse, 2006), there is no doubt that our understanding of the neural underpinnings of adolescent psychological development is shaping – and reshaping – the ways in which developmental scientists think about normative (Steinberg, 2005) and atypical (Steinberg, Dahl, Keating, Kupfer, Masten, & Pine, 2006) development in adolescence. It is important to point out that our knowledge of changes in brain structure and function during adolescence far exceeds our understanding of the actual links between these neurobiological changes and adolescent behavior, and that much of what is written about the neural underpinnings of adolescent behavior – including a fair amount of this article – is what we might characterize as “reasonable speculation.” Frequently, contemporaneous processes of adolescent neural and behavioral development – for example, the synaptic pruning that occurs in the prefrontal cortex during adolescence and improvements in long-term planning – are presented as causally linked without hard data that even correlate these developments, much less demonstrate that the former (brain) influences the latter (behavior), rather than the reverse. It is therefore wise to be cautious about simple accounts of adolescent emotion, cognition, and behavior that attribute changes in these phenomena directly to changes in brain structure or function. Readers of a certain age are reminded of the many premature claims about hormone-behavior relationships in adolescence that appeared in the developmental literature in the mid-1980s, soon after techniques for performing salivary assays became widespread and relatively inexpensive, much as brain imaging techniques have in the last decade. Alas, the search for direct hormone-behavior linkages proved more difficult and less fertile than many scientists had hoped (Buchanan, Eccles, & Becker, 1992), and there are few effects of hormones on adolescent behavior that are not conditioned on the environment in which the behavior occurs; even something as hormonally driven as libido only affects sexual behavior in the right context (Smith, Udry, & Morris, 1985). There is no reason to expect that brain-behavior relationships will be any less complicated. There is, after all, a long history of failed attempts to explain everything adolescent as biologically determined, dating back not only to Hall (1904), but to early philosophical treatises on the period (Lerner & Steinberg, 2004).
These caveats notwithstanding, the current state of our knowledge about adolescent brain development (both structural and functional) and possible brain-behavior links during this period, although incomplete, is nonetheless sufficient to offer some insight into “emerging directions” in the study of adolescent risk-taking. The aim of this article is to provide a review of the most important discoveries in our understanding of adolescent brain development relevant to the study of adolescent risk-taking and to sketch out a rudimentary framework for theory and research on risk-taking that is informed by developmental neuroscience. Before proceeding, a few words about this point of view are in order. Any behavioral phenomenon can be studied at multiple levels. The development of risk-taking in adolescence, for example, can be approached from a psychological perspective (focusing on increases in emotional reactivity that may underlie risky decision-making), a contextual perspective (focusing on interpersonal processes that influence risky behavior), or a biological perspective (focusing on the endocrinology, neurobiology, or genetics of sensation-seeking). All of these levels of analysis are potentially informative, and most scholars of adolescent psychopathology agree that the study of psychological disorder has profited from cross-fertilization among these various approaches (Cicchetti & Dawson, 2002).

My emphasis on the neurobiology of adolescent risk-taking in this review is not intended to downplay the importance of studying the psychological or contextual aspects of the phenomenon, any more than studying changes in neuroendocrine functioning in adolescence that might increase vulnerability to depression (e.g., Walker, Sabuwalla, & Huot, 2004) would obviate the need to study the psychological or contextual contributors to, manifestations of, or treatment of the illness. Nor does my focus on the neurobiology of adolescent risk-taking reflect a belief in the primacy of biological explanation over other forms of explanation, or a subscription to a naïve form of biological reductionism. At some level, of course, every aspect of adolescent behavior has a biological basis; what matters is whether understanding the biological basis helps us understand the psychological phenomenon. My point, though, is that any psychological theory of adolescent risk-taking needs to be consistent with what we know about neurobiological functioning during this time period (just as any neurobiological theory ought to be consistent with what we know about psychological functioning), and that most extant psychological theories of adolescent risk-taking, in my view, do not map well onto what we know about adolescent brain development. To the extent that these theories are inconsistent with what we know about brain development, they are likely to be wrong; and so long as they continue to inform the design of preventive interventions, those interventions are unlikely to be effective.
A Tale of Two Brain Systems
Two fundamental questions about the development of risk-taking in adolescence motivate this review. First, why does risk-taking increase between childhood and adolescence? Second, why does risk-taking decline between adolescence and adulthood? I believe that developmental neuroscience provides clues that may lead us toward an answer to both questions. In brief, risk-taking increases between childhood and adolescence as a result of changes around the time of puberty in what I refer to as the brain’s socio-emotional system that lead to increased reward-seeking, especially in the presence of peers. Risk-taking declines between adolescence and adulthood because of changes in what I refer to as the brain’s cognitive control system – changes which improve individuals’ capacity for self-regulation and which occur gradually over the course of adolescence and young adulthood. The differing timetables of these changes – the increase in reward-seeking, which occurs early and is relatively abrupt, and the increase in self-regulatory competence, which occurs gradually and is not complete until the mid-20s – make mid-adolescence a time of heightened vulnerability to risky and reckless behavior.
Why Does Risk-Taking Increase Between Childhood and Adolescence?
In my view, the increase in risk-taking between childhood and adolescence is due primarily to increases in sensation-seeking that are linked to changes in patterns of dopaminergic activity around the time of puberty. Interestingly, however, as I shall explain, although this increase in sensation-seeking is coincident with puberty, it is not entirely caused by the increase in gonadal hormones that takes place at this time, as is widely assumed. Nonetheless, there is some evidence that the increase in sensation-seeking that takes place in adolescence is correlated more with pubertal maturation than with chronological age (Martin, Kelly, Rayens, Brogli, Brenzel, Smith, et al., 2002), which argues against accounts of adolescent risk-taking that are solely cognitive, given that there is no evidence linking changes in thinking in adolescence to pubertal maturation.
Remodeling of the Dopaminergic System at Puberty
Important developmental changes in the dopaminergic system take place at puberty (Chambers et al., 2003; Spear, 2000). Given the critical role of dopaminergic activity in affective and motivational regulation, these changes likely shape the course of socioemotional development in adolescence, because the processing of social and emotional information relies on the networks underlying coding for affective and motivational processes. Key nodes of these networks comprise the amygdala, nucleus accumbens, orbitofrontal cortex, medial prefrontal cortex, and superior temporal sulcus (Nelson et al., 2005). These regions have been implicated in diverse aspects of social processing, including the recognition of socially relevant stimuli (e.g., faces, Hoffman & Haxby, 2000; biological motion, Heberlein et al., 2004), social judgments (appraisal of others, Ochsner et al., 2002; judging attractiveness, Aharon et al., 2001; evaluating race, Phelps et al., 2000; assessing others’ intentions, Gallagher, 2000; Baron-Cohen et al., 1999), social reasoning (Rilling et al., 2002), and many other aspects of social processing (for a review, see Adolphs, 2003). Importantly, among adolescents the regions that are activated during exposure to social stimuli overlap considerably with regions also shown to be sensitive to variations in reward magnitude, such as the ventral striatum and medial prefrontal areas (cf. Galvan et al., 2005; Knutson et al., 2000; May et al., 2004). Indeed, a recent study of adolescents engaged in a task in which peer acceptance and rejection were experimentally manipulated (Nelson et al., 2007) revealed greater activation when subjects were exposed to peer acceptance, relative to rejection, within brain regions implicated in reward salience (i.e., the ventral tegmental area, extended amygdala, and ventral pallidum). Because these same regions have been implicated in many studies of reward-related affect (cf. Berridge, 2003; Ikemoto & Wise, 2004; Waraczynski, 2006), these findings suggest that, at least in adolescence, social acceptance by peers may be processed in ways similar to other sorts of rewards, including nonsocial rewards (Nelson et al., 2007). As I explain later, this overlap between the neural circuits that mediate social information processing and reward processing helps to explain why so much adolescent risk-taking occurs in the context of the peer group.

The remodeling of the dopaminergic system within the socio-emotional network involves an initial post-natal rise and then, starting at around 9 or 10 years of age, a subsequent reduction of dopamine receptor density in the striatum and prefrontal cortex, a transformation that is much more pronounced among males than females (at least in rodents) (Sisk & Foster, 2004; Sisk & Zehr, 2005; Teicher, Andersen, & Hostetter, Jr., 1995). Importantly, however, the extent and timing of increases and decreases in dopamine receptors differ between these cortical and subcortical regions; there is some speculation that it is changes in the relative density of dopamine receptors in these two areas that underlie changes in reward processing in adolescence. As a result of this remodeling, dopaminergic activity in the prefrontal cortex increases significantly in early adolescence and is higher during this period than before or after.
Because dopamine plays a critical role in the brain’s reward circuitry, the increase, reduction, and redistribution of dopamine receptor concentration around puberty, especially in projections from the limbic system to the prefrontal area, may have important implications for sensation-seeking. Several hypotheses concerning the implications of these changes in neural activity have been offered. One hypothesis is that the temporary imbalance of dopamine receptors in the prefrontal cortex relative to the striatum creates a “reward deficiency syndrome,” producing behavior among young adolescents that is not unlike that seen among individuals with certain types of functional dopamine deficits. Individuals with this syndrome have been postulated to “actively seek out not only addicting drugs but also environmental novelty and sensation as a type of behavioral remediation of reward deficiency” (Gardner, 1999, cited in Spear, 2002, p. 82). If a similar process takes place at puberty, we would expect to see increases in reward salience (the degree to which adolescents are attentive to rewards and sensitive to variations in rewards) and in reward-seeking (the extent to which they pursue rewards). As Spear writes:
[A]dolescents may generally attain less positive impact from stimuli with moderate to low incentive value, and may pursue new appetitive reinforcers through increases in risk taking/novelty seeking and via engaging in deviant behaviors such as drug taking. The suggestion is thus that adolescents display a mini-‘reward deficiency syndrome’ which is similar, albeit typically transient and of lesser intensity, to that hypothesized to be associated in adults with [dopamine] hypofunctioning in reward circuitry…. Indeed, adolescents appear to show some signs of attaining less appetitive value from a variety of stimuli relative to individuals at other ages, perhaps leading them to seek additional appetitive reinforcers via pursuit of new social interactions and engagement in risk taking or novelty seeking behaviors. Such adolescent-typical features may have been adaptive evolutionarily in helping adolescents to disperse from the natal unit and to negotiate with success the developmental transition from dependence to independence. In the human adolescent, these propensities may be expressed, however, in alcohol and drug use, as well as a variety of other problem behaviors (2000, pp. 446–447).
The notion that adolescents suffer from a “reward deficiency syndrome,” although intuitively appealing, is undermined by several studies that indicate elevated activity in subcortical regions, especially the accumbens, in response to reward during adolescence (Ernst et al., 2005; Galvan et al., 2006). An alternative account is that the increase in sensation-seeking in adolescence is due not to functional dopamine deficits but to a temporary loss of “buffering capacity” associated with the disappearance of dopamine autoreceptors in the prefrontal cortex that serve a regulatory negative-feedback function during childhood (Dumont et al., 2004, cited in Ernst & Spear, in press). This loss of buffering capacity, by diminishing inhibitory control of dopamine release, would produce relatively higher levels of circulating dopamine in prefrontal regions in response to comparable degrees of reward during adolescence than during childhood or adulthood. Thus, the increase in sensation-seeking seen during adolescence would be due not, as has been speculated, to a decline in the “rewardingness” of rewarding stimuli that drives individuals to seek higher and higher levels of reward (as would be predicted if adolescents suffered from a “reward deficiency syndrome”), but to an increase in the sensitivity and efficiency of the dopaminergic system, which, in theory, would cause potentially rewarding stimuli to be experienced as more rewarding and thereby heighten reward salience. This account is consistent with the observation of increased dopaminergic innervation in the prefrontal cortex during adolescence (Rosenberg & Lewis, 1995), despite a reduction in dopamine receptor density.
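The contrast between the two accounts can be made concrete with a toy computation. The sketch below (in Python) encodes only the qualitative claim in the preceding paragraph: dopamine release is driven by a reward input and restrained by autoreceptor negative feedback, so weakening that feedback yields a larger dopamine response to an identical reward. It is a deliberately simplified illustration with hypothetical parameters, not a biophysical model.

```python
# Purely illustrative toy model of the "loss of buffering capacity" account;
# not a biophysical simulation. All parameters and values are hypothetical.

def dopamine_response(reward_input, feedback_gain, steps=200, dt=0.1):
    """Track extracellular dopamine under a constant reward input.

    feedback_gain stands in for autoreceptor "buffering": the higher it is,
    the more strongly further release is inhibited as dopamine accumulates.
    """
    da = 0.0
    trace = []
    for _ in range(steps):
        release = reward_input / (1.0 + feedback_gain * da)  # autoreceptor inhibition
        clearance = 0.5 * da                                 # reuptake and degradation
        da += dt * (release - clearance)
        trace.append(da)
    return trace

childhood = dopamine_response(reward_input=1.0, feedback_gain=2.0)    # intact buffering
adolescence = dopamine_response(reward_input=1.0, feedback_gain=0.2)  # reduced buffering

# Same reward input, larger dopamine signal once the feedback is weakened.
print(f"steady-state DA, intact buffering:  {childhood[-1]:.2f}")   # ~0.78
print(f"steady-state DA, reduced buffering: {adolescence[-1]:.2f}")  # ~1.53
```

Under these illustrative assumptions, the same reward input produces roughly twice the steady-state dopamine signal once the feedback gain is reduced – the heightened reward salience described above – without positing any deficit in the rewardingness of the stimulus itself.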
Steroid-Independent and Steroid-Dependent Processes
I noted earlier that it is common to attribute this dopaminergic-mediated change in reward salience and reward-seeking to the impact of pubertal hormones on the brain, an attribution that I myself made in earlier writings on the subject (e.g., Steinberg, 2004). Although this remodeling is coincident with puberty, however, it is not clear that it is directly caused by it. Animals that have had their gonads removed prepubertally (and thus do not experience the increase in sex hormones associated with pubertal maturation) show the same patterns of dopamine receptor proliferation and pruning as animals that have not been gonadectomized (Andersen, Thompson, Krenzel, & Teicher, 2002). Thus it is important to distinguish between puberty (the process that leads to reproductive maturation) and adolescence (the behavioral, cognitive, and socioemotional changes of the period), which are not the same thing, either conceptually or neurobiologically. As Sisk and Foster explain, “gonadal maturation and behavioral maturation are two distinct brain-driven processes with separate timing and neurobiological mechanisms, but they are intimately coupled through iterative interactions between the nervous system and gonadal steroid hormones” (Sisk & Foster, 2004, p. 1040). Thus, there may well be a maturationally-driven increase in reward salience and reward-seeking in early adolescence that has a strong biological basis, that is contemporaneous with puberty, but that may be only partially related to changes in gonadal hormones in early adolescence.

In point of fact, many behavioral changes that occur at puberty (and that are sometimes mistakenly attributed to puberty) are pre-programmed by a biological clock whose timing makes them coincident with, but independent of, changes in pubertal sex hormones. Accordingly, some changes in adolescent neurobiological and behavioral functioning at puberty are steroid-independent, others are steroid-dependent, and others are the product of an interaction between the two (where steroid-independent processes affect susceptibility to steroid-dependent ones) (Sisk & Foster, 2004). Moreover, within the category of steroid-dependent changes are those that are the outcome of hormonal influences on brain organization during the pre- and perinatal periods, which set in motion changes in behavior that do not manifest themselves until puberty (referred to as organizational effects of sex hormones); changes that are the direct result of hormonal influences at puberty (both on brain organization and on psychological and behavioral functioning, the latter of which are referred to as activational effects); and changes that are the result of the interaction between organizational and activational influences. Even sexual behavior, for example, which we normally associate with the hormonal changes of puberty, is regulated by a combination of organizational, activational, and steroid-independent processes. At this point, the extent to which changes in dopaminergic functioning at puberty are (1) steroid-independent, (2) due to the organizational effects of exposure to sex steroids (either early in life or during adolescence, which may build on or amplify early organizational influences), (3) due to the activational influences of sex steroids at puberty, or, more likely, (4) due to some mix of these factors has not been determined.
It may be the case, for instance, that the structural remodeling of the dopaminergic system is not influenced by gonadal steroids at puberty but that its functioning is (Cameron, 2004; Sisk & Zehr, 2005).

There is also reason to hypothesize that sensitivity to the organizational effects of pubertal hormones decreases with age (see Schulz & Sisk, 2006), suggesting that the impact of pubertal hormones on reward-seeking might be stronger among early maturers than among on-time or late maturers. Early maturers may also be especially prone to risk-taking because there is a longer temporal gap between the change in the dopaminergic system and the full maturation of the cognitive control system. Given these biological differences, we would therefore expect to see higher rates of risk-taking among early-maturing adolescents than among their same-aged peers (again, arguing against a purely cognitive account of adolescent recklessness, since there are no major differences in cognitive performance between early and late physical maturers), as well as a drop over historical time in the age of initial experimentation with risky behavior, because of the secular trend toward the earlier onset of puberty. (The average age of menarche in industrialized nations declined by about 3 to 4 months per decade during the first part of the 20th century and continued to drop between the 1960s and 1990s, by about 2½ months in total [see Steinberg, 2008].) There is clear evidence for both of these predictions: Early-maturing boys and girls report higher rates of alcohol and drug use, delinquency, and problem behavior, a pattern seen in different cultures and across different ethnic groups within the United States (Collins & Steinberg, 2006; Deardorff, Gonzales, Christopher, Roosa, & Millsap, 2005; Steinberg, 2008), and the age of experimentation with alcohol, tobacco, and illegal drugs (as well as the age of sexual debut) clearly has declined over time (Johnson & Gerstein, 1998), consistent with the historical decline in the age of pubertal onset.
Adolescent Sensation-Seeking and Evolutionary Adaptation
Although structural changes in the dopaminergic system that occur at puberty may not be directly due to the activational influences of pubertal hormones, it nevertheless makes good evolutionary sense that the emergence of some behaviors, such as sensation-seeking, occurs around puberty, especially among males (among whom the dopaminergic remodeling is more pronounced, as noted earlier) (see also Spear, 2000). Sensation-seeking, because it involves ventures into uncharted waters, carries with it a certain degree of risk, but such risk-taking may be necessary in order to survive and facilitate reproduction. As Belsky and I have written elsewhere, “The willingness to take risks, even life-threatening risks, might well have proved advantageous to our ancestors when refusing to incur such risk was in fact even more dangerous to survival or reproduction. However chancy running through a burning savannah or attempting to cross a swollen stream might have been, not doing so might have been even more risky” (Steinberg & Belsky, 1996, p. 96). To the extent that individuals inclined to take such risks were differentially advantaged when it came to surviving and producing descendants who would themselves survive and reproduce in future generations, natural selection would favor the preservation of inclinations toward at least some risk-taking behavior during adolescence, when sexual reproduction begins. In addition to promoting survival in inherently risky situations, risk-taking might also confer advantages, especially upon males, by means of dominance displays and through a process called “sexual selection” (Diamond, 1992). With respect to dominance displays, being willing to take risks might well have been a tactic for achieving and maintaining dominance in social hierarchies. Such means of status attainment and maintenance might have been selected for not only because they contributed to obtaining for oneself and one’s kin a disproportionate share of physical resources (e.g., food, shelter, clothing), but because they also increased reproductive opportunities by preventing other males from mating. To the extent that dominance displays mediate the link between risk-taking and reproduction, it makes good evolutionary sense to delay the increase in risk-taking until pubertal maturation has taken place, so that risk-takers are more adult-like in strength and appearance. With respect to sexual selection, displays of sensation-seeking by males may have sent messages about their desirability as a sexual partner to prospective mates. It makes biological sense for males to engage in those behaviors that attract females and for females to choose males most likely to bear offspring with high prospects of surviving and reproducing themselves (Steinberg & Belsky, 1996). In aboriginal societies that are studied by anthropologists to gain insight into the conditions under which human behavior evolved (e.g., the Ache in Paraguay; the Yanomamö in Brazil; the !Kung in southern Africa), “young men are constantly being assessed as prospects by those who might select them as husbands and lovers…” (Wilson & Daly, 1993, p. 99, emphasis in original). Moreover, “prowess in hunting, warfare, and other dangerous activity is evidently a major determinant of young men’s marriageability” (Wilson & Daly, 1993, p. 98). Readers skeptical of this evolutionary argument are reminded of the wealth of literary and cinematic allusions to the fact that adolescent girls find “bad boys” sexually appealing.
Even in contemporary society, there is empirical evidence that adolescent girls prefer, and find more attractive, dominant and aggressive boys (Pellegrini & Long, 2003).

Although the notion that risk-taking is adaptive in adolescence makes more intuitive sense when applied to the analysis of male than female behavior, and although there is evidence that male adolescents engage in some forms of real-world risk-taking more frequently than females (Harris, Jenkins, & Glaser, 2006), sex differences in risk-taking are not always seen in laboratory studies of risk-taking (e.g., Galvan et al., 2007). Moreover, higher levels of risk-taking among adolescents versus adults have been reported in studies of females as well as males (Gardner & Steinberg, 2005). The fact that the gender gap in real-world risk-taking appears to be narrowing (Byrnes, Miller, & Schafer, 1999) and that imaging studies employing risk-taking paradigms do not find gender differences (Galvan et al., 2007) suggests that sex differences in risky behavior may be mediated more by context than by biology.
Changes in Sensation Seeking, Risk-Taking, and Reward Sensitivity in Early Adolescence
Several findings from a recent study my colleagues and I have conducted on age differences in capacities that likely affect risk-taking are consistent with the notion that early adolescence in particular is a time of important changes in individuals’ inclinations toward risk-taking (see Steinberg, Cauffman, Woolard, Graham, & Banich, 2007 for a description of the study). To my knowledge, this is one of the only studies of these phenomena with a sample that spans a wide enough age range (from 10 to 30 years) and is large enough (N=935) to examine developmental differences across preadolescence, adolescence, and early adulthood. Our battery included a number of widely-used self-report measures, including the Benthin Risk Perception Measure (Benthin, Slovic, & Severson, 1993), the Barratt Impulsiveness Scale (Patton, Stanford, & Barratt, 1995), and the Zuckerman Sensation-Seeking Scale (Zuckerman et al., 1978), as well as several new ones developed for this project, including a measure of Future Orientation (Steinberg et al., 2007) and a measure of Resistance to Peer Influence (Steinberg & Monahan, in press). The battery also included numerous computer-administered performance tasks, including the Iowa Gambling Task, which measures reward sensitivity (Bechara, Damasio, Damasio, & Anderson, 1994); a Delay Discounting task, which measures relative preference for immediate versus delayed rewards (Green, Myerson, & Ostaszewski, 1999); and the Tower of London, which measures planning ahead (Berg & Byrd, 2002).

We found a curvilinear relation between age and the extent to which individuals reported that the benefits outweighed the costs of various risky activities, such as having unprotected sex or riding in a car driven by someone who had been drinking, and between age and self-reported sensation-seeking (Steinberg, 2006). Because our version of the Iowa Gambling Task permitted us to create independent measures of respondents’ selection of decks that produced monetary gains versus their avoidance of decks that produced monetary losses, we could look separately at age differences in reward and punishment sensitivity. Interestingly, we found a curvilinear relation between age and reward sensitivity, similar to the pattern seen for risk preference and sensation-seeking, but not between age and punishment sensitivity, which increased linearly (Cauffman, Claus, Shulman, Banich, Graham, Woolard, & Steinberg, 2007). More specifically, scores on sensation-seeking, risk preference, and reward sensitivity all increased from age 10 until mid-adolescence (peaking somewhere between 13 and 16, depending on the measure) and declined thereafter. Preference for short-term rewards in the Delay Discounting task was greatest among the 12- to 13-year-olds (Steinberg, Graham, O’Brien, Woolard, Cauffman, & Banich, 2007), also consistent with heightened reward sensitivity around puberty. In contrast, scores on measures of other psychosocial phenomena, such as future orientation, impulse control, and resistance to peer influence, as well as punishment sensitivity on the Iowa Gambling Task and planning on the Tower of London task, showed a linear increase over this same age period, suggesting that the curvilinear pattern observed with respect to sensation-seeking, risk preference, and reward sensitivity is not simply a reflection of more general psychosocial maturation.
As I will explain, these two different patterns of age differences are consistent with the neurobiological model of developmental change in risk-taking I set forth in this article. The increase in sensation-seeking, risk preference, and reward sensitivity between preadolescence and middle adolescence observed in our study is consistent with behavioral studies of rodents showing an especially significant increase in reward salience around the time of puberty (e.g., Spear, 2000). There is also evidence of a shift in the anticipated consequences of risk-taking: among children, risky behavior is more likely to be associated with the anticipation of negative consequences, whereas among adolescents it is more likely to be associated with the anticipation of positive ones, a developmental shift that is accompanied by an increase in activity in the nucleus accumbens during risk-taking tasks (Galvan et al., 2007).
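For readers unfamiliar with how delay discounting is typically quantified, the sketch below illustrates the standard hyperbolic model, in which the subjective value of a reward of amount A delayed by D time units is V = A / (1 + kD), and a larger discount rate k indicates a stronger preference for immediate rewards (the tendency that peaked among the 12- to 13-year-olds). The code and the choice data are hypothetical illustrations of the general technique, not the scoring procedure used in the study described above.

```python
# A minimal sketch of hyperbolic delay discounting and a simple way to
# estimate an individual's discount rate k from binary choices.
# All choice data and parameter values below are hypothetical.

def subjective_value(amount, delay, k):
    """Hyperbolic discounting: V = A / (1 + k * D).

    Larger k means steeper discounting, i.e., a stronger preference for
    immediate over delayed rewards.
    """
    return amount / (1.0 + k * delay)

def fit_k(choices, k_grid=None):
    """Grid-search the k that best predicts observed immediate/delayed choices.

    choices: list of (immediate_amount, delayed_amount, delay_days, chose_immediate)
    """
    if k_grid is None:
        k_grid = [i / 1000.0 for i in range(1, 200)]  # candidate rates 0.001-0.199
    best_k, best_hits = None, -1
    for k in k_grid:
        hits = 0
        for imm, delayed, delay, chose_imm in choices:
            # Predict "immediate" whenever it exceeds the discounted delayed value.
            predicted_imm = imm > subjective_value(delayed, delay, k)
            hits += (predicted_imm == chose_imm)
        if hits > best_hits:
            best_k, best_hits = k, hits
    return best_k

# Hypothetical choices: ($ now, $ later, delay in days, chose immediate?)
choices = [
    (200, 1000, 30, False),
    (600, 1000, 30, True),
    (800, 1000, 180, True),
    (300, 1000, 365, True),
]
print(f"estimated discount rate k: {fit_k(choices):.3f}")
```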
Changes in Neural Oxytocin at Puberty
The remodeling of the dopaminergic system is one of several important changes in synaptic organization that likely undergird the increase in risk-taking that takes place early in adolescence. Another important change in synaptic organization is more directly linked to the rise in gonadal hormones at puberty. In general, studies find that gonadal steroids exert a strong influence on memory for social information and on social bonding (Nelson, Leibenluft, McClure, & Pine, 2005), and that these influences are mediated, at least in part, through the influence of gonadal steroids on the proliferation of receptors for oxytocin (a hormone that also functions as a neurotransmitter) in various limbic structures, including the amygdala and nucleus accumbens. Although most work on changes in oxytocin receptors at puberty has examined the role of estrogen (e.g., Miller et al., 1989; Tribollet, Charpak, Schmidt, Dubois-Dauphin, & Dreifuss, 1989), there is also evidence of similar effects of testosterone (Chibbar et al., 1990; Insel et al., 1993). Moreover, in contrast to studies of gonadectomized rodents, which indicate few effects of gonadal steroids at puberty on dopamine receptor remodeling (Andersen et al., 2002), experimental studies that manipulate gonadal steroids at puberty through post-gonadectomy administration of steroids indicate direct effects of estrogen and testosterone on oxytocin-mediated neurotransmission (Chibbar et al., 1990; Insel et al., 1993).

Oxytocin is perhaps best known for the role it plays in social bonding, especially with respect to maternal behavior, but it is also important in regulating the recognition and memory of social stimuli (Insel & Fernald, 2004; Winslow & Insel, 2004). As Nelson et al. note, “gonadal hormones have important effects on how structures within the [socio-emotional system] respond to social stimuli, and will ultimately influence the emotional and behavioral responses elicited by a social stimulus during adolescence” (2005, p. 167). These hormonal changes help explain why, relative to children and adults, adolescents show especially heightened activation of limbic, paralimbic, and medial prefrontal areas in response to emotional and social stimuli, including faces with varying emotional expressions and social feedback. They also explain why early adolescence is a time of heightened awareness of others’ opinions, so much so that adolescents often engage in “imaginary audience” behavior, which involves having such a strong sense of self-consciousness that the teenager imagines that his or her behavior is the focus of everyone else’s concern and attention. Feelings of self-consciousness increase during early adolescence, peak around age 15, and then decline (Rankin, Lane, Gibbons, & Gerrard, 2004). This rise and fall in self-consciousness has been attributed both to changes in hypothetical thinking (Elkind, 1967) and to fluctuations in social confidence (Rankin, Lane, Gibbons, & Gerrard, 2004), and although these may in fact be contributors to the phenomenon, the arousal of the socio-emotional network as a result of increases in pubertal hormones probably plays a role as well.
Peer Influences on Risk-Taking
The proposed link between the proliferation of oxytocin receptors and increased risk-taking in adolescence is not intuitively obvious; indeed, given the importance of oxytocin in maternal bonding, one might predict just the reverse (i.e., it would be disadvantageous for mothers to engage in risky behavior while caring for highly dependent offspring). My argument is not that the increase in oxytocin leads to risk-taking, however, but that it leads to an increase in the salience of peer relations, and that this increase in the salience of peers plays a role in encouraging risky behavior. The heightened attentiveness to social stimuli that results as a consequence of puberty is particularly important in understanding adolescent risk-taking. One of the hallmarks of adolescent risk-taking is that it is far more likely than that of adults to occur in groups. The degree to which an adolescent’s peers use alcohol or illicit drugs is one of the strongest, if not the single strongest, predictor of that adolescent’s own substance use (Chassin et al., 2004). Research on automobile accidents indicates that the presence of same-aged passengers in a car driven by an adolescent driver significantly increases the risk of a serious accident (Simons-Morton, Lerner, & Springer, 2005). Adolescents are more likely to be sexually active when their peers are (DiBlasio & Benda, 1992; East, Felice, & Morgan, 1993; Udry, 1987) and when they believe that their friends are sexually active, whether or not their friends actually are (Babalola, 2004; Brooks-Gunn & Furstenberg, 1989; DiIorio et al., 2001; Prinstein, Meade, & Cohen, 2003). And statistics compiled by the Federal Bureau of Investigation show quite compellingly that adolescents are far more likely than adults to commit crimes in groups than by themselves (Zimring, 1998).

There are several plausible explanations for the fact that adolescent risk-taking often occurs in groups. The relatively greater prevalence of group risk-taking observed among adolescents may stem from the fact that adolescents simply spend more time in peer groups than adults do (Brown, 2004). An alternative view is that the presence of peers activates the same neural circuitry implicated in reward processing, and that this impels adolescents toward greater sensation-seeking. In order to examine whether the presence of peers plays an especially important role in risk-taking during adolescence, we conducted an experiment in which adolescents (mean age 14), youths (mean age 20), and adults (mean age 34) were randomly assigned to complete a battery of computerized tasks under one of two conditions: alone or in the presence of two friends (Gardner & Steinberg, 2005). One of the tasks included in this study was a video driving game that simulates the situation in which one is approaching an intersection, sees a traffic light turn yellow, and tries to decide whether to stop or proceed through the intersection. In the task, a moving car is on the screen, and a yellow traffic light appears, signaling that at some point soon, a wall will appear and the car will crash. Loud music is playing in the background. As soon as the yellow light appears, participants must decide whether to keep driving or apply the brakes. Participants are told that the longer they drive, the more points they earn, but that if the car crashes into the wall, all the points that have been accumulated are lost.
The amount of time that elapses between the appearance of the light and the appearance of the wall is varied across trials, so there is no way to anticipate when the car will crash. Individuals who are more inclined to take risks in this game drive the car longer than those who are more risk averse. When subjects were alone, levels of risky driving were comparable across the three age groups. However, the presence of friends doubled risk-taking among the adolescents, increased it by fifty percent among the youths, but had no effect on the adults, a pattern that was identical among both males and females (not surprisingly, we did find a main effect for sex, with males taking more risks than females). The presence of peers also increased individuals’ stated willingness to behave in an antisocial fashion significantly more among younger than older subjects, again, among both males and females.

Further evidence that the impact of peers on adolescent risk-taking may be neurally mediated by heightened activation of the socioemotional network comes from some pilot work we have conducted with two male 19-year-old subjects (Steinberg & Chein, 2006). In this work, we collected fMRI data while the subjects performed an updated version of the driving task, in which they encountered a series of intersections with traffic lights that turned yellow and had to decide whether to attempt to drive through the intersection (which would increase their reward if they made it through safely but decrease it if they crashed into an approaching car) or apply the brakes (which would decrease their reward but not as much as if they crashed the car). As in the Gardner and Steinberg (2005) study, subjects came to the lab with two friends, and we manipulated the peer context by having the peers either present in the magnet control room (viewing the subject’s behavior on an external computer monitor and receiving a share of the subject’s monetary incentives) or moved to an isolated room. Subjects performed two runs of the driving task in the peer-present condition, and two in the peer-absent condition; in the peer-present condition, they were told that their friends would be watching, and in the peer-absent condition, they were told that their friends would not be able to see their performance. Behavioral data collected from subjects in the scanner indicated an increase in risk-taking in the presence of peers that was similar in magnitude to that observed in the earlier study, as evidenced by an increase in the number of crashes and a concomitant decrease in the frequency of braking when the traffic lights turned yellow. Examination of the fMRI data indicated that the presence of peers activated certain regions that were not activated when the driving game was played in the peer-absent condition. As expected, regardless of peer condition, decisions in the driving task elicited a widely distributed network of brain regions, including prefrontal and parietal association cortices (regions linked to cognitive control and reasoning). But in the peer-present condition, we also saw increased activity in the medial frontal cortex, left ventral striatum (primarily in the accumbens), left superior temporal sulcus, and left medial temporal structures. In other words, the presence of peers activated the socio-emotional network and led to more risky behavior. This is pilot work, of course, so it is important to be very cautious in its interpretation.
But the fact that the presence of peers activated the same circuitry that is activated by exposure to reward is consistent with the notion that peers may actually make potentially rewarding – and potentially risky – activities even more rewarding. In adolescence, then, more might not only be merrier – more may also be riskier.
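To clarify the incentive structure of the driving task described above, the sketch below simulates its essential logic: points accumulate the longer one keeps driving after the yellow light appears, a crash forfeits the trial’s points, and the moment of the crash varies unpredictably across trials. All parameter values are hypothetical and chosen for illustration; this is not the actual task software.

```python
import random

# Illustrative simulation of the yellow-light driving task's reward logic.
# Parameter values (points per second, crash-time range) are hypothetical.

def run_trial(brake_after, points_per_second=10, max_crash_delay=8.0):
    """One trial: the wall appears at a random moment after the yellow light.

    brake_after: the participant's chosen driving time (seconds) before braking.
    Returns the points earned on this trial.
    """
    crash_time = random.uniform(0.5, max_crash_delay)
    if brake_after < crash_time:
        return points_per_second * brake_after  # braked safely: keep the points
    return 0  # crashed: all accumulated points are lost

def expected_points(brake_after, n_trials=100_000):
    """Monte Carlo estimate of the expected payoff for a given strategy."""
    return sum(run_trial(brake_after) for _ in range(n_trials)) / n_trials

# Riskier strategies earn more per safe trial but crash more often.
for strategy in (1.0, 3.0, 5.0, 7.0):
    print(f"drive {strategy:.0f}s -> expected points ~ {expected_points(strategy):.1f}")
```

Under these illustrative parameters, expected payoff peaks at an intermediate driving time, so both overly cautious and overly risky strategies earn fewer points; how long a participant chooses to keep driving indexes his or her risk preference, and the experimental question is how that choice shifts when peers are watching.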
Summary: Arousal of the Socio-Emotional System at Puberty
In summary, there is strong evidence that the pubertal transition is associated with a substantial increase in sensation-seeking that is likely due to changes in reward salience and reward sensitivity resulting from a biologically-driven remodeling of dopaminergic pathways in what I have called the socio-emotional brain system. This neural transformation is accompanied by a significant increase in oxytocin receptors, also within the socio-emotional system, which in turn heightens adolescents’ attentiveness to, and memory for, social information. As a consequence of these changes, relative to prepubertal individuals, adolescents who have gone through puberty are more inclined to take risks in order to gain rewards, an inclination that is exacerbated by the presence of peers. This increase in reward-seeking is most apparent during the first half of the adolescent decade, begins around the onset of puberty, and likely peaks sometime around age 15, after which it begins to decline. Behavioral manifestations of these changes are evident in a wide range of experimental and correlational studies using a diverse array of tasks and self-report instruments, are seen across many mammalian species, and are logically linked to well-documented structural and functional changes in the brain. This set of assertions must be tempered, however, in view of the absence of direct evidence in humans linking the biology with the behavior. As noted earlier, the fact that particular sets of neurobiological and behavioral changes occur concurrently in development can only be taken as suggestive of a connection between them. More research that simultaneously examines brain structure and function and their relation to risky behavior, either in studies of age differences or in studies of individual differences, is much needed.

It also is important to emphasize that, although the increase in sensation-seeking observed in early adolescence may be maturationally driven, not all individuals manifest this inclination in the form of dangerous, harmful, or reckless behavior. As Dahl notes, “For some adolescents, this tendency to activate strong emotions and this affinity for excitement can be subtle and easily managed. In others these inclinations toward high-intensity feelings can lead to emotionally-charged and reckless adolescent behaviors and at times to impulsive decisions by (seemingly) intelligent youth that are completely outrageous” (2004, p. 8). Presumably, many factors moderate and modulate the translation of sensation-seeking into risky behavior, including maturational timing (i.e., with early maturers at greater risk), opportunities to engage in antisocial risk-taking (e.g., the degree to which adolescents’ behavior is monitored by parents and other adults, the availability of alcohol and drugs, and so forth), and temperamental predispositions that may amplify or attenuate tendencies to engage in potentially dangerous activities. Individuals who are behaviorally inhibited by nature, prone to high levels of anxiety, or especially fearful would be expected to shy away from harmful activities. For example, a recent follow-up of adolescents who had been highly reactive as infants (i.e., exhibiting high motor activity and frequent crying) found them to be significantly more nervous, introverted, and morose than their counterparts who had been low-reactive (Kagan, Snidman, Kahn, & Towsley, 2007).
Why Does Risk-Taking Decline Between Adolescence and Adulthood?
There are two plausible neurobiological processes that may help account for the decline in risky behavior that occurs between adolescence and adulthood. The first, which has received only scant attention, is that further changes in the dopaminergic system, or in reward processing mediated by some other neurotransmitter, take place in late adolescence, altering reward sensitivity and, in turn, diminishing reward-seeking. Little is known about changes in reward-seeking after adolescence, however, and there remain inconsistencies in the literature with respect to age differences in reward sensitivity after adolescence (cf. Bjork et al., 2004; Ernst et al., 2005; Galvan et al., 2006), likely due to methodological differences between studies in the manipulation of reward salience (e.g., whether the comparison of interest is in reward versus cost or among rewards of different magnitudes) and whether the task involves the anticipation or actual receipt of the reward. Nevertheless, studies of age differences in sensation-seeking (in addition to our own) show a decrease in this tendency after age 16 (Zuckerman et al., 1978), and there is some behavioral evidence (Millstein & Halpern-Felsher, 2002) suggesting that adolescents may be more sensitive than adults to variation in rewards and comparably or even less sensitive to variation in costs, a pattern borne out in our Iowa Gambling Task data (Cauffman et al., 2007).

A more likely (although not mutually exclusive) cause of the decline in risky activity after adolescence concerns the development of self-regulatory capacities that occurs over the course of adolescence and during the 20s. Considerable evidence suggests that higher-level cognition, including the uniquely human capacities for abstract reasoning and deliberative action, is supported by a recently evolved brain system including the lateral prefrontal and parietal association cortices and parts of the anterior cingulate cortex to which they are highly interconnected. The maturation of this cognitive control system during adolescence is likely a primary contributor to the decline in risk-taking seen between adolescence and adulthood. This account is consistent with a growing body of work on structural and functional changes in the prefrontal cortex, which plays a substantial role in self-regulation, and in the maturation of neural connections between the prefrontal cortex and the limbic system, which permits better coordination of emotion and cognition. These changes permit the individual to put the brakes on impulsive sensation-seeking behavior and to resist the influence of peers, which, together, should diminish risk-taking.
Structural Maturation of the Cognitive Control System
Three important changes in brain structure during adolescence are now well-documented (see Paus, 2005, for a summary). First, there is a decrease in gray matter in prefrontal regions of the brain during adolescence, reflective of synaptic pruning, the process through which unused neuronal connections are eliminated. This elimination of unused neuronal connections occurs mainly during preadolescence and early adolescence, the period during which major improvements in basic information processing and logical reasoning are seen (Keating, 2004; Overton, 1990), consistent with the timetable for synaptic pruning in the prefrontal cortex, most of which is complete by mid-adolescence (Casey et al., 2005; see also Casey, Getz, & Galvan, this issue). Although some improvements in these cognitive capacities continue until age 20 or so (Kail, 1991, 1997), changes after mid-adolescence are very modest in magnitude and tend to be seen mainly in studies employing relatively demanding cognitive tasks on which performance is facilitated by greater connectivity among cortical areas, permitting more efficient processing (see below). In our study of capacities related to risk-taking described earlier, we saw no improvement in basic cognitive processes, such as working memory or verbal fluency, after age 16 (Steinberg et al., 2007).

Second, there is an increase in white matter in these same regions, reflective of myelination, the process through which nerve fibers become sheathed in myelin, a fatty substance that provides a sort of insulation of the neural circuitry. Unlike the synaptic pruning of the prefrontal areas, which takes place in early adolescence, myelination is ongoing well into the second decade of life and perhaps beyond (Lenroot, Gogtay, Greenstein, Wells, Wallace, Clasen, et al., 2007). Improved connectivity within the prefrontal cortex should be associated with subsequent improvements in higher-order functions subserved by multiple prefrontal areas, including many aspects of executive function, such as response inhibition, planning ahead, weighing risks and rewards, and the simultaneous consideration of multiple sources of information. In contrast to our findings with respect to basic information processing, which showed no maturation beyond age 16, we found continued improvement beyond this age in self-reported future orientation (which increased through age 18) and in planning (as indexed by the amount of time subjects waited before making their first move on the Tower of London task, which increased not only through adolescence but through the early 20s).

Generally speaking, performance on tasks that activate the frontal lobes continues to improve through middle adolescence (until about age 16 on tasks of moderate difficulty), in contrast to performance on tasks that activate more posterior brain regions, which reaches adult levels by the end of preadolescence (Conklin, Luciana, Hooper, & Yarger, 2007). Improved executive function in adolescence is reflected in better performance with age on tasks known to activate the dorsolateral prefrontal cortex, such as relatively difficult tests of spatial working memory (Conklin et al., 2007) or especially challenging tests of response inhibition (Luna et al., 2001); and the ventromedial prefrontal cortex, such as the Iowa Gambling Task (Crone & van der Molen, 2004; Hooper, Luciana, Conklin, & Yarger, 2004).
Although some tests of executive function simultaneously activate both the dorsolateral and ventromedial regions, there is some evidence that the maturation of these regions may take place along somewhat different timetables, with performance on exclusively ventromedial tasks reaching adult levels somewhat earlier than performance on exclusively dorsolateral tasks (Conklin et al., 2007; Hooper et al., 2004). In one recent study of age differences in cognitive performance using tasks known to differentially activate these two prefrontal regions, there was age-related improvement into middle adolescence on both types of tasks, but there were no significant correlations between performance on the ventromedial and dorsolateral tasks, suggesting that maturation of the ventromedial prefrontal cortex may be developmentally distinct from maturation of the dorsolateral prefrontal cortex (Hooper et al., 2004). Performance on especially difficult tasks known to activate dorsolateral areas continues to improve during late adolescence (Crone, Donohue, Honomichl, Wendelken, & Bunge, 2006; Luna et al., 2001).

Third, as evidenced by the proliferation of white matter tracts across different brain regions, there is an increase not only in connections among cortical areas (and between different areas of the prefrontal cortex) but also between cortical and subcortical areas (especially between the prefrontal regions and the limbic and paralimbic areas, including the amygdala, nucleus accumbens, and hippocampus) (Eluvathingal, Hasan, Kramer, Fletcher, & Ewing-Cobbs, 2007). This third anatomical change should be associated with improved coordination of affect and cognition, reflected in improved emotion regulation and facilitated by the increased connectivity between regions important in the processing of emotional and social information (e.g., the amygdala, ventral striatum, orbitofrontal cortex, medial prefrontal cortex, and superior temporal sulcus) and regions important in cognitive control (e.g., the dorsolateral prefrontal cortex, anterior and posterior cingulate, and temporo-parietal cortices). Consistent with this, we found increases in self-reported impulse control through the mid-20s (Steinberg, 2006).
Functional Changes in the Cognitive Control System
Functional studies of brain development in adolescence are largely consistent with the findings from structural studies and from studies of cognitive and psychosocial development. Several overarching conclusions can be drawn from this research. First, studies point to a gradual development of cognitive control mechanisms over the course of adolescence and early adulthood, consistent with the anatomical changes in the dorsolateral prefrontal cortex described earlier. Imaging studies examining performance on tasks requiring cognitive control (e.g., Stroop, flanker, Go/No-Go, and antisaccade tasks) have shown that adolescents tend to recruit the relevant network less efficiently than do adults, and that regions whose activity correlates with task performance (i.e., cognitive control areas) become more focally activated with age (Durston et al., 2006). It has been suggested that this increasingly focal engagement of cognitive control areas reflects a strengthening of connections within the control network and of its projections to other regions, a claim consistent with data on increased connectivity among cortical areas with development (Liston et al., 2006).

Improved performance on cognitive control tasks between childhood and adulthood is accompanied by two different functional changes. Between childhood and adolescence, there appears to be an increase in activation of the dorsolateral prefrontal cortex (Adleman et al., 2002; Casey et al., 2000; Durston et al., 2002; Luna et al., 2001; Tamm et al., 2002), consistent with the synaptic pruning and myelination of this region at this time. The period between adolescence and adulthood, in contrast, appears to be one of fine-tuning rather than of overall increases or decreases in activation (Brown et al., 2005), presumably facilitated by the more extensive connectivity within and across brain areas (Crone et al., 2006; Luna et al., 2001). For example, imaging studies using tasks in which individuals are asked to inhibit a “prepotent” response, such as trying to look away from, rather than toward, a point of light (an antisaccade task), have shown that adolescents tend to recruit the cognitive control network less selectively and efficiently than do adults, perhaps overtaxing the capacity of the regions they activate (Luna et al., 2001). In essence, whereas the advantage that adolescents have over children in cognitive control inheres in the maturation of brain regions implicated in executive function (mainly, the dorsolateral prefrontal cortex), the reason the cognitive control system of adults is more effective than that of adolescents may be that adults’ brains evince more differentiated activation in response to different task demands. This would be consistent with the notion that performance on relatively basic tests of executive processing reaches adult levels around age 16, whereas performance on especially challenging tasks, which may require more efficient activation, continues to improve in late adolescence. Although the cognitive control network is clearly implicated in reasoning and decision-making, several recent findings suggest that decision-making is often governed by a competition between this network and the socio-emotional network (Drevets & Raichle, 1998).
This competitive interaction has been implicated in a wide range of decision-making contexts, including drug use (Bechara, 2005; Chambers, 2003), social decision processing (Sanfey et al., 2003), moral judgments (Greene et al., 2004), and the valuation of alternative rewards and costs (McClure et al., 2004; Ernst et al., 2004), as well as in an account of adolescent risk-taking (Chambers, 2003). In each instance, impulsive or risky choices are presumed to arise when the socio-emotional network dominates the cognitive control network; more specifically, risk-taking is more likely when the socio-emotional network is relatively more activated or when processes mediated by the cognitive control network are disrupted. For example, McClure et al. (2004) have shown that decisions reflecting a preference for smaller immediate rewards over larger delayed rewards are associated with relatively increased activation of the ventral striatum, orbitofrontal cortex, and medial prefrontal cortex, all regions linked to the socio-emotional network, whereas regions implicated in cognitive control (dorsolateral prefrontal cortex, parietal areas) are engaged equivalently across decision conditions. Similarly, two recent studies (Matthews et al., 2004; Ernst et al., 2004) show that increased activity in regions of the socio-emotional network (ventral striatum, medial prefrontal cortex) predicts the selection of comparatively risky (but potentially highly rewarding) choices over more conservative ones. Finally, one recent experimental study found that transient disruption of right dorsolateral prefrontal cortical function via transcranial magnetic stimulation (i.e., disruption of a region known to be crucial to cognitive control) increased risk-taking in a gambling task (Knoch, Gianotti, Pascual-Leone, Treyer, Regard, Hohmann, et al., 2006).
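To make the kind of intertemporal choice studied by McClure et al. (2004) concrete, the sketch below models a chooser who discounts delayed rewards hyperbolically, one standard formalization of why a smaller immediate reward can be preferred over a larger delayed one. It is purely illustrative: the hyperbolic form, the discount parameter k, and the dollar amounts are assumptions of this sketch, not details taken from that study.

    # Minimal sketch of intertemporal choice under hyperbolic discounting.
    # Illustrative only: the model and all parameter values are assumptions,
    # not taken from McClure et al. (2004).

    def subjective_value(amount: float, delay_days: float, k: float) -> float:
        """Hyperbolic discounting: value = amount / (1 + k * delay)."""
        return amount / (1.0 + k * delay_days)

    def choose(immediate: float, delayed: float, delay_days: float, k: float) -> str:
        """Return the option preferred by a chooser with discount rate k."""
        v_now = subjective_value(immediate, 0.0, k)
        v_later = subjective_value(delayed, delay_days, k)
        return "immediate" if v_now >= v_later else "delayed"

    # A steep discounter (high k) takes $20 now over $30 in two weeks,
    # since 30 / (1 + 0.10 * 14) = 12.5 < 20; a shallow discounter waits.
    print(choose(20, 30, 14, k=0.10))  # -> immediate
    print(choose(20, 30, 14, k=0.01))  # -> delayed

On this reading, a relatively stronger socio-emotional network corresponds to a steeper discount curve (a larger k) and maturing cognitive control to a flatter one; the imaging findings concern which networks are active when such choices are made, not the discounting equation itself.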
Coordination of Cortical and Subcortical Functioning
A second, but less well documented, change in brain function during adolescence involves the increasing involvement of multiple brain regions in tasks requiring the processing of emotional information (e.g., facial expressions, emotionally arousing stimuli). Although it has been widely reported that adolescents show significantly greater limbic activity than adults when exposed to emotional stimuli (a finding popularly interpreted as evidence for adolescents’ “emotionality”), this is not consistently the case. In some such studies adolescents do show a tendency toward relatively more limbic activation than adults (e.g., Baird, Gruber, Fein, Maas, Steingard, Renshaw, et al., 1999; Killgore & Yurgelun-Todd, 2007), but in others, adolescents show relatively more prefrontal activation (e.g., Baird, Fugelsang, & Bennett, 2005; Nelson, McClure, Monk, Zarahn, Leibenluft, Pine, & Ernst, 2003). Much depends on the stimuli used, on whether the stimuli are presented explicitly or subliminally, and on the specific instructions given to the participant (e.g., whether the participant is asked to pay attention to the emotion or to some other aspect of the stimulus material). A more cautious reading of this literature is not that adolescents are unequivocally more prone than adults to activation of subcortical brain systems when presented with emotional stimuli (or that they are more “emotional”), but that they may be less likely to activate multiple cortical and subcortical areas simultaneously, suggesting deficits, relative to adults, in the synchronization of cognition and affect. This lack of cross-talk across brain regions results not only in individuals acting on gut feelings without thinking things through (the stereotypic portrayal of adolescent risk-taking), but also in thinking too much when one’s gut feelings ought to be heeded (which teenagers also do from time to time) (see also Reyna & Farley, 2006, for a discussion of adolescents’ deficiencies in intuitive, or “gist-based,” decision-making).

Few readers would be surprised to hear of studies showing more impulsivity and less deliberative thinking among adolescents than adults. But in one recent study (Baird, Fugelsang, & Bennett, 2005), when asked whether some obviously dangerous activities (e.g., setting one’s hair on fire, swimming with sharks) were “good ideas,” adolescents took significantly longer (i.e., deliberated more) than adults to respond and activated a less narrowly distributed set of cognitive control regions, particularly in the dorsolateral prefrontal cortex – a result reminiscent of Luna’s study of age differences in response inhibition (Luna et al., 2001). This was not the case when the queried activities were not dangerous (e.g., eating salad, taking a walk); here, adolescents and adults performed similarly and showed similar patterns of brain activation. Thus, it may be the lack of coordination between affect and thinking, rather than the dominance of affect over thinking, that characterizes adolescence. This results in two patterns of risk-taking that are behaviorally quite different (impulsively acting before thinking, and overthinking rather than acting on intuition) but that may have a similar neurobiological origin.
The temporal gap between the development of basic information-processing abilities, which is facilitated by maturation of the prefrontal cortex and largely complete by age 16, and the development of abilities that require the coordination of affect and cognition, which is facilitated by improved connections among cortical regions and between cortical and subcortical regions and develops later, is illustrated in Figure 1. The figure is based on data from our study of 10- to 30-year-olds mentioned earlier (Steinberg et al., 2007). The two capacities graphed are basic intellectual ability, a composite score that combines performance on tests of working memory (Thompson-Schill, 2002), digit span, and verbal fluency; and psychosocial maturity, a composite of scores on the self-report measures of impulsivity, risk perception, sensation-seeking, future orientation, and resistance to peer influence mentioned earlier. Mature functioning with respect to these psychosocial capacities requires the effective coordination of emotion and cognition. The figure shows the proportion of individuals in each age group who score at or above the mean level of the 26- to 30-year-olds in our sample on the psychosocial and intellectual composites. As the figure indicates, and consistent with other studies, basic intellectual abilities reach adult levels around age 16, long before the process of psychosocial maturation is complete – well into the young adult years.

Figure 1. Proportion of individuals in each age group scoring at or above the mean for 26- to 30-year-olds on indices of intellectual and psychosocial maturity. From Steinberg et al., 2007.
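For readers who want the computation behind a figure of this kind spelled out, the following is a minimal sketch, with made-up data and hypothetical variable names rather than the authors’ analysis code, of how one might compute, for each age group, the proportion of individuals scoring at or above the mean of the 26- to 30-year-old reference group on a composite measure.

    # Illustrative sketch: proportion of each age group scoring at or above
    # the mean of the 26- to 30-year-old reference group on a composite.
    # The data and variable names here are hypothetical.
    from collections import defaultdict

    subjects = [  # (age, psychosocial_composite) -- made-up values
        (12, 0.8), (14, 1.1), (16, 1.3), (18, 1.6), (22, 1.9),
        (26, 2.0), (28, 2.2), (30, 1.8),
    ]

    # Adult benchmark: mean composite score among 26- to 30-year-olds.
    reference = [score for age, score in subjects if 26 <= age <= 30]
    benchmark = sum(reference) / len(reference)

    # Proportion at or above the benchmark, by five-year age bin.
    groups = defaultdict(list)
    for age, score in subjects:
        low = (age // 5) * 5
        groups[f"{low}-{low + 4}"].append(score >= benchmark)

    for label, flags in sorted(groups.items()):
        print(label, sum(flags) / len(flags))

The actual study composited standardized scores across several measures before this step; the sketch starts from an already-computed composite so the benchmark-and-proportion logic stays visible.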
Changes in Brain Connectivity and the Development of Resistance to Peer Influence
The improved connectivity between cortical and subcortical areas also has implications for understanding changes in susceptibility to peer influence, which, as I noted, is an important contributor to risk behavior during adolescence. Resistance to peer influence, I believe, is achieved through cognitive control of the impulsive reward-seeking behavior that is stimulated by the presence of peers via activation of the socio-emotional network. To the extent that improved coordination between the cognitive control and socio-emotional networks facilitates this regulatory process, we should see gains in resistance to peer influence over the course of adolescence that continue at least into late adolescence (when the maturation of inter-region connections is still ongoing). This is precisely what we have found in our own work, in which we show that gains in self-reported resistance to peer influence continue at least until age 18 (Steinberg & Monahan, in press), and that the actual impact of the presence of peers on risky behavior is still evident among college undergraduates averaging 20 years of age (Gardner & Steinberg, 2005).

Two recent studies of the relation between resistance to peer influence and brain structure and function provide further support for this argument. In an fMRI study of 43 10-year-olds who were exposed to emotionally arousing video clips containing social information (clips of angry hand movements or angry facial expressions), we found that individuals with relatively lower scores on our self-report measure of resistance to peer influence showed significantly greater activation of regions implicated in the perception of others’ actions (i.e., right dorsal premotor cortex), whereas those with relatively higher scores showed greater functional connectivity between these action-processing regions and regions implicated in decision-making (i.e., dorsolateral prefrontal cortex); such differences were not observed when individuals were presented with emotionally neutral clips (Grosbras, Jansen, Leonard, McIntosh, Osswald, Poulsen, et al., 2007). These results suggest that individuals who are especially susceptible to peer influence may be unusually aroused by signs of anger in others but less able to exert inhibitory control over their responses to such stimuli. In a second study, of differences in brain morphology between individuals (aged 12 to 18) scoring high versus low in resistance to peer influence, we found that, after controlling for age, adolescents high in resistance to peer influence showed greater structural connectivity between premotor and prefrontal regions, a pattern consistent with more frequent concurrent engagement of these networks among individuals better able to resist peer pressure (Paus, Toro, Leonard, Lerner, Lerner, Perron, et al., in press). Also consistent with this is work showing that recruitment of cognitive control resources (which would counter impulsive susceptibility to peer pressure) is greater among individuals with stronger connections between frontal and striatal regions (Liston et al., 2006).
Summary: Improvements in Cognitive Control Over Adolescence and Young Adulthood
In sum, risk-taking declines between adolescence and adulthood for two, and perhaps three, reasons. First, the maturation of the cognitive control system, as evidenced by structural and functional changes in the prefrontal cortex, strengthens individuals’ abilities to engage in longer-term planning and to inhibit impulsive behavior. Second, the maturation of connections across cortical areas and between cortical and subcortical regions facilitates the coordination of cognition and affect, which permits individuals to better modulate socially and emotionally aroused inclinations with deliberative reasoning and, conversely, to temper excessively deliberative decision-making with social and emotional information. Finally, there may be developmental changes in patterns of neurotransmission after adolescence that alter reward salience and reward-seeking, but this topic requires further behavioral and neurobiological research before anything definitive can be said.
Implications for Prevention and Intervention
In many respects, then, risk-taking during adolescence can be understood and explained as the product of an interaction between the socio-emotional and cognitive control networks (Drevets & Raichle, 1998), with adolescence being a period in which the former abruptly becomes more assertive at puberty while the latter gains strength only gradually, over a longer period of time. It is important to note, however, that the socio-emotional network is not in a state of constantly high activation, even during early and middle adolescence. Indeed, when the socio-emotional network is not highly activated (for example, when individuals are not emotionally excited or are alone), the cognitive control network is strong enough to impose regulatory control over impulsive and risky behavior, even in early adolescence; recall that in our video driving game study, when individuals were alone we found no age differences in risk-taking between adolescents who averaged 14 and adults who averaged 34 (Gardner & Steinberg, 2005). In the presence of peers or under conditions of emotional arousal, however, the socio-emotional network becomes sufficiently activated to diminish the regulatory effectiveness of the cognitive control network. (We are currently beginning research in our lab to examine whether positive or negative emotional arousal has differential effects on risk-taking during adolescence and adulthood.) Over the course of adolescence, the cognitive control network matures, so that by adulthood, even under conditions of heightened arousal in the socio-emotional network, inclinations toward risk-taking can be modulated.

What does this formulation mean for the prevention of unhealthy risk-taking in adolescence? Given extant research suggesting that the problem is not the way adolescents think or what they do not know or understand, a more profitable strategy than attempting to change how adolescents view risky activities might be to focus on limiting opportunities for immature judgment to have harmful consequences. As I noted in the introduction to this article, more than 90% of all American high school students have had sex, drug, and driver education in their schools, yet large proportions of them still have unsafe sex, binge drink, smoke cigarettes, and drive recklessly (some all at the same time; Steinberg, 2004). Strategies such as raising the price of cigarettes, more vigilantly enforcing laws governing the sale of alcohol, expanding adolescents’ access to mental health and contraceptive services, and raising the driving age would likely be more effective in limiting adolescent smoking, substance abuse, pregnancy, and automobile fatalities than attempts to make adolescents wiser, less impulsive, or less shortsighted. Some things just take time to develop, and mature judgment is probably one of them.

The research reviewed here suggests that heightened risk-taking during adolescence is likely to be normative, biologically driven, and, to some extent, inevitable. There is probably very little we can or ought to do to either attenuate or delay the shift in reward sensitivity that takes place at puberty, a developmental shift that likely has evolutionary origins. It may be possible to accelerate the maturation of self-regulatory competence, but no research has yet examined whether this can be done.
We do know that individuals of the same age vary in their impulse control, planfulness, and susceptibility to peer influence, and that variations in these characteristics are related to variations in risky and antisocial behavior (Steinberg, 2008). Although a wealth of studies shows familial influences on psychosocial maturity in adolescence, indicating that adolescents who are raised in homes characterized by authoritative parenting (i.e., parenting that is warm but firm) are more mature and less likely to engage in risky or antisocial behavior (Steinberg, 2001), we do not know whether this link is mediated by changes in the underlying bases of self-regulation or whether it mainly reflects the imposition of external constraints (through parental monitoring) on adolescents’ access to harmful situations and substances. Nonetheless, there is reason to study whether altering the context in which adolescents develop may have beneficial effects on the development of self-regulatory capacities. Understanding how contextual factors, both inside and outside the family, influence the development of self-regulation, and the neural underpinnings of these processes, should be a high priority for those interested in the physical and psychological well-being of young people.