Braking and Accelerating of the Adolescent Brain
BJ Casey
Rebecca M. Jones
Leah H. Somerville

Summary

Brain regions that respond to reward mature earlier than the regions that support cognitive control, contributing to risky behavior in adolescence. Brain scans support this account, showing heightened reward-related activity in teens relative to both children and adults, alongside control regions that are still immature.

2011


Keywords Adolescence; brain; development; fMRI; risk; incentive; cognitive control; connectivity

Abstract

Adolescence is a developmental period often characterized as a time of impulsive and risky choices leading to increased incidence of unintentional injuries and violence, alcohol and drug abuse, unintended pregnancy and sexually transmitted diseases. Traditional neurobiological and cognitive explanations for such suboptimal choices and actions have failed to account for nonlinear changes in behavior observed during adolescence, relative to childhood and adulthood. This review provides a biologically plausible conceptualization of the mechanisms underlying these nonlinear changes in behavior, as an imbalance between a heightened sensitivity to motivational cues and immature cognitive control. Recent human imaging and animal studies provide a biological basis for this view, suggesting differential development of subcortical limbic systems relative to top-down control systems during adolescence relative to childhood and adulthood. This work emphasizes the importance of examining transitions into and out of adolescence and highlights emerging avenues of future research on adolescent brain development.

Introduction

Adolescence is characterized as a time when we act more impulsively, fail to consider long-term consequences, and engage in riskier behavior than we do as adults (Gardner & Steinberg, 2005; Scott, 1992; Steinberg, et al., 2008). This propensity to take risks is reflected in higher incidences of accidents, suicides, unsafe sexual practices, and criminal activity (Scott, 1992). Juveniles fifteen years of age and younger act more impulsively than do older adolescents, but even sixteen- and seventeen-year-old youth fail to exhibit adult levels of self-control (Feld, 2008).

In the past decade, a number of cognitive and neurobiological hypotheses have been postulated for why adolescents engage in impulsive and risky acts. Traditional accounts of adolescence suggest that it is a period of development associated with progressively greater efficiency of cognitive control capacities. This efficiency in cognitive control is described as dependent on maturation of the prefrontal cortex as evidenced by imaging (Galvan, et al., 2006; Gogtay, et al., 2004; Hare, et al., 2008; Sowell, et al., 2003) and postmortem studies (Bourgeois, Goldman-Rakic, & Rakic, 1994; Huttenlocher, 1979; Rakic, 1994) showing continued structural and functional development of this region well into young adulthood.

The general pattern of improved cognitive control with maturation of the prefrontal cortex (Crone & van der Molen, 2007) suggests a linear increase in development from childhood to adulthood. If an immature prefrontal cortex and weak cognitive control were alone the basis for suboptimal choice behavior, then children should look remarkably similar to, or presumably worse than, adolescents, given their less developed prefrontal cortex and cognitive abilities (Casey, Getz, & Galvan, 2008). Yet suboptimal choices and actions observed during adolescence represent an inflection in development (Windle, et al., 2008) that is unique from either childhood or adulthood, as evidenced by the National Center for Health Statistics data on adolescent behavior and mortality (Eaton, et al., 2008).

This review addresses the primary question of how the brain is changing during adolescence in ways that may explain inflections in risky behavior. We outline a testable neurobiological model that emphasizes the dynamic interplay between subcortical and cortical brain regions and speculate on the emergence of these systems from an evolutionary perspective. We provide evidence from behavioral and human brain imaging studies to support this model in the framework of actions in motivational contexts (Cauffman, et al., 2010; Figner, Mackinlay, Wilkening, & Weber, 2009; Galvan, Hare, Voss, Glover, & Casey, 2007; Galvan, et al., 2006) and address why some teenagers may be at greater risk than others for making suboptimal decisions leading to poorer long-term outcomes (Galvan, et al., 2007; Hare, et al., 2008).

Neurobiological Model of Adolescence

An accurate conceptualization of cognitive and neurobiological changes during adolescence must treat adolescence as a transitional developmental period (Spear, 2000), rather than a single snapshot in time. In other words, to understand this developmental period, characterizing the transitions into and out of adolescence is necessary for distinguishing its distinct attributes (Casey, Galvan, & Hare, 2005; Casey, Tottenham, Liston, & Durston, 2005). Establishing developmental trajectories for cognitive processes is essential in characterizing these transitions and constraining interpretations about changes in behavior during this period.

We have developed a testable neurobiological model of adolescent development within this framework that builds on rodent models (Brenhouse, Sonntag, & Andersen, 2008; Laviola, Adriani, Terranova, & Gerra, 1999; Spear, 2000) and recent imaging studies of adolescence (Ernst, et al., 2005; Galvan, et al., 2007; Galvan, et al., 2006; Hare, et al., 2008; Somerville, Hare, & Casey, in press; Van Leijenhorst, Moor, et al., 2010; Van Leijenhorst, Zanolie, et al., 2010). Figure 1 depicts this model. This characterization of adolescence goes beyond exclusively attributing risky behavior to the immaturity of the prefrontal cortex. Rather, the proposed neurobiological model illustrates how subcortical and cortical top-down control regions must be considered together. The cartoon illustrates different developmental trajectories for these systems, with subcortical systems such as the ventral striatum developing earlier than prefrontal control regions. According to this model, the individual is biased more by functionally mature subcortical regions during adolescence (i.e., an imbalance of subcortical relative to prefrontal cortical control), compared to children, for whom these systems (i.e., subcortical and prefrontal) are both still developing, and compared to adults, for whom these systems are fully mature.
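
To make the developmental logic of the model concrete, the toy simulation below sketches it in Python. It is an illustration only: the logistic form, midpoints, and steepness are our assumptions, not parameters fit to data. It shows how earlier subcortical maturation yields a subcortical-prefrontal gap that peaks in adolescence and closes by adulthood.

```python
# Illustrative sketch of the imbalance model (cf. Figure 1), not a fit to data.
# All parameters are hypothetical, chosen only to show how earlier subcortical
# maturation produces a peak imbalance during adolescence.
import numpy as np

def logistic_maturation(age, midpoint, steepness=0.5):
    """Functional maturity on a 0-1 scale as a logistic function of age."""
    return 1.0 / (1.0 + np.exp(-steepness * (age - midpoint)))

ages = np.arange(5, 30)
subcortical = logistic_maturation(ages, midpoint=13)  # matures earlier
prefrontal = logistic_maturation(ages, midpoint=20)   # protracted development

imbalance = subcortical - prefrontal                  # largest in adolescence
peak_age = ages[np.argmax(imbalance)]
print(f"Peak subcortical-prefrontal imbalance at age {peak_age}")
```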

This perspective provides a basis for nonlinear shifts in risky behavior across development, due to earlier maturation of subcortical systems relative to less mature top-down prefrontal control systems. With development and experience, the functional connectivity between these regions provides a mechanism for top-down control of this circuitry (Hare, et al., 2008). Further, the model reconciles the health statistics on risky behavior during adolescence with the astute observation by Reyna and Farley (2006) that adolescents are quite capable of rational decisions and understand the risks of the behaviors in which they engage. In emotionally salient situations, however, subcortical systems (the accelerator) will win out over control systems (the brakes), given their maturity relative to the prefrontal control system.

This model is consistent with models of adolescent development (Ernst, Pine, & Hardin, 2006; Ernst, Romeo, & Andersen, 2009; Geier & Luna, 2009; Nelson, Leibenluft, McClure, & Pine, 2005; Steinberg, 2008; Steinberg, et al., 2009) that suggest differential development of subcortical and cortical regions. For example, the triadic model proposed by Ernst and colleagues (Ernst, et al., 2006) describes motivated behavior as having three distinct neural circuits (approach, avoidance and regulatory). The approach system relates to reward behaviors and is largely controlled by the ventral striatum. The avoidance system relates to avoidance behaviors and is mostly controlled by the amygdala. Lastly, the regulatory system balances the approach and avoidance systems and is largely controlled by the prefrontal cortex. Accordingly, increased risk taking behavior during adolescence is due to greater influence of the approach system and a weaker influence of the regulatory system.

Our model differs from others in that it is based on empirical evidence for brain changes not only in the transition from adolescence to adulthood, but also the transition into adolescence from childhood. Further, we do not suggest that the striatum and amygdala are specific to approach and avoidant behavior given recent studies showing valence independence of these structures (Levita, et al., 2009), but rather that they are systems important in detecting motivationally and emotionally relevant cues in the environment that can bias behavior. This sensitivity to appetitive and emotive cues during adolescence has been described across species (see Spear, 2009) and is reviewed here.

Comparative and Evolutionary Perspectives on Adolescence

A question that emerges from the imbalance model of adolescent brain development is why the brain might be programmed to develop in this way. This question may be addressed by taking a step backward and considering the definition of adolescence as the transitional period between childhood and adulthood. Puberty marks the onset of adolescence with the beginnings of sexual maturation (Graber & Brooks-Gunn, 1998) and can be defined by biological markers. Adolescence can be described as a progressive transition into adulthood with a nebulous ontogenetic time course (Spear, 2000, p.419). A complete discussion of the effect of pubertal hormones on brain and behavior is beyond the scope of this paper; see (Forbes & Dahl, 2010; Romeo, 2003) for detailed reviews on the subject.

Evolutionarily speaking, adolescence is a period of gaining independence from the protection of the family, which simultaneously may put the individual in harm's way (Kelley, Schochet, & Landry, 2004). Independence-seeking behaviors are observed across mammalian species, with increases in peer-directed social interactions and intensification of novelty seeking, both of which impact adolescents' propensity for risky behavior (Brown, 2004; Chassin, et al., 2004; Collins & Laursen, 2004; Laviola, et al., 1999). This risky behavior may be defined as the product of a biologically driven imbalance between increased novelty and sensation seeking and immature "self-regulatory competence" (Steinberg, 2004). One may speculate that this developmental pattern is an evolutionary feature: an individual needs to engage in high-risk behavior to leave a safe and familiar niche in order to find a mate and procreate (Spear, 2000). Thus, risk taking appears to coincide with the time in which hormones drive adolescents to seek out sexual partners. In today's society, when adolescence may extend indefinitely, with children living with parents, remaining financially dependent, and choosing mates later in life, this behavior may be less adaptive. Our neurobiological model suggests this occurs through differential development of subcortical and cortical systems. Empirical behavioral and imaging data are reviewed in support of this view.

Adolescent Behavioral Development

A core component of behavioral development is the ability to suppress inappropriate actions in favor of goal-directed ones, especially in the presence of compelling incentives. This ability is typically referred to as cognitive control (Casey, Galvan, et al., 2005; Casey, Giedd, & Thomas, 2000; Casey, Thomas, et al., 2000). We review the classic cognitive developmental literature in the context of changes in cortically driven cognitive processes with age and provide behavioral and neuroanatomical evidence for its distinction from risky behaviors.

A number of classic developmental studies have shown that cognitive control develops throughout childhood and adolescence (Case, 1972; Flavell, Beach, & Chinksy, 1966; Keating & Bobbitt, 1978; Pascual-Leone, 1970). Several theorists have argued that this development is due to increases in processing speed and efficiency (e.g., Bjorklund, 1985, 1987; Case, 1972), but others have suggested that "inhibitory" processes are the key factor (Harnishfeger & Bjorklund, 1993). According to this account, suboptimal choices in childhood are due to greater susceptibility to interference from competing sources that must be suppressed (e.g., Brainerd & Reyna, 1993; Casey, Thomas, Davidson, Kunz, & Franzen, 2002; Dempster, 1993; Diamond, 1985; Munakata & Yerys, 2001). Thus, optimal decision-making requires the control of impulses (Mischel, Shoda, & Rodriguez, 1989), and this ability matures in a linear fashion across childhood and adolescence (Eigsti, et al., 2006).

In contrast, risk-taking and reward-seeking behaviors seem to peak during adolescence and then decline in adulthood (Eaton, et al., 2008; Windle, et al., 2008) and are associated with pubertal maturation (Dahl, 2004; Martin, et al., 2001). A recent study by Steinberg et al. (2008) delineated the construct of impulse/cognitive control from sensation seeking, defined as the desire to seek out novel experiences and to take risks in order to achieve them. They tested individuals between the ages of 10 and 30 and showed that differences in sensation seeking with age followed a curvilinear pattern, with sensation seeking increasing between ages 10 and 15 and declining or remaining stable thereafter. In contrast, age differences in impulsivity followed a linear pattern, with impulsivity decreasing with age. These findings suggest distinct developmental trajectories for the two constructs. Specifically, impulsivity diminishes with age across childhood and adolescence (Casey, Galvan, et al., 2005; Casey, Thomas, et al., 2002; Galvan, et al., 2007), although there are differences in the degree to which a given individual is impulsive, regardless of age (Eigsti, et al., 2006). In contrast to impulse/cognitive control, sensation seeking/risk taking appears to show a curvilinear pattern, with an increase during adolescence relative to childhood and adulthood (Cauffman, et al., 2010; Figner, et al., 2009; Galvan, et al., 2007). As will be reviewed in the following sections, these findings suggest a distinct neural system for the construct of risky behavior, separate from the neural system for impulse control, with earlier development of risk taking behavior relative to the protracted development of impulse control (Galvan, et al., 2007; Steinberg, et al., 2008).
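
The linear-versus-curvilinear contrast described above can be illustrated with a short analysis sketch. The Python snippet below uses simulated scores (the coefficients and noise levels are hypothetical, not Steinberg et al.'s data) to show how linear and quadratic fits of each construct against age might be compared.

```python
# Sketch of contrasting a linear fit (impulsivity) with a quadratic,
# curvilinear fit (sensation seeking). Data are simulated for illustration.
import numpy as np

rng = np.random.default_rng(0)
age = rng.uniform(10, 30, size=200)

# Simulated scores: impulsivity declines linearly with age;
# sensation seeking rises and then falls, peaking in mid-adolescence.
impulsivity = 5.0 - 0.1 * age + rng.normal(0, 0.5, size=age.size)
sensation = 3.0 + 0.9 * age - 0.03 * age**2 + rng.normal(0, 0.5, size=age.size)

for name, score in [("impulsivity", impulsivity), ("sensation seeking", sensation)]:
    linear = np.polyfit(age, score, 1)
    quadratic = np.polyfit(age, score, 2)
    mse_lin = np.mean((np.polyval(linear, age) - score) ** 2)
    mse_quad = np.mean((np.polyval(quadratic, age) - score) ** 2)
    # A markedly better quadratic fit indicates a curvilinear age pattern.
    print(f"{name}: linear MSE={mse_lin:.3f}, quadratic MSE={mse_quad:.3f}")
```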

Adolescent Brain Development

Recent investigations of adolescent brain development have been based on advances in neuroimaging methodologies that can be easily used with developing human populations. These methods are based on magnetic resonance imaging (MRI) and include: structural MRI, which is used to measure the size and shape of structures; functional MRI (fMRI), which is used to measure patterns of brain activity; and diffusion tensor imaging (DTI), which is used to index the connectivity of white matter fiber tracts. Evidence for our developmental model of competition between cortical and subcortical regions is supported by immature structural and functional connectivity as measured by DTI and fMRI, respectively.
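
For readers unfamiliar with these data types, the snippet below sketches how an MRI volume is typically read in Python with the widely used NiBabel library; the filename is a placeholder, not data from the studies reviewed here.

```python
# Minimal example of loading an MRI volume with NiBabel (placeholder filename).
import nibabel as nib

img = nib.load("subject01_T1w.nii.gz")  # structural (T1-weighted) image
data = img.get_fdata()                  # 3D array of voxel intensities

print("volume shape:", data.shape)                 # number of voxels per axis
print("voxel size (mm):", img.header.get_zooms())  # spatial resolution
```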

MRI Studies of Human Brain Development

Several studies have used structural MRI to map the anatomical course of normal brain development (see review, Casey, Tottenham, et al., 2005). Although total brain size is approximately 90% of its adult size by age six, the gray and white matter subcomponents of the brain continue to undergo dynamic changes throughout adolescence. Data from recent longitudinal MRI studies indicate that gray matter volume follows an inverted U-shaped pattern, with greater regional variation than white matter (Giedd, 2004; Gogtay, et al., 2004; Sowell, et al., 2003; Sowell, Thompson, & Toga, 2004). In general, regions subserving primary functions, such as motor and sensory systems, mature earliest; higher-order association areas, which integrate these primary functions, mature later (Gogtay, et al., 2004; Sowell, et al., 2004). For example, studies using MRI-based measures show that cortical gray matter loss occurs earliest in the primary sensorimotor areas and latest in the dorsolateral prefrontal and lateral temporal cortices (Gogtay, et al., 2004). This pattern is consistent with nonhuman primate and human postmortem studies showing that the prefrontal cortex is one of the last brain regions to mature (Bourgeois, et al., 1994; Huttenlocher, 1979), while subcortical and sensorimotor regions develop sooner. In contrast to gray matter, white matter volume increases in a roughly linear pattern, increasing throughout development well into adulthood (Gogtay, et al., 2004). These changes presumably reflect ongoing myelination of axons by oligodendrocytes, enhancing neuronal conduction and communication across relevant connections.

Although less attention has been given to subcortical regions when examining structural changes, some of the largest developmental changes in the brain are seen in portions of the basal ganglia such as the striatum (Sowell, Thompson, Holmes, Jernigan, & Toga, 1999), especially in males (Giedd, et al., 1996). These developmental changes in structural volume within basal ganglia and prefrontal regions suggest that cortical connections are becoming more refined, consistent with neural developmental processes (e.g., dendritic arborization, cell death, synaptic pruning, myelination) occurring during childhood and adolescence (Huttenlocher, 1979). These processes allow for fine-tuning and strengthening of connections between prefrontal and subcortical regions with learning, which may coincide with greater cognitive control (e.g., signaling of prefrontal control regions to adjust behavior) (Casey, Amso, & Davidson, 2006; Casey & Durston, 2006).

It is unclear exactly how structural changes relate to behavioral changes. A few studies have shown indirect associations between MRI-based volumetric change and cognitive function using neuropsychological measures (e.g., Casey, Castellanos, et al., 1997; Sowell, et al., 2003). Specifically, associations have been reported between MRI-based prefrontal cortical and basal ganglia regional volumes and measures of cognitive control (i.e., the ability to override an inappropriate choice or action in favor of another; Casey, Castellanos, et al., 1997; Casey, Trainor, et al., 1997). These findings suggest that cognitive changes are reflected in structural changes in the brain and underscore the importance of subcortical (striatum) as well as cortical (e.g., prefrontal cortex) development.

DTI Studies of Human Brain Development

The MRI-based morphometry studies reviewed above suggest that cortical connections are fine-tuned with development and experience through the elimination of an overabundance of synapses and the strengthening of relevant connections. Recent advances in MRI technology, such as DTI, provide a tool for examining the developmental modulation of specific white matter tracts and their relation to behavior. In one study, development of cognitive control was positively correlated with the integrity of prefrontal-parietal fiber tracts (Nagy, Westerberg, & Klingberg, 2004), consistent with functional neuroimaging studies showing differential recruitment of these regions in children relative to adults (Klingberg, Forssberg, & Westerberg, 2002). Using a similar approach, Liston and colleagues (2006) examined the strength of white matter tracts in frontostriatal circuits, which continue to develop across childhood into adulthood. The frontostriatal fiber tracts were defined by connecting two regions of interest in the striatum and ventral prefrontal cortex identified in an fMRI study using the same task (Durston, Thomas, Worden, Yang, & Casey, 2002; Epstein, et al., 2007). Across these developmental DTI studies, fiber tract measures across the entire brain were correlated with age. However, there was specificity in which particular fiber tracts were associated with cognitive control (Casey, et al., 2007; Liston, et al., 2006) or cognitive ability (Nagy, et al., 2004). Specifically, frontostriatal connection strength positively predicted impulse control capacity, as measured by performance on a go/nogo task (Casey, et al., 2007; Liston, et al., 2006). These findings underscore the importance of examining not only regional structural changes, but also circuitry-related changes, when making claims about age-dependent maturation of the neural substrates of cognitive development.
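
As an illustration of this kind of brain-behavior analysis, the sketch below correlates a per-subject index of fiber tract strength (for example, mean fractional anisotropy along a frontostriatal tract) with go/nogo performance. The values and effect size are simulated placeholders, not the reviewed studies' data.

```python
# Sketch of a tract-strength vs. impulse-control correlation with simulated data.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(1)
n_subjects = 30
tract_fa = rng.normal(0.45, 0.05, n_subjects)  # mean FA along the tract, per subject

# Simulate go/nogo accuracy improving with connection strength, plus noise.
gonogo_accuracy = 0.6 + 0.5 * (tract_fa - 0.45) + rng.normal(0, 0.02, n_subjects)

r, p = pearsonr(tract_fa, gonogo_accuracy)
print(f"FA vs. go/nogo accuracy: r={r:.2f}, p={p:.3f}")
```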

Functional MRI Studies of Behavioral and Brain Development

Although structural changes as measured by MRI and DTI have been associated with behavioral changes during development, a more direct approach for examining structure-function associations is to measure changes in the brain and behavior simultaneously, as with fMRI. The ability to measure functional changes in the developing brain with MRI has significant potential for the field of developmental science. In the context of the current article, fMRI provides a means for constraining interpretations of adolescent decision-making. As stated previously, the development of the prefrontal cortex is believed to play an important role in the maturation of higher cognitive abilities such as decision-making and goal-oriented choice behavior (Casey, Tottenham, & Fossella, 2002; Casey, Trainor, et al., 1997). Many paradigms have been used, together with fMRI, to assess the neurobiological basis of these abilities. These paradigms include the go/nogo task (participants must respond to one stimulus but suppress responses to a second stimulus), the flanker task (participants choose the directionality of a target surrounded by symbols that are either compatible or incompatible with the target), the stop-signal task (participants respond as fast as possible to a stimulus but must suppress this response when they receive a stop signal such as an auditory tone), and antisaccade tasks (participants must inhibit reflexive eye movements toward a target and instead gaze in the opposite direction) (Bunge, Dudukovic, Thomason, Vaidya, & Gabrieli, 2002; Casey, Giedd, et al., 2000; Casey, Trainor, et al., 1997; Durston, et al., 2003; Luna, et al., 2001). Collectively, these studies show that children recruit distinct but often larger, more diffuse prefrontal regions when performing these tasks than do adults. The pattern of activity within brain regions central to task performance (i.e., activity that correlates with cognitive performance) becomes more focal or fine-tuned with age, while activity in regions not correlated with task performance diminishes with age. This pattern has been observed across both cross-sectional (Brown, et al., 2005) and longitudinal studies (Durston, et al., 2006) and across a variety of paradigms.
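
As a concrete example of how performance on such paradigms is quantified, the sketch below scores a hypothetical go/nogo run, computing the hit rate on go trials and the false-alarm rate on nogo trials, the standard behavioral index of failed response suppression. The trial counts are illustrative.

```python
# Scoring a go/nogo run: hit rate on "go" trials, false alarms on "nogo" trials.
def score_gonogo(trials):
    """trials: list of (trial_type, responded) pairs, trial_type 'go' or 'nogo'."""
    go = [responded for kind, responded in trials if kind == "go"]
    nogo = [responded for kind, responded in trials if kind == "nogo"]
    hit_rate = sum(go) / len(go)              # responses correctly made
    false_alarm_rate = sum(nogo) / len(nogo)  # responses that should have been withheld
    return hit_rate, false_alarm_rate

example_run = ([("go", True)] * 45 + [("go", False)] * 5 +
               [("nogo", False)] * 7 + [("nogo", True)] * 3)
hits, false_alarms = score_gonogo(example_run)
print(f"hit rate={hits:.2f}, false alarm rate={false_alarms:.2f}")
```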

Although neuroimaging studies cannot definitively characterize the mechanism of such developmental changes (e.g., dendritic arborization, synaptic pruning), the findings reflect development within, and refinement of, projections to and from activated brain regions with maturation. Further, the findings suggest that these neuroanatomical changes occur over a protracted period of time (Brown, et al., 2005; Bunge, et al., 2002; Casey, Thomas, et al., 2002; Casey, Trainor, et al., 1997; Crone, Donohue, Honomichl, Wendelken, & Bunge, 2006; Luna, et al., 2001; Moses, et al., 2002; Schlaggar, et al., 2002; Tamm, Menon, & Reiss, 2002; Thomas, et al., 2004; Turkeltaub, Gareau, Flowers, Zeffiro, & Eden, 2003).

How can this methodology inform us about whether adolescent decisions are impulsive or risky? Impulse control, as measured by tasks such as the go/nogo task, shows a linear pattern of development across childhood and adolescence, as described above. However, recent neuroimaging studies have begun to examine reward-related processing relevant to risk-taking in adolescents (Bjork, et al., 2004; Ernst, et al., 2005; Galvan, et al., 2005; May, et al., 2004; Van Leijenhorst, Moor, et al., 2010). These studies have focused primarily on the ventral striatum, a region implicated in learning and predicting reward outcomes.

Sensitivity to Appetitive Cues in Adolescence

Our neurobiological model suggests that the combination of heightened responsiveness to motivational cues and immaturity in behavioral control may bias adolescents to seek immediate, rather than long-term gains. Tracking subcortical (e.g., ventral striatum) and cortical (e.g., prefrontal) development across childhood through adulthood provides constraints on whether changes reported in adolescence are specific to this period of development, or reflect maturation that is steadily occurring in a somewhat linear pattern from childhood to adulthood.

Several groups have shown that adolescents show heightened activation of the ventral striatum in anticipation and/or receipt of rewards compared to adults (Ernst, et al., 2005; Galvan, et al., 2006; Geier, Terwilliger, Teslovich, Velanova, & Luna, 2009; Van Leijenhorst, Zanolie, et al., 2010), coupled with less activation in the prefrontal cortex relative to adults. In one of the first studies to examine this response across the full range from childhood to adulthood, Galvan and her colleagues examined behavioral and neural responses to reward manipulations in 6 to 29 year olds. They focused on brain circuitry implicated in reward-related learning and behavior in animal studies (Hikosaka & Watanabe, 2000; Pecina, Cagniard, Berridge, Aldridge, & Zhuang, 2003; Schultz, 2006), adult human imaging studies (e.g., Knutson, Adams, Fong, & Hommer, 2001; O'Doherty, Kringelbach, Rolls, Hornak, & Andrews, 2001; Zald, et al., 2004), and studies of addiction (Hyman & Malenka, 2001; Volkow & Li, 2004). Based on rodent models (Laviola, et al., 1999; Spear, 2000) and previous imaging work (Ernst, et al., 2005), they hypothesized that relative to children and adults, adolescents would show exaggerated activation of the ventral striatum in concert with less mature recruitment of top-down prefrontal control regions. Their results supported this hypothesis, showing that the spatial extent of ventral striatal activity to reward in adolescents was similar to that observed in adults, whereas the extent of activity in prefrontal regions was more similar to that of children. This differential engagement of the two regions was accompanied by an elevated magnitude of ventral striatal activity in adolescents relative to children and adults, assumed to result from the imbalance in corticosubcortical development (see Figure 2). Recent work showing delayed functional connectivity between prefrontal and subcortical regions in adolescents relative to adults provides a mechanism for the lack of top-down control of regions involved in processing motivational cues (Hare, et al., 2008).

These findings are consistent in part with rodent models (Laviola, Macri, Morley-Fletcher, & Adriani, 2003) and previous imaging studies (Ernst, et al., 2005; Van Leijenhorst, Moor, et al., 2010) showing enhanced ventral striatal activity to rewards and anticipation of rewards during adolescence. Relative to children and adults, adolescents showed an exaggerated ventral striatal response to reward, whereas both children and adolescents showed a less mature response in prefrontal control regions than adults. These findings suggest that different developmental trajectories for these regions may underlie the enhancement in ventral striatal activity relative to children or adults, which may in turn relate to the increased risky decisions observed during this period of development (Figner, et al., 2009). It is relevant to note that while several laboratories (Ernst, et al., 2005; Galvan, et al., 2006; Geier, et al., 2009; Somerville, et al., in press; Van Leijenhorst, Moor, et al., 2010) have shown this heightened response in the ventral striatum in adolescents, one laboratory has failed to observe this response (Bjork, et al., 2004; Bjork, Smith, Chen, & Hommer, 2010). Future studies will be needed to clarify the specific conditions under which this pattern of brain activity is or is not observed.
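
The distinction drawn above between the spatial extent and the magnitude of activity can be made concrete with a simple region-of-interest (ROI) summary. In the sketch below, the statistical map, mask coordinates, and threshold are simulated placeholders rather than values from the studies reviewed.

```python
# Extent (suprathreshold voxel count) vs. magnitude (mean signal) within an ROI.
import numpy as np

rng = np.random.default_rng(2)
stat_map = rng.normal(0, 1, size=(40, 48, 36))  # e.g., a reward > baseline t-map
roi_mask = np.zeros(stat_map.shape, dtype=bool)
roi_mask[18:22, 20:26, 14:18] = True            # placeholder ventral striatum ROI

threshold = 2.3                                 # illustrative voxelwise threshold
roi_values = stat_map[roi_mask]
extent = int(np.sum(roi_values > threshold))    # spatial extent of activation
magnitude = float(roi_values.mean())            # mean activation in the ROI

print(f"extent={extent} voxels, magnitude={magnitude:.3f}")
```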

Differential recruitment of prefrontal and subcortical regions has been reported across a number of developmental fMRI studies (Casey, Thomas, et al., 2002; Geier, et al., 2009; Luna, et al., 2001; Monk, et al., 2003; Thomas, et al., 2004; Van Leijenhorst, Zanolie, et al., 2010). Typically these findings have been interpreted in terms of immature prefrontal regions rather than an imbalance between prefrontal and subcortical regional development. Given the evidence that prefrontal regions guide appropriate actions in different contexts (Miller & Cohen, 2001), immature prefrontal activity might hinder appropriate estimation of future outcomes and appraisal of risky choices, and might thus be less influential on reward valuation than the ventral striatum. This pattern is consistent with previous research showing elevated subcortical, relative to cortical, activity when decisions are biased by immediate over long-term gains (McClure, Laibson, Loewenstein, & Cohen, 2004). During adolescence, relative to childhood or adulthood, immature prefrontal engagement may not provide sufficient top-down control of robustly activated reward-processing regions (e.g., ventral striatum), resulting in less influence of prefrontal systems relative to the ventral striatum in reward valuation.

While differential recruitment of cortical and subcortical regions has been robustly reported across development, only a few studies have addressed how cognitive control and reward systems interact. A recent study by Geier et al. (2009) examined this interaction using a version of an antisaccade task during fMRI in adolescents and adults. Their findings showed that on trials for which money was at stake, performance was enhanced, with the greatest enhancement (faster and more accurate responses) observed in adolescents. This performance was paralleled by exaggerated activation in the ventral striatum in adolescents following a cue that the next trial would be rewarded, both while preparing for and subsequently executing the antisaccade. Adolescents also showed elevated activity in prefrontal regions important for controlling eye movements. These findings suggest a reward-related upregulation of these control regions.

The Geier study provides an example of how appetitive cues can facilitate cognitive performance in adolescents, but high-risk behavior by adolescents in everyday life suggests that appetitive cues may also impair cognitive decisions. To test this hypothesis, Somerville and colleagues (Somerville, et al., in press) tested children, adolescents, and adults performing a go/nogo task in which they had to suppress a response to an appetitive social cue. Adolescents had greater difficulty resisting appetitive social cues than children and adults, as evidenced by more false alarms to these cues than to neutral ones. This behavioral performance was paralleled by enhanced activity in the ventral striatum. In contrast, activation in the prefrontal cortex was associated with overall accuracy and showed a linear decrease in activity with improvement in performance and age. A functional connectivity analysis identified the dorsal striatum as a key convergence point for cortical and subcortical signals. Collectively, these studies suggest that differences between adolescent and adult behavior depend on the context of the behavior. In appetitively charged situations, subcortical systems involved in the detection of appetitive cues (the accelerator) will win out over cortical control systems (the brakes), given differential regional development. However, in situations in which appetitive or emotive cues are absent, cortical control systems are not compromised, leading to more optimal performance in adolescents.
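
Behaviorally, the key contrast in such a design reduces to comparing false-alarm rates across cue conditions. The sketch below computes condition-wise false-alarm rates from illustrative trial records (not the study's data); a higher rate for appetitive than neutral cues would mirror the pattern described above.

```python
# Condition-wise false-alarm rates for appetitive vs. neutral nogo cues.
def false_alarm_rate(trials, cue):
    nogo = [resp for c, kind, resp in trials if c == cue and kind == "nogo"]
    return sum(nogo) / len(nogo)

trials = ([("appetitive", "nogo", True)] * 4 + [("appetitive", "nogo", False)] * 6 +
          [("neutral", "nogo", True)] * 2 + [("neutral", "nogo", False)] * 8)

print("appetitive false alarms:", false_alarm_rate(trials, "appetitive"))  # 0.4
print("neutral false alarms:", false_alarm_rate(trials, "neutral"))        # 0.2
```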

Adolescence and Individual Differences

Individuals vary in their ability to control impulses and in risk taking, a phenomenon that has been recognized in psychology for some time (Benthin, Slovic, & Severson, 1993). Some adolescents will therefore be more likely than others to engage in risky behaviors and more prone to poor outcomes. Examining individual variability may thus help to identify bio-behavioral markers of individuals at greater risk for poor outcomes during adolescence.

A classic example of individual differences in these abilities reported in the social, cognitive, and developmental psychology literature is that of delay of gratification (Mischel, et al., 1989). Delay of gratification is typically assessed in 3- to 4-year-old children. The child is asked whether she would prefer a small reward (one marshmallow) now or a large reward (two marshmallows) later. The child is then told that the experimenter will leave the room in order to prepare for upcoming activities, and that if she remains in her seat and does not eat a marshmallow during that time, she will receive the large reward of both marshmallows. If the child does not or cannot wait, she should ring a bell to summon the experimenter and thereby receive the smaller reward. Once it is clear the child understands the task, she is seated at the table with the two rewards and the bell. Distractions in the room are minimized, with no toys, books, or pictures. The experimenter returns after 15 minutes or after the child has rung the bell, eaten the rewards, or shown any signs of distress. Using this paradigm, Mischel showed that children typically behave in one of two ways on this task: 1) they ring the bell almost immediately in order to have the marshmallow, which means they only get one; or 2) they wait and optimize their gains, receiving both marshmallows. This observation suggests that some individuals are better than others at controlling impulses in the face of highly salient incentives, and that this bias can be detected in early childhood (Mischel, et al., 1989) and appears to remain throughout adolescence and young adulthood (Eigsti, et al., 2006).

What might explain individual differences in optimal choice behavior? Some theorists have postulated that dopaminergic mesolimbic circuitry, implicated in reward processing, underlies risky behavior (Blum, et al., 2000). Developmental studies provide neurochemical evidence that the balance between cortical and subcortical dopamine systems begins to shift toward greater cortical dopamine levels during adolescence (Brenhouse, et al., 2008; Spear, 2000). Similarly, there is a delayed time course of dopaminergic innervation of the nonhuman primate prefrontal cortex through adolescence into adulthood, suggesting that functional maturity is not reached until adulthood (Rosenberg & Lewis, 1995). Individual differences in this circuitry, such as allelic variants in dopamine-related genes resulting in too little or too much dopamine in subcortical regions, might relate to the propensity of some individuals to engage in risky behavior more than others (O'Doherty, 2004).

The ventral striatum has been shown to increase in activity immediately prior to making risky choices on monetary-risk paradigms (Kuhnen & Knutson, 2005; Matthews, Simmons, Lane, & Paulus, 2004; Montague & Berns, 2002) and as described previously, adolescents show exaggerated striatal activity to rewarding outcomes relative to children or adults (Ernst, et al., 2005; Galvan, et al., 2006). Collectively, these data suggest that adolescents may be more prone to risky choices as a group (Figner, et al., 2009; Gardner & Steinberg, 2005), but some adolescents will be more prone than others to engage in risky behaviors, putting them at potentially greater risk for negative outcomes.

To explore individual differences in risk-taking behavior, Galvan and colleagues (2007) examined the association between activity in reward-related neural circuitry in response to a large monetary reward and personality-trait measures of risk taking and impulsivity in adolescence. Functional magnetic resonance imaging data and anonymous self-report ratings of risky behavior, risk perception, and impulsivity were acquired from individuals between the ages of 7 and 29 years. There was a positive association between ventral striatal activity and the likelihood of engaging in risky behavior across development. This activity varied as a function of individuals' ratings of anticipated positive or negative consequences of such behavior. Individuals who perceived risky behaviors as leading to dire consequences activated the ventral striatum less to reward. This negative association was driven by the child participants, whereas a positive association was seen in the adults who rated the consequences of such behavior as positive.
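
Analyses of this kind typically regress brain activity on trait ratings while allowing the association to vary with age, i.e., an age-by-rating interaction. The sketch below simulates such a model with ordinary least squares; all data and coefficients are placeholders, not the study's actual analysis.

```python
# Age-moderated brain-behavior association via an interaction regression.
import numpy as np

rng = np.random.default_rng(3)
n = 60
age = rng.uniform(7, 29, n)
risk_rating = rng.normal(0, 1, n)        # self-reported anticipated consequences
vs_activity = 0.2 * risk_rating + 0.01 * age * risk_rating + rng.normal(0, 0.5, n)

# Design matrix: intercept, age, rating, and the age-by-rating interaction.
X = np.column_stack([np.ones(n), age, risk_rating, age * risk_rating])
beta, *_ = np.linalg.lstsq(X, vs_activity, rcond=None)
print("age-by-rating interaction coefficient:", round(beta[3], 3))
```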

In addition to linking risk taking to reward circuitry, Galvan and colleagues showed no association between activity in this circuitry and ratings of impulsivity (Galvan, et al., 2007). Instead, impulsivity was negatively correlated with age. This finding is consistent with a recent report by Steinberg (2008) showing differential development of sensation seeking and impulsivity, with sensation seeking increasing during adolescence relative to childhood and adulthood, while impulsivity followed a linear pattern of decreasing with age. These findings suggest that during adolescence, some individuals may be more prone to engage in risky behaviors due to developmental changes in concert with variability in a given individual's predisposition to engage in risky behavior, rather than due to simple changes in impulsivity. Further, these individual and developmental differences may help explain the vulnerability of some individuals to the risk-taking associated with substance use and, ultimately, addiction.

Conclusion

Human imaging studies show structural and functional changes in corticosubcortical circuitry (for review, (Casey, Tottenham, et al., 2005; Giedd, et al., 1999; Giedd, et al., 1996; Jernigan, et al., 1991; Sowell, et al., 1999)) that parallel increases in cognitive control and self-regulation (Casey, Trainor, et al., 1997; Luna & Sweeney, 2004; Luna, et al., 2001; Rubia, et al., 2000; Steinberg, 2004; Steinberg, et al., 2008). These changes show a shift in activation of prefrontal regions from diffuse to more focal recruitment over time (Brown, et al., 2005; Bunge, et al., 2002; Casey, Trainor, et al., 1997; Durston & Casey, 2006; Moses, et al., 2002) and elevated recruitment of subcortical regions during adolescence (Casey, Thomas, et al., 2002; Durston & Casey, 2006; Luna, et al., 2001). Although neuroimaging studies cannot definitively characterize the mechanism of such developmental changes, these changes in volume and structure may reflect development within, and refinement of, projections to and from these brain regions during maturation suggestive of fine-tuning of the system with development (Hare, et al., 2008; Liston, et al., 2006).

Taken together, the findings synthesized here indicate that increased risk taking behavior in adolescence is associated with different developmental trajectories of subcortical motivational and cortical control regions. However, this is not to say that adolescents are incapable of making rational decisions. Rather, in emotionally charged situations, the more mature limbic system may win over the prefrontal control system in guiding actions.

Although adolescence has been distinguished as a period characterized by reward-seeking and risk-taking behaviors (Gardner & Steinberg, 2005; Spear, 2000), individual differences in neural responses to reward predispose some adolescents to take more risks than others, putting them at greater risk for poor outcomes such as addiction, substance abuse, and mortality. By synthesizing the various findings related to impulsivity and risk taking in adolescence, this work provides crucial groundwork for understanding individual differences and developmental markers of the propensity for suboptimal choices that lead to negative consequences.


The Enigmatic Inflection: Examining the Neurobiological Underpinnings of Risk-Taking in Adolescence

Introduction

Adolescence presents a distinctive developmental period marked by increased impulsivity, diminished consideration of long-term consequences, and a heightened proclivity for risk-taking behaviors compared to adulthood (Gardner & Steinberg, 2005; Scott, 1992; Steinberg, et al., 2008). This predisposition towards risk-taking manifests in elevated rates of accidents, suicides, unsafe sexual practices, and criminal activity (Scott, 1992). While younger adolescents (aged fifteen and below) exhibit greater impulsivity compared to their older counterparts, even youth aged sixteen and seventeen fall short of demonstrating adult-like self-control (Feld, 2008).

Over the past decade, a multitude of cognitive and neurobiological hypotheses have been proposed to elucidate the reasons behind adolescents' engagement in impulsive and risky actions. Traditional perspectives on adolescence posit that it constitutes a developmental stage characterized by the progressive enhancement of cognitive control capacities. This enhanced efficiency in cognitive control is attributed to the ongoing maturation of the prefrontal cortex, a notion corroborated by imaging studies (Galvan, et al., 2006; Gogtay, et al., 2004; Hare, et al., 2008; Sowell, et al., 2003) and post-mortem analyses (Bourgeois, Goldman-Rakic, & Rakic, 1994; Huttenlocher, 1979; Rakic, 1994), which collectively reveal continued structural and functional development within this brain region extending well into young adulthood.

The prevailing trend of improved cognitive control coinciding with prefrontal cortex maturation (Crone & van der Molen, 2007) implies a linear developmental trajectory from childhood to adulthood. If cognitive control deficits stemming from an immature prefrontal cortex were the sole determinant of suboptimal decision-making, then children, with their comparatively less developed prefrontal cortices and cognitive abilities, should exhibit comparable or even more pronounced deficits than adolescents (Casey, Getz, & Galvan, 2008). However, the suboptimal choices and actions observed during adolescence represent a developmental inflection point (Windle, et al., 2008) that distinguishes this period from both childhood and adulthood. This distinction is further underscored by data from the National Center for Health Statistics regarding adolescent behavior and mortality rates (Eaton, et al., 2008).

This review endeavors to address the fundamental question of how the adolescent brain undergoes changes that might account for this inflection in risky behavior. We present a testable neurobiological model that posits a dynamic interplay between subcortical and cortical brain regions and speculate on the evolutionary underpinnings of these systems. Drawing upon evidence from behavioral and human brain imaging studies, we examine this model within the context of motivated actions (Cauffman, et al., 2010; Figner, Mackinlay, Wilkening, & Weber, 2009; Galvan, Hare, Voss, Glover, & Casey, 2007; Galvan, et al., 2006) and explore factors that might render certain teenagers more susceptible than others to making suboptimal decisions with detrimental long-term outcomes (Galvan, et al., 2007; Hare, et al., 2008).

A Neurobiological Model of Adolescence

An accurate conceptualization of the cognitive and neurobiological transformations occurring during adolescence necessitates viewing this period as a transitional phase (Spear, 2000) rather than a static snapshot. In other words, understanding this developmental period requires characterizing both the entry into and exit from adolescence to delineate its unique attributes (Casey, Galvan, & Hare, 2005; Casey, Tottenham, Liston, & Durston, 2005). Establishing developmental trajectories for cognitive processes is essential for characterizing these transitions and refining our interpretations of behavioral changes during this time.

We propose a testable neurobiological model of adolescent development, grounded in rodent models (Brenhouse, Sonntag, & Andersen, 2008; Laviola, Adriani, Terranova, & Gerra, 1999; Spear, 2000) and recent human imaging studies (Ernst, et al., 2005; Galvan, et al., 2007; Galvan, et al., 2006; Hare, et al., 2008; Somerville, Hare, & Casey, in press; Van Leijenhorst, Moor, et al., 2010; Van Leijenhorst, Zanolie, et al., 2010). Figure 1 provides a visual representation of this model. Our characterization of adolescence transcends the simplistic notion of attributing risky behavior solely to prefrontal cortex immaturity. Instead, the proposed model highlights the necessity of considering both subcortical and cortical top-down control regions in concert. The figure illustrates distinct developmental trajectories for these systems, with subcortical structures like the ventral striatum maturing earlier than prefrontal control regions. According to this model, adolescents are biased more by their functionally mature subcortical regions compared to children, for whom both subcortical and prefrontal systems are still developing, and compared to adults, for whom both systems have reached full maturity.

This perspective provides a framework for understanding the nonlinear shifts in risky behavior observed across development. The earlier maturation of subcortical systems relative to prefrontal control systems creates an imbalance during adolescence. As individuals mature and gain experience, functional connectivity between these regions strengthens, enabling top-down control over this circuitry (Hare, et al., 2008). This model reconciles the apparent contradiction between the high prevalence of risky behavior during adolescence and the observation that adolescents are capable of rational decision-making and understanding the risks associated with their actions (Reyna & Farley, 2006). In emotionally charged situations, however, the more developed subcortical systems ("accelerator") can override the less mature prefrontal control systems ("brakes").

This model aligns with existing models of adolescent development (Ernst, Pine, & Hardin, 2006; Ernst, Romeo, & Andersen, 2009; Geier & Luna, 2009; Nelson, Leibenluft, McClure, & Pine, 2005; Steinberg, 2008; Steinberg, et al., 2009) that emphasize the differential developmental trajectories of subcortical and cortical regions. For instance, the triadic model proposed by Ernst and colleagues (Ernst, et al., 2006) describes motivated behavior as governed by three distinct neural circuits: approach, avoidance, and regulatory. The approach system, primarily controlled by the ventral striatum, is associated with reward-seeking behaviors. The avoidance system, largely regulated by the amygdala, is linked to avoidance behaviors. Lastly, the regulatory system, primarily under the purview of the prefrontal cortex, modulates the balance between the approach and avoidance systems. Consequently, heightened risk-taking during adolescence arises from a stronger influence of the approach system coupled with a weaker influence of the regulatory system.

Our model distinguishes itself by grounding its framework in empirical evidence for brain changes not only during the transition from adolescence to adulthood but also during the transition from childhood to adolescence. Moreover, we do not ascribe specific functionalities to the striatum and amygdala, recognizing that recent studies indicate their involvement in processing both positive and negative valence (Levita, et al., 2009). Instead, we posit that these structures play a crucial role in detecting motivationally and emotionally salient environmental cues that can influence behavior. This heightened sensitivity to appetitive and emotive cues during adolescence has been documented across species (Spear, 2009) and is further explored below.

Comparative and Evolutionary Perspectives on Adolescence

The imbalance model of adolescent brain development raises the question of why the brain might be wired to develop in this manner. Addressing this question requires considering the broader evolutionary context of adolescence as a transitional period between childhood and adulthood, demarcated by the onset of puberty and the beginnings of sexual maturation (Graber & Brooks-Gunn, 1998). This transition represents a gradual and somewhat undefined progression into adulthood (Spear, 2000, p.419). A comprehensive discussion of the impact of pubertal hormones on the brain and behavior falls beyond the scope of this review; readers are referred to (Forbes & Dahl, 2010; Romeo, 2003) for detailed reviews on this topic.

From an evolutionary standpoint, adolescence serves as a period for individuals to gain independence from the protection of their families, which inevitably exposes them to potential harm (Kelley, Schochet, & Landry, 2004). These independence-seeking behaviors are observed across mammalian species and manifest as increased peer-directed social interactions and heightened novelty-seeking, both of which contribute to adolescents' propensity for risky behavior (Brown, 2004; Chassin, et al., 2004; Collins & Laursen, 2004; Laviola, et al., 1999). This risk-taking can be conceptualized as the outcome of a biologically driven imbalance between an increased drive for novelty and sensation-seeking and an immature capacity for self-regulation (Steinberg, 2004). This developmental pattern might reflect an evolutionary adaptation, as individuals needed to engage in high-risk behaviors to venture out from their safe and familiar environments to find mates and reproduce (Spear, 2000). Therefore, risk-taking appears to coincide with the time when hormones drive adolescents to seek out sexual partners. In contemporary society, where adolescence can be extended indefinitely—with young adults delaying financial independence, living with parents for longer periods, and choosing mates later in life—this behavior may be less adaptive. Our neurobiological model suggests that this heightened risk-taking arises from the differential development of subcortical and cortical systems, a notion supported by empirical behavioral and imaging data, as reviewed below.

Adolescent Behavioral Development

A cornerstone of behavioral development is the capacity to suppress inappropriate actions in favor of goal-directed ones, especially when confronted with compelling incentives. This ability is commonly referred to as cognitive control (Casey, Galvan, et al., 2005; Casey, Giedd, & Thomas, 2000; Casey, Thomas, et al., 2000). Here, we revisit classic cognitive developmental research within the framework of age-related changes in cortically driven cognitive processes and present behavioral and neuroanatomical evidence that distinguishes these processes from those involved in risky behaviors.

Extensive developmental research has shown that cognitive control undergoes significant development throughout childhood and adolescence (Case, 1972; Flavell, Beach, & Chinksy, 1966; Keating & Bobbitt, 1978; Pascual-Leone, 1970). While some theorists attribute this development to increases in processing speed and efficiency (e.g., Bjorklund, 1985, 1987; Case, 1972), others highlight the crucial role of inhibitory processes (Harnishfeger & Bjorklund, 1993). This perspective posits that suboptimal choices in childhood stem from increased susceptibility to interference from competing sources that require suppression (e.g., Brainerd & Reyna, 1993; Casey, Thomas, Davidson, Kunz, & Franzen, 2002; Dempster, 1993; Diamond, 1985; Munakata & Yerys, 2001). Therefore, optimal decision-making necessitates impulse control (Mischel, Shoda, & Rodriguez, 1989), and this ability follows a linear developmental trajectory throughout childhood and adolescence (Eigsti, et al., 2006).

In contrast, risk-taking and reward-seeking behaviors appear to peak during adolescence before declining in adulthood (Eaton, et al., 2008; Windle, et al., 2008). This peak coincides with pubertal maturation (Dahl, 2004; Martin, et al., 2001). A recent study by Steinberg et al. (2008) aimed to disentangle the constructs of impulse/cognitive control and sensation seeking, defined as the desire to seek novel experiences and take risks to achieve them. Testing individuals between the ages of 10 and 30, they found that age-related differences in sensation seeking followed a curvilinear pattern, with peaks observed between 10 and 15 years of age, followed by a decline or stabilization thereafter. Conversely, age-related differences in impulsivity followed a linear pattern, with impulsivity decreasing with age. These findings suggest distinct developmental trajectories for the two constructs. Impulsivity diminishes with age throughout childhood and adolescence (Casey, Galvan, et al., 2005; Casey, Thomas, et al., 2002; Galvan, et al., 2007), although individuals exhibit varying degrees of impulsivity regardless of age (Eigsti, et al., 2006). In contrast to this linear decline in impulsivity, sensation-seeking/risk-taking appears to follow a curvilinear pattern, increasing during adolescence relative to childhood and adulthood (Cauffman, et al., 2010; Figner, et al., 2009; Galvan, et al., 2007). As discussed in subsequent sections, these findings suggest that risky behavior and impulse control are governed by distinct neural systems, with risk-taking behaviors developing earlier than impulse control mechanisms (Galvan, et al., 2007; Steinberg, et al., 2008).

Adolescent Brain Development

Recent investigations into adolescent brain development have capitalized on advances in neuroimaging, particularly magnetic resonance imaging (MRI) techniques, which are well-suited for studying developing populations. These techniques include: structural MRI, used to assess the size and shape of brain structures; functional MRI (fMRI), which measures patterns of brain activity; and diffusion tensor imaging (DTI), used to examine the connectivity of white matter fiber tracts. Evidence supporting our developmental model of competition between cortical and subcortical regions comes from DTI and fMRI studies, which reveal immature structural and functional connectivity, respectively.

MRI Studies of Human Brain Development

Numerous studies have employed structural MRI to map the anatomical trajectory of normal brain development (see review by Casey, Tottenham, et al., 2005). While total brain size reaches approximately 90% of its adult volume by the age of six, the gray and white matter components continue to undergo dynamic changes throughout adolescence. Data from longitudinal MRI studies indicate that gray matter volume follows an inverted U-shaped developmental trajectory, with greater regional variation than white matter (Giedd, 2004; Gogtay, et al., 2004; Sowell, et al., 2003; Sowell, Thompson, & Toga, 2004). Generally, regions subserving primary functions, such as motor and sensory systems, mature earlier, while higher-order association areas, responsible for integrating these primary functions, mature later (Gogtay, et al., 2004; Sowell, et al., 2004). For example, MRI studies show that cortical gray matter loss occurs earliest in primary sensorimotor areas and latest in the dorsolateral prefrontal and lateral temporal cortices (Gogtay, et al., 2004). This pattern aligns with findings from nonhuman primate and human postmortem studies indicating that the prefrontal cortex is among the last brain regions to reach maturity (Bourgeois, et al., 1994; Huttenlocher, 1979), while subcortical and sensorimotor regions develop earlier. In contrast to the inverted U-shaped pattern observed for gray matter, white matter volume increases in a roughly linear fashion throughout development, continuing well into adulthood (Gogtay, et al., 2004). These changes likely reflect ongoing myelination of axons by oligodendrocytes, which enhances neuronal conduction and communication between connected regions.

The precise relationship between structural brain changes and behavioral changes remains an active area of investigation. Some studies have reported indirect associations between MRI-based volumetric changes and cognitive function using neuropsychological measures (e.g., Casey, Castellanos, et al., 1997; Sowell, et al., 2003). Specifically, associations have been found between prefrontal cortical and basal ganglia volumes and measures of cognitive control, defined as the ability to override inappropriate choices or actions in favor of more appropriate ones (Casey, Castellanos, et al., 1997; Casey, Trainor, et al., 1997). These findings suggest that structural brain changes reflect underlying cognitive changes and underscore the importance of considering both subcortical (e.g., striatum) and cortical (e.g., prefrontal cortex) development.

DTI Studies of Human Brain Development

The MRI-based morphometry studies discussed earlier suggest that cortical connections become increasingly refined during development, with the elimination of redundant synapses and the strengthening of essential connections through experience. DTI, a more recent MRI technique, offers a means of examining the developmental modulation of specific white matter tracts and their relationship to behavior. One study found a positive correlation between the development of cognitive control and the integrity of prefrontal-parietal fiber tracts (Nagy, Westerberg, & Klingberg, 2004), consistent with fMRI studies showing differential recruitment of these regions in children compared to adults (Klingberg, Forssberg, & Westerberg, 2002). Using a similar approach, Liston and colleagues (2006) examined the strength of white matter tracts within frontostriatal circuits, which continue to develop throughout childhood and into adulthood. These frontostriatal fiber tracts were defined by connecting regions of interest in the striatum and ventral prefrontal cortex identified in a previous fMRI study using the same task (Durston, Thomas, Worden, Yang, & Casey, 2002; Epstein, et al., 2007). Across these developmental DTI studies, measures of fiber tract integrity throughout the brain were correlated with age. However, specific fiber tracts were differentially associated with cognitive control (Casey, et al., 2007; Liston, et al., 2006) or cognitive ability (Nagy, et al., 2004). Specifically, the strength of frontostriatal connections positively predicted impulse control capacity, as measured by performance on a go/no-go task (Casey, et al., 2007; Liston, et al., 2006). These findings underscore the importance of examining not only regional structural brain changes but also changes in connectivity when investigating the neural substrates of cognitive development.
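
As a schematic of the analytic logic shared by these DTI studies, the sketch below asks whether a tract-integrity measure (fractional anisotropy, FA) predicts go/no-go performance over and above age, via a partial correlation. The variable names, effect sizes, and data are assumptions for illustration, not the published pipelines, which derive FA from tractography.

```python
# Sketch: partial correlation between hypothetical frontostriatal FA
# and go/no-go accuracy, controlling for age. Simulated data only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 60
age = rng.uniform(7, 30, size=n)

# Hypothetical FA: increases with age, plus individual variation.
fa = 0.35 + 0.005 * age + rng.normal(0, 0.02, size=n)

# Hypothetical accuracy: depends on both age and FA (assumed weights).
accuracy = 0.5 + 0.008 * age + 2.0 * (fa - 0.35) + rng.normal(0, 0.05, size=n)

def residualize(y, covariate):
    """Return residuals of y after regressing out a single covariate."""
    slope, intercept, *_ = stats.linregress(covariate, y)
    return y - (intercept + slope * covariate)

# Does FA relate to accuracy beyond what age alone explains?
r, p = stats.pearsonr(residualize(fa, age), residualize(accuracy, age))
print(f"partial r(FA, accuracy | age) = {r:.2f}, p = {p:.3f}")
```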

Functional MRI Studies of Behavioral and Brain Development

While structural changes measured by MRI and DTI have been linked to behavioral changes during development, fMRI offers a more direct approach to examining structure-function relationships by simultaneously measuring brain activity and behavior. The ability to measure functional changes in the developing brain with fMRI holds significant promise for advancing our understanding of developmental processes. In the context of adolescent decision-making, fMRI provides a valuable tool for refining our interpretations of the underlying neural mechanisms. As mentioned previously, the development of the prefrontal cortex is believed to be instrumental in the maturation of higher-order cognitive abilities, including decision-making and goal-directed behavior (Casey, Tottenham, & Fossella, 2002; Casey, Trainor, et al., 1997). Numerous paradigms have been employed in conjunction with fMRI to investigate the neurobiological underpinnings of these abilities. These paradigms include the go/no-go task (requiring participants to respond to one stimulus while suppressing responses to another), the flanker task (requiring participants to indicate the directionality of a target surrounded by flanking stimuli that are either compatible or incompatible with the target), the stop-signal task (requiring participants to respond as quickly as possible to a stimulus but inhibit their response upon presentation of a stop signal, such as an auditory tone), and antisaccade tasks (requiring participants to inhibit reflexive eye movements towards a target and instead look in the opposite direction) (Bunge, Dudukovic, Thomason, Vaidya, & Gabrieli, 2002; Casey, Giedd, et al., 2000; Casey, Trainor, et al., 1997; Durston, et al., 2003; Luna, et al., 2001). Collectively, these studies reveal that children recruit distinct but often more extensive and diffuse prefrontal regions during these tasks compared to adults. With age, the pattern of activity within brain regions central to task performance (i.e., regions whose activity correlates with cognitive performance) becomes more focal and refined, while activity in regions not correlated with task performance diminishes. This pattern has been observed in both cross-sectional (Brown, et al., 2005) and longitudinal studies (Durston, et al., 2006) across a variety of paradigms.
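
For readers unfamiliar with these paradigms, the minimal sketch below scores a toy go/no-go trial log into the two standard performance indices, hit rate and false alarm rate; the trial records are fabricated for illustration.

```python
# Sketch: scoring a go/no-go task log. The paradigm maps onto two
# error types: misses (no response on "go" trials) and false alarms
# (responses on "no-go" trials, i.e., failures of inhibition).
trials = [
    {"type": "go",   "responded": True},
    {"type": "go",   "responded": True},
    {"type": "nogo", "responded": False},
    {"type": "nogo", "responded": True},   # failed inhibition
    {"type": "go",   "responded": False},  # miss
    {"type": "nogo", "responded": False},
]

go = [t for t in trials if t["type"] == "go"]
nogo = [t for t in trials if t["type"] == "nogo"]

hit_rate = sum(t["responded"] for t in go) / len(go)
false_alarm_rate = sum(t["responded"] for t in nogo) / len(nogo)

print(f"hit rate: {hit_rate:.2f}")
print(f"false alarm rate: {false_alarm_rate:.2f}")
```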

Although neuroimaging studies cannot definitively pinpoint the specific neurobiological mechanisms underlying these developmental changes (e.g., dendritic arborization, synaptic pruning), the findings suggest that maturation involves the development and refinement of connections to and from activated brain regions. Moreover, the findings indicate that these neuroanatomical changes occur over an extended period (Brown, et al., 2005; Bunge, et al., 2002; Casey, Thomas, et al., 2002; Casey, Trainor, et al., 1997; Crone, Donohue, Honomichl, Wendelken, & Bunge, 2006; Luna, et al., 2001; Moses, et al., 2002; Schlaggar, et al., 2002; Tamm, Menon, & Reiss, 2002; Thomas, et al., 2004; Turkeltaub, Gareau, Flowers, Zeffiro, & Eden, 2003).

How can this methodology inform our understanding of whether adolescent decisions are truly impulsive or reflect a heightened risk-taking propensity? Performance on tasks measuring impulse control, such as the go/no-go task, shows a linear developmental improvement across childhood and adolescence, as described earlier. However, recent neuroimaging studies have begun to investigate reward-related processing in adolescents, which is relevant to understanding risk-taking behavior (Bjork, et al., 2004; Ernst, et al., 2005; Galvan, et al., 2005; May, et al., 2004; Van Leijenhorst, Moor, et al., 2010). These studies have primarily focused on the ventral striatum, a region implicated in reward learning and prediction.

Sensitivity to Appetitive Cues in Adolescence

Our neurobiological model proposes that the combination of heightened responsiveness to motivational cues and immature behavioral control may bias adolescents towards seeking immediate gratification over long-term gains. Examining the developmental trajectories of subcortical (e.g., ventral striatum) and cortical (e.g., prefrontal) regions across childhood, adolescence, and adulthood can help determine whether the changes observed during adolescence are unique to this period or reflect a more gradual, linear developmental progression.

Several research groups have demonstrated that adolescents exhibit greater activation in the ventral striatum compared to adults when anticipating or receiving rewards (Ernst, et al., 2005; Galvan, et al., 2006; Geier, Terwilliger, Teslovich, Velanova, & Luna, 2009; Van Leijenhorst, Zanolie, et al., 2010). This heightened subcortical activation is accompanied by less activation in the prefrontal cortex compared to adults. In one of the first studies to examine these responses across a wide age range (6 to 29 years), Galvan and colleagues investigated behavioral and neural responses to reward manipulations, focusing on brain circuitry implicated in reward-related learning and behavior in animal studies (Hikosaka & Watanabe, 2000; Pecina, Cagniard, Berridge, Aldridge, & Zhuang, 2003; Schultz, 2006), adult human imaging studies (e.g., Knutson, Adams, Fong, & Hommer, 2001; O'Doherty, Kringelbach, Rolls, Hornak, & Andrews, 2001; Zald, et al., 2004), and studies of addiction (Hyman & Malenka, 2001; Volkow & Li, 2004). Based on rodent models (Laviola, et al., 1999; Spear, 2000) and previous imaging work (Ernst, et al., 2005), they hypothesized that adolescents would exhibit exaggerated activation in the ventral striatum and less mature recruitment of top-down prefrontal control regions compared to both children and adults. Their results supported this hypothesis, revealing that adolescents showed a spatial extent of ventral striatal activity similar to that of adults, while their prefrontal activity resembled that of children. The heightened activity in the ventral striatum in adolescents relative to children and adults reflected a greater imbalance between cortical and subcortical activity (see Figure 2). Recent work demonstrating delayed functional connectivity between prefrontal and subcortical regions in adolescents compared to adults provides a potential mechanism for this lack of top-down control over regions involved in processing motivational cues (Hare, et al., 2008).
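
The group contrast at the heart of such studies can be sketched as follows. The region-of-interest values are simulated to mirror only the qualitative pattern described above (adolescents greater than children and adults in the ventral striatum); the numbers are not the published data.

```python
# Sketch: comparing mean ROI activation across three age groups,
# analogous in spirit to a ventral striatum contrast. Simulated values.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
groups = {
    "children":    rng.normal(0.10, 0.08, size=20),
    "adolescents": rng.normal(0.25, 0.08, size=20),  # exaggerated response
    "adults":      rng.normal(0.12, 0.08, size=20),
}

# One-way ANOVA across the three groups, then a targeted pairwise test.
f, p = stats.f_oneway(*groups.values())
print(f"ANOVA: F = {f:.2f}, p = {p:.4f}")

t, p = stats.ttest_ind(groups["adolescents"], groups["adults"])
print(f"adolescents vs adults: t = {t:.2f}, p = {p:.4f}")
```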

These findings align with rodent models (Laviola, Macri, Morley-Fletcher, & Adriani, 2003) and previous imaging studies (Ernst, et al., 2005; Van Leijenhorst, Moor, et al., 2010) showing enhanced ventral striatal activity to rewards and reward anticipation during adolescence. Compared to both children and adults, adolescents exhibited an exaggerated ventral striatal response to reward. However, both children and adolescents displayed less mature prefrontal control region activity compared to adults. These distinct developmental trajectories suggest that the heightened ventral striatal activity observed in adolescents relative to children and adults might contribute to the increased risky decision-making observed during this period (Figner, et al., 2009). It is worth noting that while several laboratories have replicated this finding of heightened ventral striatal activity in adolescents (Ernst, et al., 2005; Galvan, et al., 2006; Geier, et al., 2009; Somerville, et al., in press; Van Leijenhorst, Moor, et al., 2010), one laboratory has not (Bjork, et al., 2004; Bjork, Smith, Chen, & Hommer, 2010). Future research is needed to clarify the specific conditions under which this pattern of brain activity is or is not observed.

Differential recruitment of prefrontal and subcortical regions has been reported across numerous developmental fMRI studies (Casey, Thomas, et al., 2002; Geier, et al., 2009; Luna, et al., 2001; Monk, et al., 2003; Thomas, et al., 2004; Van Leijenhorst, Zanolie, et al., 2010). These findings have often been interpreted as reflecting immature prefrontal regions rather than an imbalance between prefrontal and subcortical development. Given the role of prefrontal regions in guiding appropriate actions in different contexts (Miller & Cohen, 2001), immature prefrontal activity might hinder accurate estimations of future outcomes and appraisals of risky choices, rendering this region less influential than the ventral striatum in reward valuation. This pattern aligns with research showing elevated subcortical activity relative to cortical activity when decisions are biased by immediate gratification over long-term gains (McClure, Laibson, Loewenstein, & Cohen, 2004). During adolescence, the immature engagement of the prefrontal cortex may not provide sufficient top-down control over robustly activated reward-processing regions (e.g., ventral striatum), resulting in a greater influence of the ventral striatum relative to prefrontal systems in reward valuation.

While differential recruitment of cortical and subcortical regions has been consistently observed across development, few studies have investigated how cognitive control and reward systems interact. A recent study by Geier et al. (2009) examined this interaction in adolescents and adults using a modified antisaccade task during fMRI. Their findings revealed that performance, measured as response speed and accuracy, improved when monetary rewards were at stake, with the greatest enhancement observed in adolescents. This performance enhancement was accompanied by exaggerated activation in the ventral striatum in adolescents following a cue indicating that the next trial would be rewarded. This heightened activity occurred both during the preparation for and execution of the antisaccade. Additionally, adolescents showed increased prefrontal activity in regions involved in controlling eye movements, suggesting a reward-related upregulation of these control regions.

While the Geier study demonstrates how appetitive cues can facilitate cognitive performance in adolescents, the high prevalence of risky behavior among adolescents in everyday life suggests that these cues can also impair cognitive decisions. To test this hypothesis, Somerville and colleagues (Somerville, et al., in press) assessed children, adolescents, and adults on a go/no-go task that required them to suppress responses to appetitive social cues. They found that adolescents had greater difficulty resisting these appetitive cues compared to children and adults, committing more false alarms to appetitive cues than to neutral ones. This behavioral pattern was accompanied by enhanced activity in the ventral striatum. Conversely, prefrontal cortex activation was associated with overall accuracy and decreased linearly with both age and performance improvement. A functional connectivity analysis identified the dorsal striatum as a key convergence point for cortical and subcortical signals. Taken together, these studies suggest that differences in adolescent behavior compared to adults depend on the specific context. In appetitively charged situations, the more mature subcortical systems involved in detecting appetitive cues ("accelerator") may override the less mature cortical control systems ("brakes") due to their differential developmental trajectories. However, in the absence of appetitive or emotive cues, cortical control systems are not compromised, leading to more optimal performance in adolescents.
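
The key behavioral quantity in this design reduces to a simple "appetitive cue bias": false alarms to appetitive cues minus false alarms to neutral cues, computed per age group. The rates below are invented to mirror the qualitative result (the largest bias in adolescents) and are not the study's data.

```python
# Sketch: an appetitive-cue bias score for a Somerville-style go/no-go.
# All false-alarm rates here are hypothetical.
false_alarm_rates = {
    # group: (appetitive-cue FA rate, neutral-cue FA rate)
    "children":    (0.22, 0.20),
    "adolescents": (0.35, 0.22),
    "adults":      (0.18, 0.16),
}

for group, (appetitive, neutral) in false_alarm_rates.items():
    bias = appetitive - neutral
    print(f"{group:12s} appetitive-cue bias: {bias:+.2f}")
```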

Adolescence and Individual Differences

Individuals vary in their capacity for impulse control and their propensity for risk-taking, a phenomenon that has long been recognized in psychology (Benthin, Slovic, & Severson, 1993). Consequently, some adolescents are more likely than others to engage in risky behaviors and experience negative consequences. Examining individual variability might help identify potential bio-behavioral markers that could be used to identify individuals at greater risk for negative outcomes during adolescence.

The delay of gratification paradigm, a classic example of individual differences in the social, cognitive, and developmental psychology literature, highlights this variability in impulse control (Mischel, et al., 1989). Typically administered to children aged 3 to 4 years, the task presents the child with a choice between receiving a small reward (one marshmallow) immediately or a larger reward (two marshmallows) after a delay. The experimenter informs the child that they will leave the room to prepare for upcoming activities and explains that if the child can wait patiently without eating the marshmallow, they will receive both marshmallows upon the experimenter's return. If the child cannot wait, they can ring a bell to summon the experimenter and receive the smaller reward. Once the child demonstrates understanding of the task, they are seated at a table with both rewards and the bell. The experimenter minimizes distractions in the room, removing any toys, books, or pictures. After 15 minutes, or sooner if the child rings the bell, eats the reward, or shows signs of distress, the experimenter returns. Using this paradigm, Mischel found that children generally exhibit one of two behavioral patterns: 1) they ring the bell almost immediately to obtain the single marshmallow, or 2) they wait patiently and maximize their reward by receiving both marshmallows. This observation suggests that individuals differ in their ability to control their impulses when confronted with highly salient incentives, and this bias, detectable in early childhood (Mischel, et al., 1989), appears to persist throughout adolescence and young adulthood (Eigsti, et al., 2006).
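
One standard way to formalize this choice is hyperbolic temporal discounting, V = A / (1 + kD), where A is the reward amount, D the delay, and k an individual discount rate. The sketch below is an illustrative model, not Mischel's analysis, and the parameter values are assumptions: a low-k child values two marshmallows in 15 minutes above one now and waits, while a high-k child does not.

```python
# Sketch: the delay-of-gratification choice under hyperbolic
# discounting, V = A / (1 + k*D). Model and k values are illustrative.
def discounted_value(amount, delay_min, k):
    """Subjective value of `amount` rewards delivered after `delay_min` minutes."""
    return amount / (1.0 + k * delay_min)

value_now = discounted_value(1, 0, k=0.0)  # one marshmallow immediately = 1.0

for k in (0.05, 0.30):  # low vs high discounter
    value_later = discounted_value(2, 15, k)  # two marshmallows in 15 min
    choice = "wait" if value_later > value_now else "ring the bell"
    print(f"k = {k:.2f}: value of waiting = {value_later:.2f} -> {choice}")
```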

What might account for these individual differences in optimal choice behavior? Some researchers propose that dopaminergic mesolimbic circuitry, implicated in reward processing, plays a key role in risky behavior (Blum, et al., 2000). Developmental studies provide neurochemical evidence that the balance between cortical and subcortical dopamine systems shifts towards greater cortical dopamine levels during adolescence (Brenhouse, et al., 2008; Spear, 2000). Similarly, studies in nonhuman primates show a delayed time course of dopaminergic innervation of the prefrontal cortex, with full functional maturity not reached until adulthood (Rosenberg & Lewis, 1995). Individual differences in this circuitry, such as allelic variations in dopamine-related genes leading to either insufficient or excessive dopamine levels in subcortical regions, might contribute to individual differences in risk-taking propensity (O'Doherty, 2004).

The ventral striatum exhibits increased activity immediately before individuals make risky choices on monetary risk-taking tasks (Kuhnen & Knutson, 2005; Matthews, Simmons, Lane, & Paulus, 2004; Montague & Berns, 2002). As previously discussed, adolescents show exaggerated striatal activity to rewarding outcomes compared to children or adults (Ernst, et al., 2005; Galvan, et al., 2006). These findings collectively suggest that adolescents, as a group, may be more prone to risky choices (Figner, et al., 2009; Gardner & Steinberg, 2005). However, individual differences in risk-taking propensity are evident, with some adolescents exhibiting a greater tendency than others to engage in risky behaviors, potentially placing them at higher risk for negative outcomes.

To explore individual differences in risk-taking behavior, Galvan and colleagues (2007) examined the relationship between reward-related neural circuitry activity in response to a large monetary reward and personality trait measures of risk-taking and impulsivity in individuals aged 7 to 29 years. Functional MRI data and anonymous self-report measures of risky behavior, risk perception, and impulsivity were collected. The researchers found a positive association between ventral striatal activity and the likelihood of engaging in risky behavior across development. However, this activity varied as a function of individuals' perceived positive or negative consequences of such behavior. Individuals who anticipated dire consequences from risky behaviors showed less ventral striatal activation to reward. This negative association was primarily driven by the child participants, whereas a positive association was observed in adults who rated the consequences of risky behavior as positive.
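
The analysis form here is a moderated regression: does the link between ventral striatal (VS) response and self-reported risky behavior depend on perceived consequences? The sketch below fits such a model by ordinary least squares on simulated data; the coefficients and variables are assumptions chosen only to reproduce the direction of the reported interaction.

```python
# Sketch: moderated regression in the spirit of Galvan et al. (2007).
# Simulated data; only the analysis form is the point.
import numpy as np

rng = np.random.default_rng(3)
n = 100
vs_activity = rng.normal(0, 1, size=n)     # reward-related VS signal
perceived_cost = rng.normal(0, 1, size=n)  # anticipated negative consequences

# Simulate risky behavior with a VS x perceived-cost interaction.
risk = (0.5 * vs_activity - 0.3 * perceived_cost
        - 0.4 * vs_activity * perceived_cost + rng.normal(0, 1, size=n))

# Ordinary least squares with an interaction term.
X = np.column_stack([np.ones(n), vs_activity, perceived_cost,
                     vs_activity * perceived_cost])
beta, *_ = np.linalg.lstsq(X, risk, rcond=None)
for name, b in zip(["intercept", "VS activity", "perceived cost", "VS x cost"],
                   beta):
    print(f"{name:15s} beta = {b:+.2f}")
```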

In addition to linking risk-taking to reward circuitry activity, Galvan's study found no association between this circuitry and ratings of impulsivity (Galvan, et al., 2007). Instead, impulsivity was negatively correlated with age, consistent with Steinberg's (2008) findings of distinct developmental trajectories for sensation seeking and impulsivity. While sensation seeking increases during adolescence relative to childhood and adulthood, impulsivity follows a linear decline with age. These findings suggest that individual differences in risk-taking propensity during adolescence may stem from a combination of developmental changes in reward-related brain circuitry and individual predispositions towards risky behavior, rather than simply reflecting changes in impulsivity. These individual and developmental differences may help explain why some individuals are more vulnerable than others to the allure of risk-taking, which can lead to substance use and, ultimately, addiction.

Conclusion

Human imaging studies have revealed significant structural and functional changes in corticosubcortical circuitry (for review, see Casey, Tottenham, et al., 2005; Giedd, et al., 1999; Giedd, et al., 1996; Jernigan, et al., 1991; Sowell, et al., 1999), coinciding with developmental improvements in cognitive control and self-regulation (Casey, Trainor, et al., 1997; Luna & Sweeney, 2004; Luna, et al., 2001; Rubia, et al., 2000; Steinberg, 2004; Steinberg, et al., 2008). These changes are characterized by a shift in prefrontal activation patterns from diffuse to more focal recruitment (Brown, et al., 2005; Bunge, et al., 2002; Casey, Trainor, et al., 1997; Durston & Casey, 2006; Moses, et al., 2002) and elevated subcortical activity during adolescence (Casey, Thomas, et al., 2002; Durston & Casey, 2006; Luna, et al., 2001). While neuroimaging studies cannot definitively elucidate the precise neurobiological mechanisms driving these changes, the observed changes in brain volume and structure likely reflect developmental processes, such as synaptic pruning and myelination, that refine connections within and between these brain regions during maturation (Hare, et al., 2008; Liston, et al., 2006).

Collectively, the findings reviewed here indicate that the heightened risk-taking behavior observed during adolescence is associated with distinct developmental trajectories of subcortical motivational systems and cortical control regions. This is not to say that adolescents are incapable of making rational decisions. Rather, in emotionally charged situations, the more mature limbic system may exert greater influence over behavior than the still-developing prefrontal control system.

While adolescence is widely recognized as a period marked by increased reward-seeking and risk-taking behaviors (Gardner & Steinberg, 2005; Spear, 2000), individual differences in neural responses to reward can render some adolescents more susceptible than others to engaging in risky behaviors, potentially increasing their risk for negative outcomes such as addiction, substance abuse, and mortality.

The Adolescent Brain: A Period of Heightened Sensitivity to Rewards and Risk-Taking

Introduction

Adolescence is a time of significant change, marked by an increased tendency towards impulsive behavior, a disregard for long-term consequences, and a higher engagement in risky activities compared to adulthood (Gardner & Steinberg, 2005; Scott, 1992; Steinberg, et al., 2008). This inclination towards risk-taking manifests in higher rates of accidents, suicides, unsafe sexual behavior, and criminal involvement (Scott, 1992). While younger adolescents (15 and under) display greater impulsivity, even older adolescents (16-17 years old) haven't fully developed adult-like self-control (Feld, 2008).

Over the past decade, researchers have explored various cognitive and neurobiological explanations for these impulsive and risky behaviors during adolescence. Traditional views suggested that cognitive control abilities, particularly those governed by the prefrontal cortex, steadily improve throughout adolescence. This improvement is supported by brain imaging studies (Galvan, et al., 2006; Gogtay, et al., 2004; Hare, et al., 2008; Sowell, et al., 2003) and post-mortem analyses (Bourgeois, Goldman-Rakic, & Rakic, 1994; Huttenlocher, 1979; Rakic, 1994), which demonstrate continued development of the prefrontal cortex's structure and function well into early adulthood.

However, if cognitive control were the sole factor, then children, with their even less developed prefrontal cortex, should exhibit similar or even greater impulsivity than adolescents (Casey, Getz, & Galvan, 2008). Yet the risky behaviors observed during adolescence represent a distinct developmental stage (Windle, et al., 2008), different from both childhood and adulthood, as reflected in national health statistics (Eaton, et al., 2008).

This review delves into the question of how the adolescent brain undergoes changes that might explain this peak in risky behavior. We present a neurobiological model that emphasizes the dynamic interplay between subcortical and cortical brain regions. We examine evidence from behavioral studies and human brain imaging to support this model, particularly in the context of decision-making within motivational contexts (Cauffman, et al., 2010; Figner, Mackinlay, Wilkening, & Weber, 2009; Galvan, Hare, Voss, Glover, & Casey, 2007; Galvan, et al., 2006). Additionally, we explore why some teenagers might be more vulnerable than others to making poor decisions that lead to negative long-term outcomes (Galvan, et al., 2007; Hare, et al., 2008).

Neurobiological Model of Adolescence

To understand the cognitive and neurobiological changes during adolescence, we need to view it as a transitional period (Spear, 2000) rather than a static phase. Examining both the transition into and out of adolescence is crucial for identifying the unique characteristics of this developmental stage (Casey, Galvan, & Hare, 2005; Casey, Tottenham, Liston, & Durston, 2005).

We propose a neurobiological model that builds upon rodent studies (Brenhouse, Sonntag, & Andersen, 2008; Laviola, Adriani, Terranova, & Gerra, 1999; Spear, 2000) and recent human brain imaging research (Ernst, et al., 2005; Galvan, et al., 2007; Galvan, et al., 2006; Hare, et al., 2008; Somerville, Hare, & Casey, in press; Van Leijenhorst, Moor, et al., 2010; Van Leijenhorst, Zanolie, et al., 2010). Our model, depicted in Figure 1, suggests that risky behavior in adolescence isn't solely due to an immature prefrontal cortex, but rather a complex interplay between subcortical and cortical brain regions. The model highlights the different developmental trajectories of these systems, with subcortical regions like the ventral striatum maturing earlier than prefrontal control regions.

This model proposes that adolescents, compared to children or adults, are more influenced by their functionally mature subcortical systems, leading to an imbalance between these regions and the still-developing prefrontal control regions. In children, both systems are still developing; in adults, both are fully mature. This imbalance in adolescence results in a peak in risky behaviors. As individuals mature, the functional connections between these regions strengthen, allowing for better top-down control (Hare, et al., 2008).

This model explains why adolescents, while capable of rational thought and understanding risks (Reyna & Farley, 2006), may still engage in risky behaviors, especially in emotionally charged situations where the more developed subcortical systems ("the accelerator") can overpower the still-maturing prefrontal control systems ("the brakes").

Our model aligns with other models of adolescent development (Ernst, Pine, & Hardin, 2006; Ernst, Romeo, & Andersen, 2009; Geier & Luna, 2009; Nelson, Leibenluft, McClure, & Pine, 2005; Steinberg, 2008; Steinberg, et al., 2009) that propose differential development of subcortical and cortical regions.

Our model builds upon these by incorporating empirical evidence of brain changes across the entire developmental trajectory – from childhood, through adolescence, and into adulthood. Additionally, we suggest that the striatum and amygdala are not solely dedicated to approach and avoidance behaviors, respectively. Instead, these regions play a crucial role in detecting motivationally and emotionally relevant cues, which can significantly influence behavior. This sensitivity to reward and emotional cues during adolescence has been observed across species (see Spear, 2009).

Comparative and Evolutionary Perspectives on Adolescence

Why would the brain evolve to develop in this imbalanced way during adolescence? Adolescence can be viewed as the transition period between the dependence of childhood and the independence of adulthood, marked by the onset of puberty and sexual maturation (Graber & Brooks-Gunn, 1998).

From an evolutionary perspective, adolescence represents a time for individuals to gain independence from their families and seek out mates, a process that inevitably involves risks (Kelley, Schochet, & Landry, 2004). This independence-seeking behavior, observed in various mammalian species, includes increased peer interaction and novelty-seeking, contributing to risky behavior (Brown, 2004; Chassin, et al., 2004; Collins & Laursen, 2004; Laviola, et al., 1999).

It is theorized that this developmental pattern, with its inherent risks, was evolutionarily advantageous. Engaging in high-risk behaviors allowed individuals to leave their familiar and safe environments to find mates and reproduce (Spear, 2000). In today's society, where adolescence is prolonged, and individuals may remain financially dependent on their families for longer periods, these behaviors might be less adaptive.

Adolescent Behavioral Development

A key aspect of behavioral development is the ability to suppress inappropriate actions in favor of goal-directed ones, especially when enticing rewards are present. This ability, known as cognitive control (Casey, Galvan, et al., 2005; Casey, Giedd, & Thomas, 2000; Casey, Thomas, et al., 2000), shows a clear developmental trajectory.

Classic developmental studies highlight that cognitive control improves throughout childhood and adolescence (Case, 1972; Flavell, Beach, & Chinsky, 1966; Keating & Bobbitt, 1978; Pascual-Leone, 1970). Some attribute this to increases in processing speed and efficiency (e.g., Bjorklund, 1985, 1987; Case, 1972), while others emphasize the development of "inhibitory" processes (Harnishfeger & Bjorklund, 1993). The latter view suggests that children are more susceptible to distraction and that their decision-making is hampered by an inability to suppress irrelevant information (e.g., Brainerd & Reyna, 1993; Casey, Thomas, Davidson, Kunz, & Franzen, 2002; Dempster, 1993; Diamond, 1985; Munakata & Yerys, 2001).

In contrast to the linear development of cognitive control, risk-taking behaviors appear to peak during adolescence and decline in adulthood (Eaton, et al., 2008; Windle, et al., 2008), often coinciding with puberty (Dahl, 2004; Martin, et al., 2001). Steinberg et al. (2008) differentiated between impulsivity and sensation seeking, which is the desire for new experiences and the willingness to take risks to achieve them. They found that sensation seeking follows a curvilinear pattern, peaking in adolescence, while impulsivity decreases linearly with age.

These findings indicate distinct developmental trajectories for these constructs. Impulsivity consistently diminishes with age (Casey, Galvan, et al., 2005; Casey, Thomas, et al., 2002; Galvan, et al., 2007), although individual differences in impulsivity exist at all ages (Eigsti, et al., 2006). In contrast, sensation-seeking/risk-taking follows a curvilinear pattern, increasing during adolescence compared to childhood and adulthood (Cauffman, et al., 2010; Figner, et al., 2009; Galvan, et al., 2007). This suggests separate neural systems for these behaviors, with risk-taking developing earlier and impulsivity control maturing later (Galvan, et al., 2007; Steinberg, et al., 2008).

Adolescent Brain Development

Advances in neuroimaging, particularly magnetic resonance imaging (MRI), have allowed for in-depth investigations into adolescent brain development. Structural MRI measures the size and shape of brain structures, functional MRI (fMRI) tracks brain activity patterns, and diffusion tensor imaging (DTI) examines the connectivity of white matter fibers. Evidence for our proposed model comes from findings of immature structural and functional connectivity, as measured by DTI and fMRI, respectively.

MRI Studies of Human Brain Development

Numerous studies have used structural MRI to map the course of brain development (see review by Casey, Tottenham, et al., 2005). While the brain reaches approximately 90% of its adult size by age six, the gray and white matter components continue to change throughout adolescence. Longitudinal MRI studies indicate an inverted U-shaped pattern for gray matter volume, with more regional variation compared to white matter (Giedd, 2004; Gogtay, et al., 2004; Sowell, et al., 2003; Sowell, Thompson, & Toga, 2004). Brain regions responsible for basic functions, like motor and sensory processing, mature earlier, while higher-order association areas, which integrate information from these basic regions, mature later (Gogtay, et al., 2004; Sowell, et al., 2004).

For instance, the prefrontal cortex experiences gray matter loss later than primary sensorimotor areas (Gogtay, et al., 2004). This pattern aligns with primate and post-mortem human studies, demonstrating that the prefrontal cortex is one of the last brain regions to fully mature (Bourgeois, et al., 1994; Huttenlocher, 1979), while subcortical and sensorimotor regions develop earlier. In contrast to gray matter, white matter volume exhibits a roughly linear increase throughout development, continuing well into adulthood (Gogtay, et al., 2004), likely reflecting ongoing myelination of axons, which improves neural communication.

While the exact relationship between structural brain changes and behavior is not fully understood, some studies have shown correlations between brain structure and cognitive function. For example, associations have been found between the volume of the prefrontal cortex and basal ganglia and performance on tasks measuring cognitive control (Casey, Castellanos, et al., 1997; Casey, Trainor, et al., 1997). These findings suggest that cognitive development is reflected in structural brain changes, highlighting the importance of both cortical (prefrontal cortex) and subcortical (striatum) development.

DTI Studies of Human Brain Development

MRI-based morphometry studies suggest that cortical connections are refined throughout development, with unnecessary synapses being eliminated and important connections strengthened. DTI provides a tool to examine the development of specific white matter tracts and their relationship to behavior. For instance, one study found a positive correlation between the development of cognitive control and the strength of white matter tracts connecting the prefrontal and parietal regions of the brain (Nagy, Westerberg, & Klingberg, 2004). These findings align with fMRI studies showing the increased recruitment of these regions in adults compared to children during cognitive tasks (Klingberg, Forssberg, & Westerberg, 2002).

Similarly, Liston and colleagues (2006) investigated the strength of white matter tracts within the frontostriatal circuits, which continue developing throughout childhood and into adulthood. Their findings showed that the strength of these connections predicted impulse control ability, measured by performance on a go/no-go task (Casey, et al., 2007; Liston, et al., 2006). These DTI studies emphasize the importance of examining not only regional brain changes but also changes in the connections between regions when studying cognitive development.

Functional MRI Studies of Behavioral and Brain Development

While structural changes measured by MRI and DTI provide valuable information, fMRI allows researchers to directly observe brain activity during tasks, enabling a more direct assessment of structure-function relationships. fMRI has significantly advanced our understanding of how the brain functions during development.

As mentioned earlier, the prefrontal cortex plays a crucial role in the development of higher cognitive functions, such as decision-making and goal-directed behavior (Casey, Tottenham, & Fossella, 2002; Casey, Trainor, et al., 1997). Various fMRI paradigms, including go/no-go, flanker, stop-signal, and antisaccade tasks, have been used to investigate the neural underpinnings of these abilities. These studies consistently show that children recruit a wider network of prefrontal regions compared to adults when performing these tasks. As individuals mature, brain activity becomes more focused and efficient, with decreased activity in regions not essential for task performance. This pattern has been observed in both cross-sectional (Brown, et al., 2005) and longitudinal studies (Durston, et al., 2006).

While fMRI cannot directly reveal the specific cellular mechanisms driving these developmental changes (e.g., changes in synapses, myelination), the findings suggest that brain regions and their connections become more refined and specialized with age and experience. These neuroanatomical changes occur gradually over time (Brown, et al., 2005; Bunge, et al., 2002; Casey, Thomas, et al., 2002; Casey, Trainor, et al., 1997; Crone, Donohue, Honomichl, Wendelken, & Bunge, 2006; Luna, et al., 2001; Moses, et al., 2002; Schlaggar, et al., 2002; Tamm, Menon, & Reiss, 2002; Thomas, et al., 2004; Turkeltaub, Gareau, Flowers, Zeffiro, & Eden, 2003).

fMRI also helps differentiate between impulsivity and risk-taking in adolescents. While impulse control tasks like the go/no-go show linear developmental improvement, recent studies have begun to examine reward-related processing in adolescents using fMRI (Bjork, et al., 2004; Ernst, et al., 2005; Galvan, et al., 2005; May, et al., 2004; Van Leijenhorst, Moor, et al., 2010), focusing on the ventral striatum, a region involved in reward learning and prediction.

Sensitivity to Appetitive Cues in Adolescence

Our model proposes that adolescents' heightened sensitivity to motivational cues, coupled with their still-developing behavioral control, might explain their tendency to prioritize immediate rewards over long-term gains. Examining brain development across childhood, adolescence, and adulthood helps determine whether these changes are specific to adolescence or part of a continuous developmental process.

Several studies have shown that adolescents, compared to children and adults, exhibit heightened activation in the ventral striatum in anticipation of and/or in response to rewards (Ernst, et al., 2005; Galvan, et al., 2006; Geier, Terwilliger, Teslovich, Velanova, & Luna, 2009; Van Leijenhorst, Zanolie, et al., 2010). This heightened activity is often accompanied by less activation in the prefrontal cortex compared to adults.

Galvan and colleagues (2006) investigated behavioral and neural responses to reward manipulations in individuals aged 6 to 29. Their findings revealed that adolescents exhibited an exaggerated response in the ventral striatum to rewards compared to both children and adults. However, both adolescents and children showed less mature prefrontal cortex activation compared to adults. These results suggest that the heightened activity in the ventral striatum observed in adolescents, and the associated increase in risky behavior, might be due to the differing developmental trajectories of these brain regions (see Figure 2).

These findings are consistent with rodent studies (Laviola, Macri, Morley-Fletcher, & Adriani, 2003) and other human imaging studies (Ernst, et al., 2005; Van Leijenhorst, Moor, et al., 2010) demonstrating enhanced ventral striatal responses to reward anticipation and receipt during adolescence. Importantly, while many studies have observed this heightened ventral striatal response in adolescents, some have not (Bjork, et al., 2004; Bjork, Smith, Chen, & Hommer, 2010). Future research is needed to clarify the specific conditions under which this occurs.

The differential activation patterns in the prefrontal cortex and subcortical regions suggest that the still-developing prefrontal cortex might not be able to effectively regulate the more developed and highly responsive reward-processing regions, like the ventral striatum, during adolescence. This imbalance might contribute to riskier decision-making.

A study by Geier et al. (2009) investigated this interplay between reward and cognitive control using an antisaccade task with fMRI. They found that adolescents showed enhanced performance and exaggerated ventral striatal activity when the task involved potential monetary rewards. This suggests that reward cues can enhance cognitive performance in adolescents.

However, in real-life scenarios, appetitive cues might lead to poorer cognitive decisions in adolescents. Somerville et al. (in press) explored this by having children, adolescents, and adults perform a go/no-go task where they had to inhibit their responses to appealing social cues. Adolescents had a harder time resisting these cues compared to children and adults, making more errors. This difficulty was associated with increased ventral striatal activity and decreased prefrontal cortex activity in adolescents. These findings suggest that the impact of rewards on adolescent behavior depends on the specific context.

Adolescence and Individual Differences

Individuals differ in their ability to control impulses and engage in risk-taking, and these differences are evident from a young age (Benthin, Slovic, & Severson, 1993). Therefore, some adolescents are more prone to engaging in risky behaviors and experiencing negative consequences. Understanding these individual differences might help identify individuals at higher risk.

A classic example is the delay of gratification paradigm, typically used with young children (Mischel, et al., 1989). The child is offered a small reward immediately or a larger reward if they wait. This task reveals that some children prioritize immediate gratification, while others can delay gratification for a larger reward. This ability to delay gratification is evident from early childhood and appears to persist throughout adolescence and young adulthood (Eigsti, et al., 2006).

One potential explanation for these individual differences in decision-making lies in the dopaminergic mesolimbic circuitry, which plays a crucial role in reward processing and has been implicated in risky behavior (Blum, et al., 2000). Research suggests that the balance between cortical and subcortical dopamine systems shifts during adolescence, with cortical dopamine levels increasing (Brenhouse, et al., 2008; Spear, 2000). This shift suggests that the prefrontal cortex's ability to regulate reward-seeking behavior continues to mature throughout adolescence.

Variations in this circuitry, such as genetic differences affecting dopamine levels, might explain why some individuals are more prone to risky behaviors (O'Doherty, 2004). The ventral striatum, a key region in this circuitry, has been shown to be more active in individuals who make risky choices in monetary tasks (Kuhnen & Knutson, 2005; Matthews, Simmons, Lane, & Paulus, 2004; Montague & Berns, 2002).

Galvan et al. (2007) investigated individual differences in risk-taking by examining the relationship between neural responses to reward anticipation and self-reported risk-taking behavior. They found a positive correlation between ventral striatal activity and the likelihood of engaging in risky behaviors. Interestingly, this activity was also influenced by individuals' perceptions of the potential consequences of their actions. Those who perceived risky behaviors as having severe negative consequences showed less activation in the ventral striatum in response to reward anticipation.

Importantly, Galvan's study found no association between impulsivity and ventral striatal activity. This further supports the idea that sensation-seeking and impulsivity are distinct constructs with different developmental trajectories (Steinberg, 2008).

Conclusion

Neuroimaging studies reveal significant structural and functional changes in corticosubcortical circuitry throughout adolescence (for review, see Casey, Tottenham, et al., 2005; Giedd, et al., 1999; Giedd, et al., 1996; Jernigan, et al., 1991; Sowell, et al., 1999). These changes coincide with developmental improvements in cognitive control and self-regulation (Casey, Trainor, et al., 1997; Luna & Sweeney, 2004; Luna, et al., 2001; Rubia, et al., 2000; Steinberg, 2004; Steinberg, et al., 2008).

The research summarized here indicates that the increased risk-taking behavior observed in adolescence stems from the different developmental timelines of subcortical motivational regions and cortical control regions. Importantly, this does not imply that adolescents lack the capacity for rational decision-making. Rather, in emotionally charged situations, the more mature limbic system, driven by immediate rewards, can outweigh the influence of the still-developing prefrontal control system.

While adolescence is often characterized by reward-seeking and risk-taking (Gardner & Steinberg, 2005; Spear, 2000), individual differences in brain responses to rewards can make some adolescents more vulnerable to engaging in risky behaviors, potentially increasing their likelihood of experiencing negative outcomes such as substance abuse, addiction, and even fatality. By understanding these individual and developmental differences, we can better identify adolescents at risk and develop interventions to mitigate these risks.

The Teenage Brain: Why Taking Risks Can Feel Rewarding

Introduction

Being a teenager is often described as a time of impulsivity, where we don't always think about the future and might engage in riskier behavior than adults (Gardner & Steinberg, 2005; Scott, 1992; Steinberg, et al., 2008). This tendency to take risks is reflected in higher rates of accidents, suicides, unsafe sex, and criminal activity (Scott, 1992). While younger teens (15 and under) tend to act more impulsively than older teens, even 16- and 17-year-olds don't always show the same self-control as adults (Feld, 2008).

Over the past decade, scientists have come up with a number of ideas about why teenagers engage in impulsive and risky behavior, focusing on how the brain works. Older theories suggested that as we age, our brains get better at controlling our impulses thanks to the development of the prefrontal cortex. This area of the brain, responsible for planning and decision-making, continues to grow and change well into our twenties, as seen in brain imaging studies (Galvan, et al., 2006; Gogtay, et al., 2004; Hare, et al., 2008; Sowell, et al., 2003).

However, if teenagers make poor choices solely because of an immature prefrontal cortex, then children should be making even worse choices, right? After all, their prefrontal cortex is even less developed (Casey, Getz, & Galvan, 2008). But here's the thing: teenagers aren't just like younger children when it comes to making bad choices. Statistics from the National Center for Health Statistics on adolescent behavior show that teenagers experience a unique surge in risky behavior (Eaton, et al., 2008).

This article explores how the teenage brain changes in a way that might explain this increase in risky behavior. We present a model that highlights the interplay between different parts of the brain responsible for emotions and decision-making and consider why some teenagers might be more vulnerable to making risky decisions than others (Galvan, et al., 2007; Hare, et al., 2008).

A New Way of Looking at the Teenage Brain

To understand the teenage brain, we can't just take a single snapshot in time. Instead, we need to look at adolescence as a transitional period between childhood and adulthood (Spear, 2000). This means considering how the brain changes as it enters and leaves adolescence (Casey, Galvan, & Hare, 2005; Casey, Tottenham, Liston, & Durston, 2005).

We propose a new model of the adolescent brain that builds upon recent brain imaging studies (Ernst, et al., 2005; Galvan, et al., 2007; Galvan, et al., 2006; Hare, et al., 2008; Somerville, Hare, & Casey, in press; Van Leijenhorst, Moor, et al., 2010; Van Leijenhorst, Zanolie, et al., 2010). This model, depicted in Figure 1, goes beyond simply blaming the prefrontal cortex for risky behavior. Instead, it shows how different parts of the brain – specifically, areas involved in processing emotions and rewards (like the ventral striatum), and areas involved in planning and decision-making (like the prefrontal cortex) – develop at different speeds.

According to this model, the emotional, reward-seeking parts of the brain develop earlier than the prefrontal cortex, which is still under construction during adolescence. This means that during adolescence, teenagers might be more easily swayed by these emotional areas, compared to children, where both systems are still developing, and adults, where both systems are fully mature.

This model helps us understand why teenagers experience a spike in risky behavior – their brains are wired to seek thrills and rewards, while the areas responsible for thinking about consequences are still catching up. It also explains why teenagers can make perfectly rational decisions in some situations (like when they are calm and thinking clearly) but might make riskier choices when emotions are running high (Reyna & Farley, 2006). In those emotionally charged moments, the reward system might overpower the control system, leading to riskier choices.

A Look Across Species: The Evolution of the Teenage Brain

If this imbalance in brain development is responsible for risky teenage behavior, why did our brains evolve this way? To understand this, we need to remember that adolescence marks a time when individuals become independent from their families, a period observed across many mammal species (Kelley, Schochet, & Landry, 2004). This drive for independence often involves seeking new experiences and taking risks, which can be observed in increased social interactions and novelty-seeking behavior (Brown, 2004; Chassin, et al., 2004; Collins & Laursen, 2004; Laviola, et al., 1999).

From an evolutionary standpoint, this makes sense. Imagine our ancestors: teenagers needed to leave their families, find a mate, and start their own families. This meant venturing out into the unknown, which naturally involved risks. Their brains were wired to prioritize these activities, even if it meant taking risks. While this might have been necessary for survival in the past, today's teenagers face different challenges. With extended adolescence and different social structures, this drive for risk-taking might not be as adaptive as it once was.

How Teenagers Behave

One of the most important aspects of growing up is learning to control our impulses and make good decisions, especially when faced with tempting rewards. This ability is known as cognitive control (Casey, Galvan, et al., 2005; Casey, Giedd, & Thomas, 2000; Casey, Thomas, et al., 2000). Many studies have shown that children get better at controlling their impulses as they grow older, but what about teenagers?

While teenagers continue to develop their ability to control their impulses, they also experience an increase in risk-taking and reward-seeking behavior, peaking during adolescence before declining in adulthood (Eaton, et al., 2008; Windle, et al., 2008). This suggests that impulse control and risk-taking follow different developmental paths.

Think of it like this: as you get older, you generally get better at resisting that cookie jar, knowing that eating too many cookies is not a good idea. However, at the same time, you might also become more interested in trying new things and taking risks, like trying out for the school play or going on a rollercoaster. This drive to seek out new experiences and rewards is heightened during adolescence.

The Teenage Brain: A Closer Look

Scientists use brain imaging techniques like MRI to see how the brain changes during adolescence. These studies show that while the brain is almost adult-sized by age six, it continues to develop throughout adolescence, especially in areas related to decision-making, emotions, and reward processing (Casey, Tottenham, et al., 2005).

What We've Learned: Putting It All Together

Research suggests that teenagers' increased risk-taking behavior is linked to the different speeds at which different parts of their brains develop. The parts of the brain involved in emotions and rewards develop faster than the parts responsible for planning and controlling impulses, creating an imbalance that can lead to riskier choices.

However, this doesn't mean that teenagers are incapable of making good decisions. They can, especially when they are not feeling overwhelmed by emotions. It's just that when emotions are running high, the reward-seeking part of the brain might win out over the control system, increasing the likelihood of risky behavior.

It's important to remember that not all teenagers are the same. Some teenagers are naturally more prone to taking risks than others due to differences in their brains and personalities. This can make some teenagers more vulnerable to negative consequences of risk-taking.

By understanding how the teenage brain develops and the factors that contribute to risky behavior, we can work towards creating environments that support teenagers as they navigate this exciting but challenging period of life.

Why Do Teenagers Take So Many Risks?

Introduction

Being a teenager is a time of big changes. Compared to adults, teenagers are more impulsive, don't always think about the consequences of their actions, and might do things that are risky (Gardner & Steinberg, 2005; Scott, 1992; Steinberg, et al., 2008). This can lead to more accidents, unsafe choices, and other problems (Scott, 1992). Younger teenagers, especially those 15 or younger, act without thinking more than older teens, but even 16- and 17-year-olds don't always have the same self-control as adults (Feld, 2008).

Scientists have been trying to understand why teenagers act this way. One idea is that the part of the brain responsible for making good decisions, called the prefrontal cortex, is still developing during the teenage years (Galvan, et al., 2006; Gogtay, et al., 2004; Hare, et al., 2008; Sowell, et al., 2003).

However, if this were the only reason, younger children should take even more risks than teenagers, since their prefrontal cortex is even less developed (Casey, Getz, & Galvan, 2008). Yet teenagers actually take more risks than both children and adults (Eaton, et al., 2008; Windle, et al., 2008). This means something else must be going on in the teenage brain.

This article will explore how the brain changes during the teenage years and how these changes might explain why teenagers take more risks. We'll look at how different parts of the brain work together and talk about why some teenagers might be more likely to make risky decisions than others (Galvan, et al., 2007; Hare, et al., 2008).

The Teenage Brain: A Balancing Act

To understand the teenage brain, we need to think about how it changes over time. It's not just that the prefrontal cortex isn't ready yet; it's about how different parts of the brain interact as they develop (Spear, 2000).

Imagine the brain has two teams: the "go" team and the "slow down" team. The "go" team, located in a part of the brain called the ventral striatum, loves rewards and exciting things. The "slow down" team, led by the prefrontal cortex, helps us think about consequences and make safe choices.

During the teenage years, the "go" team is already strong, having developed early, while the "slow down" team is still maturing. This imbalance can make teenagers more likely to follow their impulses and take risks, even when they know better (Casey, Galvan, & Hare, 2005; Casey, Tottenham, Liston, & Durston, 2005) (Figure 1).

This doesn't mean teenagers are incapable of making good decisions. They can understand risks just as well as adults (Reyna & Farley, 2006). However, when something seems exciting or rewarding, the "go" team might take over before the "slow down" team can catch up (Cauffman, et al., 2010; Figner, Mackinlay, Wilkening, & Weber, 2009; Galvan, Hare, Voss, Glover, & Casey, 2007; Galvan, et al., 2006).

Why Are Teenage Brains Wired This Way?

From an evolutionary perspective, the teenage years are about becoming independent and finding a mate (Kelley, Schochet, & Landry, 2004). To do this, teenagers might need to take risks and explore new things, even if it means facing danger (Brown, 2004; Chassin, et al., 2004; Collins & Laursen, 2004; Laviola, et al., 1999). This risk-taking drive might be stronger during the teenage years because it's also when hormones are telling us to seek out partners (Spear, 2000).

What We Know From Studying Teenagers

Scientists have learned a lot about the teenage brain by studying how teenagers behave and by using brain imaging techniques such as functional MRI (fMRI).

One thing they've learned is that teenagers get really excited about rewards, even more so than children or adults (Ernst, et al., 2005; Galvan, et al., 2006; Geier, Terwilliger, Teslovich, Velanova, & Luna, 2009; Van Leijenhorst, Zanolie, et al., 2010). This excitement shows up in the brain as increased activity in the ventral striatum, the "go" team's home base (Galvan, et al., 2006) (Figure 2).

Studies have also shown that the connections between the "go" team and the "slow down" team are still getting stronger during adolescence (Hare, et al., 2008). This means the prefrontal cortex might not be as good at controlling impulsive behaviors when something exciting is at stake, which could lead to riskier choices (Geier, et al., 2009; Somerville, et al., in press).

Every Teenager Is Different

Just as every person is different, every teenage brain is different too. Some teenagers are naturally more impulsive or more drawn to rewards than others (Benthin, Slovic, & Severson, 1993).

Studies have shown that teenagers who are more sensitive to rewards are also more likely to engage in risky behaviors (Galvan, et al., 2007). This suggests that individual differences in how our brains respond to rewards could help explain why some teenagers are more likely to take risks than others.

Conclusion

The teenage brain is a work in progress. While the regions responsible for planning and decision-making are still developing, those that respond to rewards are already running at full strength. This imbalance can make teenagers more likely to act impulsively and take risks, even when they understand the potential consequences.

It's important to remember that every teenager is different and develops at their own pace. By understanding the changes happening in the teenage brain, we can better support teenagers as they navigate this exciting and challenging time in their lives.

Footnotes and Citation

Casey, B. J., Jones, R. M., & Somerville, L. H. (2011). Braking and accelerating of the adolescent brain. Journal of Research on Adolescence, 21(1), 21–33. https://doi.org/10.1111/j.1532-7795.2010.00712.x
