With the school year 2018-2019, tertiary-education institutions (universities and colleges) will be getting their first full batch of K-to-12 graduates.
Amid the preparations for new curricula, I’ve wondered if we are forgetting the nature of the incoming freshmen and subsequent batches. Besides (and because of) having two additional years of basic education, they will be older.
There have been prior discussions among faculty and administrators about the very young students we were getting. Many were as young as 15 or 16, graduating from college before the age of 20. Was it realistic to expect 15-year-olds to choose a career or profession? Could they grasp the difficult abstract concepts that come with college subjects?
Perhaps the most daunting challenge posed by those 15- to 19-year-olds was their being teenagers, a term that panicked parents, teachers and other adults. The term “teenagers” came into wide usage only after World War II, encouraged by marketing people looking for new markets for clothing, food, music and other consumer products. The postwar baby boom did mean larger numbers of teenagers becoming visible, with their own culture, accompanied by all kinds of negative connotations of raging hormones, rebelliousness, and lack of direction in life.
Adolescence
It was not that societies didn’t recognize that particular age group. Even today in the Romance languages (French, Spanish, Italian and others), “adolescent” or “adolescente” is still the term used. But even if cultures seem to have recognized a period between childhood and adulthood, for much of human history people seemed to skip adolescence and jump straight from childhood to adulthood, marrying early and taking on adult responsibilities early.
Rapid economic changes, especially in the 19th century, spurred a recognition of the teenage years as something distinct from adulthood. The growing complexity of society meant that young people had to be kept for many more years at home and in school, so they could pick up the skills it required. Besides longer schooling, teenagers stayed with their families longer because governments imposed minimum ages of marriage; in the Philippines it went from 12 to 14 for girls and 16 for boys and, in the 1987 Family Code, to 18… with parental permission required until the age of 21.
Teenagers and adolescents became the focus of studies starting in the 20th century, notably with G. Stanley Hall’s “Adolescence” (published in 1904), where he described the period as one of “storm and stress.” Hall cited, for example, a curve of despondency which “starts at 11, rises steadily and rapidly till 15, and then falls steadily till 23.”
With time, people came to accept the teenage years as difficult, with parents and educators called on to give extra support and to assert more control over them. But by and large, it was expected that teenagers would magically become more mature when they turned 20.
You can see this in the discussions on the impact of K-to-12 on tertiary education. The students will be more mature, we are told, and should be ready for “higher-order thinking.” Teenagers were expected to pick up some “critical thinking” skills, but the new K-to-12 graduates are expected to be more discerning and to synthesize knowledge.
Brains and butterflies?
Here’s the bad news: Being 18, or 20, won’t do it. Physically, yes, there can be a huge difference between a 20- and an 18-year-old, but psychosocial development takes more time. American psychologist Jeffrey Jensen Arnett raised eyebrows some years back when he suggested the term “emerging adults” (yes, like caterpillars turning into butterflies) as a label for people in their 20s who, he said, were still in the continuing phases of “identity exploration, instability, self-focus, feeling in-between.”
Many of the challenges of the teenage years will continue into the 20s, sometimes with greater intensity, because society can be forgiving of problematic teenagers but not of “delinquents” in their 20s.
Parents and educators have to realize that there’s a convergence of biology and society at work here. In the 1990s, the US National Institute of Mental Health began a massive study on 5,000 kids’ brain development from ages 3 to 16. As the initial phase of the research ended, researchers extended the upper limit of the study to 18, then 20, then 22, as they realized that the brain continued to develop throughout the teen years and well into the 20s, at least up to the age of 25.
Among the significant findings was that the limbic system (responsible for our emotions) developed more rapidly than the prefrontal cortex and the cerebellum (responsible for emotional control and higher-order thinking).
You might ask: How did centuries of adolescents manage to make it into adulthood, starting families and earning a living?
I think they did, although with great difficulty, because societies were simpler then in terms of the job and life skills needed.
I also think that since World War II, we have been demonizing teenagers and sweeping the problems of people in their 20s under the rug. Take drug use as an example. The war on drugs has included teenagers as targets, depicting them as rampant drug users and pushers. But any serious study will show more drug problems among those in their 20s, because teenagers just don’t have the money for drugs, especially in an impoverished society like ours. And I’d propose that those in their 20s turn to drugs in part because their “teenage” problems are more serious.
Handicapped
Does that mean we’ll have to bear with people in their 20s still dependent on their parents (or grandparents)—the so-called tambay—who may hold a college degree but can’t strike out on their own, are still troubled and searching, jumping from one job to another or, simply, not getting a job? In the universities, will we continue to grapple with students shifting from one major to another and riding emotional roller-coasters?
I suspect we will have to live with this extended adolescence. We should also stop attributing the problems of people in their 20s simply to their being “millennials.” What we see today are teenagers and twentysomethings becoming handicapped because social media and the internet hamper their ability to develop face-to-face communication skills.
But here’s some good news, too, from brain research. While the brain continues to be transformed up to the 20s, there’s “synaptic pruning” going on: Connections that are not used are being discarded. Thus, all the way up to the 20s, there are still rich opportunities to provide different learning environments—reading, math, languages, music and the arts, the sciences—that will allow the brain to retain larger numbers of nerve connections crucial for life later on.
The psychologist Arnett found in his study of people in their 20s that, in spite of all their problems, they still had a great “sense of possibilities.” Let’s allow those possibilities to become more attainable.
mtan@inquirer.com.ph