Is It Really Too Late to Learn New Skills?

You missed your chance to be a prodigy, but there’s still growth left for grownups.
The joys—and occasional embarrassments—of being a novice could be an antidote to the strain of being a perfectionist.
Animation by Luke Wohlgemuth

Among the things I have not missed since entering middle age is the sensation of being an absolute beginner. It has been decades since I’ve sat in a classroom in a gathering cloud of incomprehension (Algebra 2, tenth grade) or sincerely tried, lesson after lesson, to acquire a skill that was clearly not destined to play a large role in my life (modern dance, twelfth grade). Learning to ride a bicycle in my early thirties was an exception—a little mortifying when my husband had to run alongside the bike, as you would with a child—but ultimately rewarding. Less so was the time when a group of Japanese schoolchildren tried to teach me origami at a public event where I was the guest of honor—I’ll never forget their sombre puzzlement as my clumsy fingers mutilated yet another paper crane.

Like Tom Vanderbilt, a journalist and the author of “Beginners: The Joy and Transformative Power of Lifelong Learning” (Knopf), I learn new facts all the time but seldom new skills. Journalists regularly drop into unfamiliar subcultures and domains of expertise, learning at least enough to ask the right questions. The distinction he draws between his energetic stockpiling of declarative knowledge, or knowing that, and his scant attention to procedural knowledge, or knowing how, is familiar to me. The prospect of reinventing myself as, say, a late-blooming skier or ceramicist or marathon runner sparks only an idle interest, something like wondering what it might be like to live in some small town you pass on the highway.

There is certainly a way to put a positive spin on that reluctance. If you love your job and find it intellectually and creatively fulfilling, you may not feel the urge to discover other rooms in the house of your mind, whatever hidden talents and lost callings may repose there. But there are less happy forces at work, too. There’s the fear of being bad at something you think is worthwhile—and, maybe even more so, of being seen to be bad at it—when you have accustomed yourself to knowing, more or less, what you’re doing. What’s the point of starting something new when you know you’ll never be much good at it? Middle age, to go by my experience—and plenty of research—brings greater emotional equanimity, an unspectacular advantage but a relief. (The lows aren’t as low, the highs not as high.) Starting all over at something would seem to put you right back into that emotional churn of exhilaration and self-doubt, but without the open-ended possibilities and renewable energy of youth. Parties mean something different, and far more exciting, when you’re younger and might meet a person who will change your life; so does learning something new—it might be fun, but it’s less likely to transform your destiny at forty or fifty.

In “Old in Art School: A Memoir of Starting Over,” Nell Painter, as distinguished a historian as they come—legions of honors, seven books, a Princeton professorship—recounts her experience earning first a B.F.A. at Rutgers and then an M.F.A. at the Rhode Island School of Design while in her sixties. As a Black woman used to feeling either uncomfortably singled out or ignored in public spaces where Black women were few, she was taken aback in art school to find that “old” was such an overwhelming signifier: “It wasn’t that I stopped being my individual self or stopped being black or stopped being female, but that old, now linked to my sex, obscured everything else beyond old lady.” Painter finds herself periodically undone by the overt discouragement of some of her teachers or the silence of her fellow-students during group crits of her work—wondering if they were “critiquing me, old-black-woman-totally-out-of-place,” or her work. Reading her book, I was full of admiration for Painter’s willingness to take herself out of a world in which her currency—scholarly accomplishment—commanded respect and put herself into a different one where that coin often went unrecognized altogether, all out of exultation in the art-making itself. But her quest also induced some anxiety in me.

Painter is no dilettante: she’s clear about not wanting to be a “Sunday Painter”; she is determined to be an Artist, and recognized as such. But “dilettante” is one of those words that deter people from taking up new pursuits as adults. Many of us are wary of being dismissed as dabblers, people who have a little too much leisure, who are a little too cute and privileged in our pastimes. This seems a narrative worth pushing back against. We might remember, as Vanderbilt points out, that the word “dilettante” comes from the Italian for “to delight.” In the eighteenth century, a group of aristocratic Englishmen popularized the term, founding the Society of the Dilettanti to undertake tours of the Continent, promote the art of knowledgeable conversation, collect art, and subsidize archeological expeditions. Frederick II of Prussia dissed the dilettanti as “lovers of the arts and sciences” who “understand them only superficially but who however are ranked in superior class to those who are totally ignorant.” (They were, of course, wealthy, with oodles of time on their hands.) The term turned more pejorative in modern times, with the rise of professions and of licensed expertise. But if you think of dilettantism as an endorsement of learning for learning’s sake—not for remuneration or career advancement but merely because it delights the mind—what’s not to love?

Maybe it could be an antidote to the self-reported perfectionism that has grown steadily more prevalent among college students in the past three decades. Thomas Curran and Andrew P. Hill, the authors of a 2019 study on perfectionism among American, British, and Canadian college students, have written that “increasingly, young people hold irrational ideals for themselves, ideals that manifest in unrealistic expectations for academic and professional achievement, how they should look, and what they should own,” and are worried that others will judge them harshly for their perceived failings. This is not, the researchers point out, good for mental health. In the U.S., we’ll be living, for the foreseeable future, in a competitive, individualistic, allegedly meritocratic society, where we can inspect and troll and post humiliating videos of one another all the live-long day. Being willing to involve yourself in something you’re mediocre at but intrinsically enjoy, to give yourself over to the imperfect pursuit of something you’d like to know how to do for no particular reason, seems like a small form of resistance.

Tom Vanderbilt was moved to start learning again during the hours he spent waiting around while his young daughter did her round of lessons and activities. Many of us have been there, “on some windowless lower level of a school huddled near an electrical outlet to keep your device alive,” as he nicely puts it—waiting, avoiding the parents who want to talk scores and rankings, trying to shoehorn a bit of work into a stranded hour or two. But not many of us are inspired to wonder, in such moments, why we ourselves aren’t in there practicing our embouchure on the trumpet or our Salchow on the ice. This may speak to my essential laziness, but I have fond memories of curling up on the child-size couch in the musty, overheated basement of our local community center reading a book for a stolen hour, while my kids took drum lessons and fencing classes. Vanderbilt, on the other hand, asks himself whether “we, in our constant chaperoning of these lessons, were imparting a subtle lesson: that learning was for the young.” Rather than molder on the sidelines, he decides to throw himself into acquiring five new skills. (That’s his term, though I started to think of these skills as “accomplishments” in the way that marriageable Jane Austen heroines have them, talents that make a long evening pass more agreeably, that can turn a person into more engaging company, for herself as much as for others.) Vanderbilt’s search is for “the naïve optimism, the hypervigilant alertness that comes with novelty and insecurity, the willingness to look foolish, and the permission to ask obvious questions—the unencumbered beginner’s mind.” And so he tries to achieve competence, not mastery, in chess, singing, surfing, drawing, and making. (He learns to make a wedding ring, to replace two he lost surfing.) He adds juggling, not because he’s so interested in it but because—with its steep and obvious learning curve (most people, starting from scratch, can learn to juggle three balls in a few days) and its fun factor—it is an oft-used task for laboratory studies of how people learn. These accomplishments aren’t likely to help his job performance as a journalist, or to be marketable in any way, except insofar as the learning of them forms the idea for the book.

Vanderbilt is good on the specific joys and embarrassments of being a late-blooming novice, or “kook,” as surfers sometimes call gauche beginners. How you think you know how to sing a song but actually know only how to sing along with one, so that, when you hear your own voice, stripped of the merciful camouflage the recorded version provides, “you’re not only hearing the song as you’ve never quite heard it, you are hearing your voice as you’ve never quite heard it.” The particular, democratic pleasure of making that voice coalesce with others’ in a choir, coupled with the way, when friends and family come to see your adult group perform, “the parental smile of eternal indulgence gives way to a more complicated expression.” The fact that feedback, especially the positive kind stressing what you’re doing right, delivered by an actual human teacher or coach watching what you do, is crucial for a beginner—which might seem obvious except that, in an age when so many instructional videos of every sort are available online, you might get lulled into thinking you could learn just as well without it. The weirdness of the phenomenon that, for many of us, our drawing skills are frozen forever as they were when we were kids. Children tend to draw better, Vanderbilt explains, when they are around five years old and rendering what they feel; later, they fall into what the psychologist Howard Gardner calls “the doldrums of literalism”—trying to draw exactly what they see but without the technical skill or instruction that would allow them to do so effectively. Many of us never progress beyond that stage. Personally, I’m stuck at about age eight, when I filled notebooks with ungainly, scampering horses. Yet I was entranced by how both Vanderbilt and, in her far more ambitious way, Painter describe drawing as an unusually absorbing, almost meditative task—one that makes you look at the world differently even when you’re not actually doing it and pours you into undistracted flow when you are.

One problem with teaching an old dog new tricks is that certain cognitive abilities decline with age, and by “age” I mean starting as early as one’s twenties. Mental-processing speed is the big one. Maybe that’s one reason that air-traffic controllers have to retire at age fifty-six, while English professors can stay at it indefinitely. Vanderbilt cites the work of Neil Charness, a psychology professor at Florida State University, who has shown that the older a chess player is, the slower she is to perceive a threatened check, no matter what her skill level. Processing speed is why I invariably lose to my daughter (pretty good-naturedly, if you ask me) at a game that I continue to play: Anomia. In this game, players flip cards bearing the names of categories (dog breeds, Olympic athletes, talk-show hosts, whatever), and, if your card displays the same small symbol as one of your opponents’ does, you try to be the first to call out something belonging to the other person’s category. If my daughter and I each had ten minutes to list as many talk-show hosts as we could, I’d probably triumph—after all, I have several decades of late-night-TV viewing over her. But, with speed of the essence, a second’s lag in my response cooks my goose every game.

Still, as Rich Karlgaard notes in his reassuring book “Late Bloomers: The Hidden Strengths of Learning and Succeeding at Your Own Pace,” there are cognitive compensations. “Our brains are constantly forming neural networks and pattern-recognition capabilities that we didn’t have in our youth when we had blazing synaptic horsepower,” he writes. Fluid intelligence, which encompasses the capacity to suss out novel challenges and think on one’s feet, favors the young. But crystallized intelligence—the ability to draw on one’s accumulated store of knowledge, expertise, and Fingerspitzengefühl—is often enriched by advancing age. And there’s more to it than that: particular cognitive skills rise and fall at different rates across the life span, as Joshua K. Hartshorne, now a professor of psychology at Boston College, and Laura T. Germine, a professor of psychiatry at Harvard Medical School, show in a 2015 paper on the subject. Processing speed peaks in the late teens, short-term memory for names at around twenty-two, short-term memory for faces at around thirty, and vocabulary at around fifty (in some studies, even at around sixty-five), while social understanding, including the ability to recognize and interpret other people’s emotions, peaks at around forty and tends to remain high. “Not only is there no age at which humans are performing at peak at all cognitive tasks,” Hartshorne and Germine conclude, “there may not be an age at which humans are at peak on most cognitive tasks.” This supports Karlgaard’s case that we need a “kinder clock for human development”—societal pressure on young adults to specialize and succeed right out of college is as wrongheaded and oppressive at one end of life as patronizing attitudes toward the old are at the other.

The gift of crystallized intelligence explains why some people can bloom spectacularly when they’re older—especially, perhaps, in a field like literature, where a rich vein of life experience can be a writerly asset. Annie Proulx published her first novel at the age of fifty-six, Raymond Chandler at fifty-one. Frank McCourt, who had been a high-school teacher in New York City for much of his career, published his first book, the Pulitzer Prize-winning memoir “Angela’s Ashes,” at sixty-six. Edith Wharton, who had been a society matron prone to neurasthenia and trapped in a gilded cage of a marriage, produced no novels until she was forty. Publishing fiction awakened her from what she described as “a kind of torpor,” a familiar feeling for the true late bloomer. “I had groped my way through to my vocation,” Wharton wrote, “and thereafter I never questioned that story-telling was my job.”

In science and technology, we often think of the people who make precocious breakthroughs as the true geniuses—Einstein developing his special theory of relativity at twenty-six. Einstein himself once said that “a person who has not made his great contribution to science before the age of thirty will never do so.” A classic paper on the relationship between age and scientific creativity showed that American Nobel winners tended to have done their prize-winning work at thirty-six in physics, thirty-nine in chemistry, and forty-one in medicine—that creativity rose in the twenties and thirties and began a gradual decline in the forties.

That picture has been complicated by more recent research. According to a 2014 working paper for the National Bureau of Economic Research, which undertook a broad review of the research on age and scientific breakthroughs, the average age at which people make significant contributions to science has been rising during the twentieth century—notably to forty-eight, for physicists. (One explanation might be that the “burden of knowledge” that people have to take on in many scientific disciplines has increased.) Meanwhile, a 2016 paper in Science that considered a wider range of scientists than Nobelists concluded that “the highest-impact work in a scientist’s career is randomly distributed within her body of work. That is, the highest-impact work can be, with the same probability, anywhere in the sequence of papers published by a scientist—it could be the first publication, could appear mid-career, or could be a scientist’s last publication.”

When it comes to more garden-variety late blooming, the kind of new competencies that Vanderbilt is seeking, he seems to have gone about it in the most promising way. For one thing, it appears that people may learn better when they are learning multiple skills at once, as Vanderbilt did. A recent study that looked at the experiences of adults over fifty-five who learned three new skills at once—for example, Spanish, drawing, and music composition—found that they not only acquired proficiency in these areas but improved their cognitive functioning over all, including working and episodic memory. In a 2017 paper, Rachel Wu, a neuroscientist at U.C. Riverside, and her co-authors, George W. Rebok and Feng Vankee Lin, propose six factors that they think are needed to sustain cognitive development, factors that tend to be less present in people’s lives as they enter young adulthood and certainly as they grow old. These include what the Stanford psychology professor Carol Dweck calls a “growth mindset,” the belief that abilities are not fixed but can improve with effort; a commitment to serious rather than “hobby learning” (in which “the learner casually picks up skills for a short period and then quits due to difficulty, disinterest, or other time commitments”); a forgiving environment that promotes what Dweck calls a “not yet” rather than a “cannot” approach; and a habit of learning multiple skills simultaneously, which may help by encouraging the application of capacities acquired in one domain to another. What these elements have in common, Wu and her co-authors point out, is that they tend to replicate how children learn.

So eager have I been all my life to leave behind the subjects I was bad at and hunker down with the ones I was good at—a balm in many ways—that, until reading these books, I’d sort of forgotten the youthful pleasure of moving our little tokens ahead on a bunch of winding pathways of aptitude, lagging behind here, surging ahead there. I’d been out of touch with that sense of life as something that might encompass multiple possibilities for skill and artistry. But now I’ve been thinking about taking up singing in a serious way again, learning some of the jazz standards my mom, a professional singer, used to croon to me at bedtime. If learning like a child sounds a little airy-fairy, whatever the neuroscience research says, try recalling what it felt like to learn how to do something new when you didn’t really care what your performance of it said about your place in the world, when you didn’t know what you didn’t know. It might feel like a whole new beginning. ♦