In the early years, a sophisticated young couple moved into our neighborhood. He was a Harvard graduate, in Alabama on a fellowship. She had spent five years in boarding school and graduated from Wellesley.
Once, in a light moment, she showed me the proper way to eat a banana: Peel. Cut into long sections. Cut again into bite-sized pieces. “So much for a liberal arts education,” she laughed.
Having a degree from a state university, I smiled, but in truth I had never heard a symphony orchestra play until I took a music appreciation course in college, nor had I ever seen a living writer of national acclaim.
For me, a liberal arts education was my first experience of delving into Greek mythology, listening to violin concertos, hearing a professor explain why reading “War and Peace” would consume every weekend and trekking to contemporary art exhibits to “broaden an appreciation of form and color.”
Years later, quoting poetry in our house set our oldest daughter’s teeth on edge. An art history major, she was having no luck finding a job. In one door-slamming incident, she laid blame. “You should have pushed me to major in accounting,” she cried, a girl who required a Coke and two brownies to balance her checkbook.
It would be a decade before we learned “conventional wisdom supports computer science and engineering majors as having better employment and higher earnings than their peers who chose liberal arts, but the long-term story is more complicated.”
In today’s world, “by age 40, those who chose the humanities, history or political science as their college focus, actually catch up to former classmates who are part of the engineering, science, technology and math communities.”
According to David Deming, Director of the Center for Social Policy at Harvard’s Kennedy School, liberal arts majors earn more at age 40 than their former classmates who signed on to work with computers.
Liberal arts majors move up the ladder into management jobs, while an ever-adapting STEM workforce must cope with a changing world of technical skills. One course offered at Stanford University, its lectures enrolling a thousand students in an academic year, was not even listed in the school’s curriculum 20 years ago.
Deming has come to believe we should value the individuality of a life’s work rather than focus on the technology of today, because change is a-coming.
Writing well, problem solving and working with a team are “whole person” skills suited to a fickle job market, Deming believes. Rather than partnering with a computer, a student and professor find more meaningful dialogue in a back-and-forth discussion of a novel, its characters and plot.
It is not a given, but often a student finds his voice in a setting where spoken opinions and emotions are encouraged. I once heard a middle-aged lawyer challenge a visiting professor who was teaching a summer course. The seminar on Southern writers included the opinion that “To Kill a Mockingbird” was “overrated.”
The successful attorney, at home in a courtroom, “morphed” into the liberal arts major of his youth and, in his 50s, rose to defend the story of a small-town lawyer who sought justice in the face of discrimination. Time had passed, but the memory of reading the book had lived on. As he spoke to the novel’s message, his voice broke with emotion.
Harvard’s David Deming is quick to explain he is not suggesting students avoid majoring in STEM studies. “STEM graduates tend to have high earnings through their careers,” he writes.
Yet in this world where we find connection through a computer or an iPhone, he reminds us that while convenience and speed of communication may be daily technological miracles, it is the artist, the violinist, the poet who make us cry.