
Disrupting Higher Education: Some Observations On What To Fix

I hear a lot that higher education is ready for disruption. But it's overly simplistic to say "higher education is broken," and the entire focus of most education startups seems to be on delivery platforms: ways to bring content to students. Online learning is, indeed, flawed. But there are many other aspects of higher education that could benefit from disruption, including academic research, the cost of education, and the basic assumption that everyone needs to attend college at all. These are some of my thoughts on the state of higher education today.

Academic Research

Academic research is strange, and I think it's difficult for people to really understand why it's necessary or how it works. What a researcher does all day is not that different from what anyone else does. If you don't know any researchers, you probably imagine them in libraries, poring over manuscripts all day, every day. Most researchers I know sit at a desk, answer emails, have meetings, strategize on whiteboards, and waste time on Facebook. The difference, though, is that their work is directed towards the pursuit of knowledge. They run projects and programs, usually with the intent of learning a lot about a little: it's common for research to move the needle only a tiny amount by building on previous ideas and theories. I think that, broadly speaking, academic research works. The incentive to enter research self-selects individuals who are motivated by the production of knowledge, because the pay isn't particularly great and the working conditions can be fairly bland. And academic research continues to produce pretty amazing results. But the systems that dictate how this research happens need a lot of change.

Peer review

When an academic researcher publishes their work, it needs to be peer reviewed. That means that other researchers are invited to read it (they typically volunteer—peer review isn't a paid activity) and determine its worth. This is a fundamental principle that's rarely challenged. But academic research on academic research (holy meta!) indicates that it might not actually work: consider, for example, the paper "Nepotism and sexism in peer-review," which found that "the system is revealed as being riddled with prejudice." More qualitative evidence shows that peer review can be extraordinarily hit or miss. And my own experience has given me firsthand evidence of just how broken it is.

When I volunteered to peer review papers for the Computer Human Interaction conference (CHI is the largest academic conference for computer scientists who work on issues of interaction design), I received no training, no instructions, and only the most basic of guidelines on what to do. The system works like this.

First, you sign up to be a reviewer. Anyone can sign up to be a reviewer. You could sign up right now.

Next, papers are assigned to you. You might think there is a vetting process to judge whether you are actually qualified to read and review the paper, but you would be wrong. I'm aware of how difficult it is to find people to actually complete reviews, because it's unpaid work. Attempts are usually made to assign papers based on some expertise in the field, but that expertise is self-declared and based on a fairly superficial quantitative scale. Like other online systems, you build up a track record of having completed reviews, but there's no track record of how good your reviews are, and no real indication of what a "good" review even means. And even if there were, it might be irrelevant, because there's a shortage of reviewers. So when the review deadline is approaching and 40% of papers are still missing even one review, it's tempting to reach out to a more generic set of reviewers.

Then, you read the paper, indicate your score, and respond to a series of questions. There are some guidelines on what a "good response" might look like (this guide, for example, which is buried in a 2005 conference site; I wonder if anyone reads it). I've chaired a few tracks at conferences and been completely astounded by the shallow quality of the reviews. I'm not alone here—read the comments, which are pretty telling.

Finally, there's a back and forth with the authors, giving them a chance to respond to any criticism, and then, based on the average of the scores, papers are accepted or rejected. Consider that acceptance means the community thinks a paper exemplifies good academic research, and that it also means points for tenure.
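
To make the mechanics concrete, here's a minimal sketch of that decision step. The 1-5 scale mirrors CHI's review scores, but the acceptance threshold (and everything else here) is my illustrative assumption, not actual conference policy.

```python
from statistics import mean

# A minimal model of the decision step described above. The 1-5 score
# scale matches CHI's; the 3.0 acceptance cutoff is my illustrative
# assumption, not actual conference policy.
ACCEPT_THRESHOLD = 3.0

def decide(scores: list[float]) -> str:
    """Accept or reject a paper based on the average of its review scores.

    Note everything this ignores: reviewer expertise, review quality,
    and whether the reviewers calibrated their scores the same way.
    """
    return "accept" if mean(scores) >= ACCEPT_THRESHOLD else "reject"

# Three reviewers, three uncalibrated numbers:
print(decide([4.5, 2.0, 3.5]))  # accept (mean 3.33)
print(decide([4.5, 1.0, 3.0]))  # reject (mean 2.83)
```

Notice that a single uncalibrated reviewer swings the outcome: the only difference between the two papers above is one reviewer's score.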

And so, three reviewers who may or may not be qualified to read a paper, may or may not understand it, may or may not write a thorough review, and may or may not calibrate their numeric scores in the same way decide on the publishability of academic research. Which is why, when you attend a conference like CHI, you see truly great things, truly awful things, and pretty much everything in between. There is a design opportunity to create a better system for vetting academic research.

Walled Garden Output

Even if the review process worked, the output is almost always hidden away in a proprietary, locked system like the ACM Digital Library or JSTOR, ensuring that no one outside of academia will read it. There's been a pretty strong backlash against this recently. For example, as of this writing, over 11,000 researchers have said they won't publish in journals organized by Elsevier—one of the largest academic publishers. Harvard recently sent a memo to its faculty, urging them to publish only in open journals. Yet the lobbying power of organizations like Elsevier may prove to be as strong as that of the big content producers, and we're starting to see more SOPA-like bills reach Congress with the overt aim of limiting public access to publicly funded research. There exists a design opportunity to create a better incentive structure for publishing academic research in a public manner.

Funding

Academic research requires funding. But why—what does the money actually fund? While some of it goes to equipment, the majority covers salaries. And this is where the system gets strange. Let's say you got a grant for $250,000 from an organization like the NSF. Before you do anything, the university takes a cut off the top—in some cases, as much as 20%. You might use the rest of the money to fund some PhD students, who can act as research assistants. At CMU, a PhD student costs a research lab about $100,000 a year. But that doesn't mean the student gets all of that money. Instead, it covers their tuition (which goes back to the university) and provides a small stipend of a few thousand dollars a month directly to the student. You'll need to cover a portion of your own salary, too.

So when you play the numbers out, from a quarter-million-dollar grant:

  • The university gets close to $100,000, in the form of the off-the-top tax, the PhD student's tuition for two semesters, and a portion of your salary.
  • You can fund about 1.5 PhD students, for one year.

And if you play these numbers out further, funding a doctoral student through a four-year commitment will cost you $400,000 in research funds, and to pay for this, you'll need to bring in over half a million dollars.
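
Here's that back-of-the-envelope math as a sketch. The 20% overhead and $100,000-per-student-year figures come from above; the $50,000 salary share is my rough assumption to make the split explicit.

```python
# Back-of-the-envelope grant math, using the figures cited above.
GRANT = 250_000
OVERHEAD_RATE = 0.20              # the university's off-the-top cut
COST_PER_STUDENT_YEAR = 100_000   # tuition (back to the university) + stipend
PI_SALARY_SHARE = 50_000          # your own salary portion; my rough assumption

overhead = GRANT * OVERHEAD_RATE                        # $50,000 to the university
research_funds = GRANT - overhead - PI_SALARY_SHARE     # $150,000 left over
student_years = research_funds / COST_PER_STUDENT_YEAR  # ~1.5 student-years

print(f"University overhead:      ${overhead:,.0f}")
print(f"Left for research:        ${research_funds:,.0f}")
print(f"PhD student-years funded: {student_years:.1f}")

# Funding one student through a four-year commitment:
four_year_cost = 4 * COST_PER_STUDENT_YEAR           # $400,000 in research funds
grant_needed = four_year_cost / (1 - OVERHEAD_RATE)  # $500,000 before overhead...
print(f"Grant needed for one four-year student: ${grant_needed:,.0f}+")
# ...so comfortably over half a million once your salary share is added.
```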

It seems like a bit of a racket. There exists a design opportunity to fund academic research in a more practical and cost-effective manner.

Online Learning

We've heard over and over again that online learning is going to completely overhaul our educational system. The amount of money that has been, and continues to be, invested in online course delivery is amazing. Yet I'm extremely skeptical of the efficacy of online learning for advanced education in fields that are creative and collaborative. My problem with the focus on the platform of delivery ("online", "blended") is that it ignores the quality of the educational experience. As John Dewey described, "The belief that a genuine education comes about through experience does not mean that all experiences are genuinely or equally educative." Simply providing people with "an educational experience" makes no guarantee that they will, in fact, learn anything.

In my experience working with online course delivery, there are some major problems, and nearly all have to do with experiential qualities: the environment, context, emotion, and human-to-human interaction that shape the experiences learners have.

Structured educational software encourages rote learning without controversy. Perhaps the most fundamental quality of an educational experience is dialogue: discourse and debate, the challenging of norms, active experimentation, public failure, and the serendipitous interplay of human interactions. You can learn by passively watching a video, but the learning is shallow because the experience is shallow. You are left on your own to form the connections between the material you watch and your existing knowledge. Some can make these connections; I fear that most can't or won't. Forums and message boards are used in an attempt to increase collaboration and communication. This is naively optimistic, and most students I know who have experienced it say it doesn't work. I've seen classes where students must post "one forum post per week," as if forcing someone to have a question will actually ignite real curiosity about the subject.

I do think online learning works tremendously well for extremely motivated and self-directed students (who are rare, but who do exist), and for extremely objective, fact-based learning. There's no reason the majority of freshmen who have to take an introduction to calculus class can't try it online before engaging with a human directly. But before we rally around fact-based learning, we might question why so many freshmen need to take an introduction to calculus class in the first place.

There exists an opportunity to design a better delivery mechanism, to judge the quality of educational experiences, and to encourage experiences that are grounded in the science of learning.

Costs

Going to a four-year university is absurdly expensive. The cost of attending a university has grown at a disproportionate rate compared to any accepted benchmark. From 1985 to 2011, overall inflation was 115%, while the cost of higher education grew 500%. A $10,000 college education in 1985 should cost $21,500 today; instead, it costs close to $60,000. Why the dramatic difference?
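
The arithmetic behind those figures is simple growth compounding; here's a quick sketch, using only the numbers cited above:

```python
# A $10,000 education in 1985, grown at the two rates cited above.
BASE_1985 = 10_000
INFLATION_GROWTH = 1.15  # overall prices grew 115% from 1985 to 2011
TUITION_GROWTH = 5.00    # higher education costs grew 500% over the same span

inflation_adjusted = BASE_1985 * (1 + INFLATION_GROWTH)  # $21,500
actual_cost = BASE_1985 * (1 + TUITION_GROWTH)           # $60,000

print(f"What inflation alone predicts: ${inflation_adjusted:,.0f}")
print(f"What tuition actually costs:   ${actual_cost:,.0f}")
print(f"Unexplained premium:           ${actual_cost - inflation_adjusted:,.0f}")
```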

Some argue that programs like the Stafford Loan Program and Pell Grants are actually the main cause of the increased tuition. This article in the Atlantic describes how this happens:

In the past, college degrees conferred higher incomes on those who earned them. But almost all of that surplus went to the student rather than the college, because aside from a small number of extremely affluent families, the students were young and did not have that much cash. If colleges wanted to expand their market, college tuition was constrained to what an average student, or their family, could pay. Introducing subsidized loans into the picture allowed students to monetize that future income now. It's hardly surprising that colleges began to claim more and more of the surplus created by their college degree.

And if this weren't troubling enough, the Stafford Loan interest rate is set to double imminently, which will likely trigger more defaults and late payments.

The increased money you pay frequently doesn't improve the quality of your education. According to this article in the LA Times, increased tuition funds athletic teams, administration, and presidential pay. Paula Wallace, the president of the non-profit Savannah College of Art and Design, paid herself a $2 million compensation package, while a four-year degree in furniture design will run a student $127,620 in 2012.

There exists an opportunity to design a cheaper educational offering.

The Major Assumption: Everyone Needs to Go To College

Ultimately, the biggest disruption that I see and hope for is the disruption of the social belief that a high school senior's next step is a four-year college, and that if they don't go, they won't have a good life. The assumptions baked into this statement include:

  • Everyone will learn something at any college that offers four-year degrees
  • A high school senior is emotionally ready and personally interested in spending four more years learning
  • A good life requires a high paying job
  • A high paying job can only be secured with a college degree

We can challenge all of these assumptions logically, and we should challenge them all. I'll offer an anecdote rather than an argument. When I taught Industrial Design at the aforementioned Savannah College of Art and Design, I constantly had freshmen show up in introductory classes who weren't interested or engaged and clearly weren't trying. I remember having a conversation with one of them—call him Tony—after he turned in a project that he clearly didn't spend much time on. He admitted to throwing the work together at the last minute, and so rather than discuss the project, we elevated the discussion to the major. "Why are you pursuing a degree in industrial design?" I asked him. "I just want to work on motorcycles. This seemed like the closest major to working on bikes." I've heard this time and time again, and so I elevated the discussion once more. "And why are you pursuing a degree at all, if you are so passionate about working on bikes?" Tony replied: "Because my dad made me go to college, and art school was the only place I could get in."

Like everything else in popular culture, there is a set of norms that we're taught to follow, truths that are constantly reified. If you work hard, you can accomplish anything. Invest your money, because over time, the stock market always goes up. Trust your employer to reward you for your loyalty. And go to college, because college opens doors. College does open doors; it certainly has for me. And Tony's dad probably thought an awful lot about his hundred-thousand-dollar investment in Tony's future. And Tony, for all of his apathy, might turn it around, if he just sticks with it. I've certainly seen students do a 180 after a broken freshman and sophomore year, and I've even gone on to hire some of those students who were grasping at straws early in their academic careers.

But the likelihood that he'll stick with it is slim: "40-50 percent of those who matriculate in colleges and universities do not obtain a degree within six years of entering college." There exists a design opportunity to change the perception that college is the only appropriate next step after high school. Sometimes, it's one of many appropriate next steps. Often, it's an inappropriate one.

Summary

As the various "disruption" attempts in higher education play out, one thing that's certain is that there will be more choices and more "appropriate norms" for post-high-school education. We're already seeing more examples of, and acceptance of, community colleges, trade schools, apprenticeships, self-curated learning, and hybrid learning; and these are all still fairly predictable! I hope that startups intent on "blowing up learning" look beyond new delivery mechanisms, although those are certainly in need of overhaul. In my opinion, online delivery platforms are the least critical part of the entire system. There are so many other areas that can be fixed, with massive social and financial return.

Originally posted on Mon, 07 May 2012
