Higher education statistics that say less than they seem to
They feel meaningful, but what are they really saying?
I recently read an article from The Wall Street Journal called “Why It Pays to Be a Double Major in College.”
It cites a study of student majors that found graduates with double majors were less likely to face layoffs or other career disruptions than those who graduated with a single major. The article's contention is that people with double majors are more broad-minded, more flexible, and more valuable to their employers than those with a single focus and skill set.
I wonder about that. The study begins at the ending, once the degree is complete. Its scope is limited such that it cannot ask why those people got double majors. The assumption is that they chose to have a broader academic experience and that their academic experience is the factor that explains their employment trajectory.
But I have some qualms with that.
For instance, more well-resourced schools are more likely to provide students access to Advanced Placement and dual credit coursework. This can help students complete college requirements before setting foot on campus, giving them more space in their schedule to complete a second major.
Additionally, students who have more generous tuition support are less likely to graduate early. They will stay enrolled for four years, whether or not they’ve met the minimum graduation requirements already. Instead, they are more likely to use that “extra” time to get a second major. Why not?
If, then, we treat a double major as a signal of a wealthier high school or a more robust support structure during college, it can also signal which kinds of careers these graduates end up in. Do they face fewer disruptions because they are more connected through family ties? Do they avoid long stretches of unemployment because they come from backgrounds where support is given, rather than something they must find on their own?
I don’t know how much of this is a real factor, but it’s the kind of inquiry pattern that doesn’t show up in higher education reporting very often. I don’t blame the Journal because they’re likely no more capable of answering the questions I raised than I am. But taking the study’s conclusions at face value instead of raising any concerns might be misleading.
While I’m here, I want to raise two other common statistics that get misconstrued in higher education.
Graduation Rate
Higher is better, right?
Well, it’s certainly not worse, but when prospective students get worried about a graduation rate, it’s often based on a false premise.
Students and families may see a statistic like “X University has a 70% graduation rate.” They’ll get concerned and think they have a 3-in-10 chance of not completing the degree.
The issue is you are not rolling fair dice. The dice are weighted.
This statistic looks at the cohort of new, first-year students entering a college and asks how many of them completed a degree at that institution within 6 years (for 4-year colleges).
A low rate can be the product of poor academic programming, overcrowding, or a lack of student support. This is particularly likely at for-profit colleges, which tend to minimize support services and maximize tuition to generate revenue.
But, generally, graduation rates are a product of the student body’s composition. If universities attract high-achieving high school students with significant financial support, those students are very likely to complete their degrees.
But if a college enrolls marginalized students, that is not the case. Students who had a hard time academically in high school can enroll in college, but they’re likely to continue to have a harder time in college than the valedictorians will. Students whose families can’t support them, who have to take on more debt, or who have more family obligations outside of school are less likely to graduate on time. They are more likely to transfer programs or take more than six years to complete a degree.
This doesn’t necessarily say anything about the support that the institution gives its students. But some students have problems that are too big for campus staff to solve.
Additionally, the college itself may be the cause without doing anything wrong. For example, St. John's College (another alma mater of mine) has a relatively low graduation rate. This is mostly because its classical curriculum sharply limits student choice. If students find that they're not a good fit, there is no alternative for them within the institution. They cannot change majors because the school has no majors, so they are forced to drop out or transfer. Does this mean St. John's is doing a bad job? Or just that the institution prioritizes its curricular identity over retention?
Graduation rates don’t really say much about any one student’s chance to graduate; they mostly just tell you about who the college enrolls.
Acceptance Rate
Lower is better, right?
Acceptance rates are, on the surface, simple. They tell you how many applicants were offered a seat at the institution compared to the total number of students who applied.
When we see a low acceptance rate, it tells us that a school has many more people who want to attend than it can admit. We tend to trust the wisdom of the masses: if so many people want to go to a school, the school must be good, right?
And we also trust the inverse. If most of the people who apply to a school are let in, then that means the school is bad, right?
Well, maybe. But there are several ways institutions get low acceptance rates. Not all of them are meaningful.
Is Harvard the best college in America? Or is it just the most well known? Is it just the oldest? Harvard, and other schools like it, have a name brand that is associated with prestige. The content and quality of their undergraduate education is not of primary importance. Their acceptance rate is annually among the lowest in America because their name is prestigious enough to draw thousands of applicants hoping to “win the lottery” of admissions, even if the school isn’t a good fit for them personally.
Another way institutions can manipulate this is by making it easier to apply. Institutions that have more complex applications, require more writing, or don't make their applications available on Common App are likely to get fewer applications.
Georgetown famously refuses to use Common App because students applying through that centralized service are less likely to take any one application seriously. They are willing to have fewer applicants if those applicants are more engaged with their specific program. This raises their acceptance rate, but does it tell you anything about the quality of the undergraduate program?
Lastly, institutions can decide for themselves what qualifies as an “application.” For example, the federal military academies likely have the most complex applications out there. They involve physical tests, medical tests, and even Congressional nominations.
Academies like West Point tend to count any initiated application as a submission. With around 12,000 initiated applications, that yields roughly a 10% acceptance rate. But only about 2,200 applicants actually complete the application with all its components, which means the institution has roughly a 50% acceptance rate among completed applications.
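The denominator game here can be sketched in a few lines. The admit count below is a hypothetical figure (the article gives only the rates), chosen to be roughly consistent with both quoted percentages:

```python
# Approximate figures from the West Point example above; the admit count
# is a hypothetical value, picked only to roughly match both quoted rates.
initiated = 12_000   # every application that was started
completed = 2_200    # applications finished with all components
admitted = 1_150     # hypothetical number of offers extended

# Same numerator, two very different headline numbers:
rate_vs_initiated = admitted / initiated   # the denominator the academy reports
rate_vs_completed = admitted / completed   # the denominator a skeptic might prefer

print(f"Acceptance rate vs. initiated apps: {rate_vs_initiated:.0%}")
print(f"Acceptance rate vs. completed apps: {rate_vs_completed:.0%}")
```

Nothing about the applicant pool changes between the two lines; only the definition of "application" does.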
Which number do you think makes the institution seem more prestigious? Which one do you think shows up in their public-facing information?
Institutions can inflate the number of applicants in several ways. Moreover, they can keep their institution artificially small to retain the perception that it is "exclusive." Nothing is stopping Ivy League institutions from building out their campuses and accepting more undergraduates. But the acceptance rate game turns those institutions into luxury commodities. Keeping students out may not mean much for the educational experience, but it says a lot about how they are perceived by prospective students.
Having a low acceptance rate doesn’t make you a good school, but it sure makes people feel like it does.
Oh gosh, and there are so many more…
This is the same process I try to go through when thinking about average starting salary, admissions yield rates, median test scores, and many similar numbers.
The basic issue I wanted to reflect on here was how easy it is to take a single statistic and spin a narrative about an institution based on that. This is especially important since those same institutions know which statistics people pay attention to and do all in their power to manipulate the numbers in their favor.
When we ignore the context of a statistic or take it at face value, we end up drastically oversimplifying the exceptionally complex organizations that colleges are. We all need to simplify the world in order to process it. But if we can ask a few extra questions about where a number came from and what assumptions it rests on, we get a more well-rounded view of the thing we want to learn.
-Matt