At the risk of stating the obvious, when college admissions counselors read through the plethora of applications, one of the central things they look for is whether they think the applicant will succeed at their university. The question sounds simple, but the answer is not easy to pinpoint: What is the best predictor of a student's success in college?
College admissions counselors have a number of data points at their disposal from an application: an applicant's SAT scores, Subject Test scores, AP scores, grades, courses, extracurricular activities, essays, letters of recommendation, and high school profile, to name a few. Demographic information, including ethnicity, the occupations of the applicant's parents, geographic region, and the parents' highest level of completed education, also figures into what college admissions counselors consider.
In Major League Baseball, as depicted in Michael Lewis' 2003 "New York Times" bestseller "Moneyball: The Art of Winning an Unfair Game," Billy Beane, the General Manager of the Oakland A's, a small-market team that didn't have the budget to compete against a division rival like the New York Yankees, decided to change the game. Instead of going after the same players that teams like the Yankees were pursuing, the A's found different predictors of success. So instead of trying to trade for an expensive player who led the American League in home runs, a Harvard computer wizard by the name of Paul DePodesta, who reported to Beane, used computer algorithms to discover predictors of success, like on-base percentage, that could ignite an Oakland A's dynasty. With this system, the Oakland A's won four division titles in ten years and, in doing so, changed baseball.
So, is there a Moneyballer like Paul DePodesta, now the VP of Player Development and Scouting for the New York Mets (hired by another Ivy Leaguer, Dartmouth's Sandy Alderson, the Mets' GM and Beane's predecessor and mentor in Oakland), working in the field of highly selective college admissions? Are there algorithms that can have an impact on The Art of Winning the Unfair Game that is college admissions? In a 2007 study, two UC Berkeley professors, Saul Geiser and Maria Veronica Santelices, found that the best predictor of a student's college grades is the student's high school grades. Shocker! Not SAT scores. Not ethnicity. Not AP scores or parental occupations. As the College Board, the administrators of the SAT, admitted about the changes made to the SAT in 2005, "[it] did not substantially change how well the test predicts first-year college performance."
Wrote Geiser and Santelices in their study, “High-school grades are often viewed as an unreliable criterion for college admissions, owing to differences in grading standards across high schools, while standardized tests are seen as methodologically rigorous, providing a more uniform and valid yardstick for assessing student ability and achievement. The present study challenges that conventional view. The study finds that high-school grade point average (HSGPA) is consistently the best predictor not only of freshman grades in college, the outcome indicator most often employed in predictive-validity studies, but of four-year college outcomes as well.
A previous study, UC and the SAT (Geiser with Studley, 2003), demonstrated that HSGPA in college-preparatory courses was the best predictor of freshman grades for a sample of almost 80,000 students admitted to the University of California. Because freshman grades provide only a short-term indicator of college performance, the present study tracked four-year college outcomes, including cumulative college grades and graduation, for the same sample in order to examine the relative contribution of high-school record and standardized tests in predicting longer-term college performance. Key findings are: (1) HSGPA is consistently the strongest predictor of four-year college outcomes for all academic disciplines, campuses and freshman cohorts in the UC sample; (2) surprisingly, the predictive weight associated with HSGPA increases after the freshman year, accounting for a greater proportion of variance in cumulative fourth-year than first-year college grades; and (3) as an admissions criterion, HSGPA has less adverse impact than standardized tests on disadvantaged and underrepresented minority students."
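To make the idea of "predictive weight" and "proportion of variance" concrete, here is a minimal sketch of the kind of comparison a predictive-validity study runs: fit a simple linear predictor and compare how much of the variance in college GPA each input explains (R-squared). The data below is entirely synthetic and hypothetical, not the UC sample; it is constructed so that high-school GPA tracks college GPA more closely than SAT score does, merely to illustrate the direction of the study's finding.

```python
import random

random.seed(0)

# Hypothetical applicants (synthetic data, NOT the Geiser & Santelices sample).
n = 1000
hsgpa = [random.uniform(2.0, 4.0) for _ in range(n)]          # high-school GPA
sat = [random.gauss(1500, 200) for _ in range(n)]             # old 2400-scale SAT

# College GPA is built to depend mostly on HSGPA, weakly on SAT, plus noise.
college_gpa = [0.8 * g + 0.0002 * s + random.gauss(0, 0.3)
               for g, s in zip(hsgpa, sat)]

def r_squared(x, y):
    """Proportion of variance in y explained by a simple linear fit on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var_x = sum((a - mx) ** 2 for a in x)
    var_y = sum((b - my) ** 2 for b in y)
    return cov * cov / (var_x * var_y)

print(f"R^2 (HSGPA -> college GPA): {r_squared(hsgpa, college_gpa):.2f}")
print(f"R^2 (SAT   -> college GPA): {r_squared(sat, college_gpa):.2f}")
```

With these made-up coefficients, HSGPA explains far more of the variance in college GPA than SAT does; a real study estimates those weights from actual student records rather than assuming them.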
This topic of using data to predict the college success of applicants to highly selective colleges is one we'll return to quite often on our blog pages, so tune in here for news, information, and insight.
And check out the study by UC Berkeley professors Saul Geiser and Maria Veronica Santelices here.