At the risk of stating the obvious, when college admissions officers read through the plethora of applications before them, one of the central things they are looking for is whether they believe the applicant will succeed at their university. The question itself is obvious; the answer is harder to pinpoint: What is the best predictor of a student's success in college?
College admissions officers have a number of data points at their disposal from an application: an applicant's SAT or ACT scores, SAT Subject Test scores, AP scores, grades, courses, extracurricular activities, essays, letters of recommendation, and the high school profile, to name but a few. Demographic information, including ethnicity, the occupations of an applicant's parents, geographic region, and the parents' highest level of completed education, also figures significantly into what admissions officers consider.
In Major League Baseball, as depicted in Michael Lewis' 2003 New York Times bestseller Moneyball: The Art of Winning an Unfair Game, Billy Beane, the General Manager of the Oakland A's, a small-market team that didn't have the budget to compete against a division rival like the New York Yankees, decided to change the game. Instead of going after the same players that teams like the Yankees were going after, the A's found different predictors of success. So instead of trying to trade for an expensive player who leads the American League in home runs, a Harvard computer wizard by the name of Paul DePodesta, who reported to Beane, used computer algorithms to discover predictors of success, like on-base percentage, that could ignite an Oakland A's dynasty. With this system, the Oakland A's won four division titles in ten years and, in doing so, forever changed baseball.
So, is there a Moneyballer like Paul DePodesta, now the VP of Player Development and Scouting for the New York Mets (hired by another Ivy Leaguer, Dartmouth's Sandy Alderson, the GM of the Mets who, as Beane's predecessor in Oakland, trained him), working in the field of highly selective college admissions? Are there algorithms that can have an impact on The Art of Winning the Unfair Game that is college admissions? In a 2007 study, two University of California, Berkeley researchers, Saul Geiser and Maria Veronica Santelices, found that the best predictor of a student's college grades is the student's high school grades. Shocker, we know. Not SAT scores. Not ethnicity. Not AP scores or an applicant's parents' occupations. As The College Board, the maker of the SAT, admitted about the changes made to the SAT in 2005, "[It] did not substantially change how well the test predicts first-year college performance."
Wrote Geiser and Santelices in their study, “High-school grades are often viewed as an unreliable criterion for college admissions, owing to differences in grading standards across high schools, while standardized tests are seen as methodologically rigorous, providing a more uniform and valid yardstick for assessing student ability and achievement. The present study challenges that conventional view. The study finds that high-school grade point average (HSGPA) is consistently the best predictor not only of freshman grades in college, the outcome indicator most often employed in predictive-validity studies, but of four-year college outcomes as well.”
They go on, “A previous study, UC and the SAT (Geiser with Studley, 2003), demonstrated that HSGPA in college-preparatory courses was the best predictor of freshman grades for a sample of almost 80,000 students admitted to the University of California. Because freshman grades provide only a short-term indicator of college performance, the present study tracked four-year college outcomes, including cumulative college grades and graduation, for the same sample in order to examine the relative contribution of high-school record and standardized tests in predicting longer-term college performance. Key findings are: (1) HSGPA is consistently the strongest predictor of four-year college outcomes for all academic disciplines, campuses and freshman cohorts in the UC sample; (2) surprisingly, the predictive weight associated with HSGPA increases after the freshman year, accounting for a greater proportion of variance in cumulative fourth-year than first-year college grades; and (3) as an admissions criterion, HSGPA has less adverse impact than standardized tests on disadvantaged and underrepresented minority students.”
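For readers curious what a predictive-validity comparison like the one Geiser and Santelices describe actually looks like, here is a minimal sketch. It uses entirely synthetic, made-up numbers (not the UC dataset) in which high school GPA is assumed to track college outcomes more tightly than test scores, and it simply compares how much of the variance in cumulative college GPA each predictor explains (the R² of an ordinary least-squares fit):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000

# Synthetic cohort (illustrative assumption, not real data): a latent
# "preparation" factor drives both predictors and the outcome, with
# HSGPA measured less noisily than the test score.
prep = rng.normal(0, 1, n)
hsgpa = 3.0 + 0.5 * prep + rng.normal(0, 0.25, n)         # high school GPA
sat = 1100 + 150 * prep + rng.normal(0, 120, n)           # test score
college_gpa = 2.8 + 0.45 * prep + rng.normal(0, 0.35, n)  # 4-year cumulative GPA

def r_squared(X, y):
    """Proportion of variance in y explained by an OLS fit on X."""
    X = np.column_stack([np.ones(len(y)), X])  # add intercept column
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid.var() / y.var()

r2_hsgpa = r_squared(hsgpa, college_gpa)
r2_sat = r_squared(sat, college_gpa)
r2_both = r_squared(np.column_stack([hsgpa, sat]), college_gpa)

print(f"R^2, HSGPA alone: {r2_hsgpa:.3f}")
print(f"R^2, SAT alone:   {r2_sat:.3f}")
print(f"R^2, both:        {r2_both:.3f}")
```

Under these made-up assumptions, HSGPA alone explains more of the variance in college GPA than the test score alone, which is the shape of the comparison the study reports; the real analysis, of course, rests on actual student records rather than simulated ones.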
This topic of using data to predict the college performance of applicants to highly selective colleges is one we'll surely be returning to on our blog in the months to come, so do stay tuned!