One thing the annual Kiplinger Best College Values report tells us with regularity is that UNC Chapel Hill, Florida, and Virginia are wonderful values for both in-state and out-of-state (OOS) students. The three schools rank 1, 2, and 3 in both categories for 2018 and are no strangers to lofty value rankings.
Rounding out the top 10 for in-state value are Michigan, UC Berkeley, UCLA, Washington, UT Austin, NC State, and Maryland.
The top 10 for OOS students are the aforementioned UNC Chapel Hill, Florida, and Virginia, followed by Florida State, UC Berkeley, Binghamton, NC State, Truman State, William and Mary, and Minnesota.
Below is a list of the top 25 best value public universities for in-state students:
UNC Chapel Hill
Florida
Virginia
Michigan
UC Berkeley
UCLA
Washington
UT Austin
NC State
Maryland
William and Mary
Georgia
UW Madison
Florida State
Purdue
New College Florida
Georgia Tech
Binghamton
Truman State
UC San Diego
New Mexico Inst Mining and Tech
UC Santa Barbara
Minnesota
Texas A&M
Ohio State
Aside from the nuts-and-bolts metrics used by U.S. News in its annual college rankings (graduation and retention rates, class sizes), the other ranking categories receive strong criticism from education writers and the academic community. A category since 2009, the high school counselor rankings of colleges’ reputations fly a bit under the radar, but they do appear to have a curious impact on the rankings.
A recent, excellent article about the rankings on the website Politico argues that the counselor rankings rely heavily on “guidance counselors from highly ranked high schools, while many high schools in less affluent areas have few or no counselors.”
According to the Washington Post, the rankings do include “surveys of 2,200 counselors at public high schools, each of which was a gold, silver or bronze medal winner in the 2016 edition of the U.S. News Best High Schools rankings.” U.S. News also surveys “the largest private independent schools nationwide.”
This already elite group of respondents is even more restrictive than it seems: “The counselors’ one-year response rate was 7 percent for the spring 2017 surveys,” according to U.S. News.
Using the nuts-and-bolts categories and reputation rankings alone, as in this recent post, and separating out the peer reputation rankings from the high school counselor rankings, we can see the impact the counselor rankings have.
Using a sample of 60 national universities that are either in the top 50 nationally or have at least 7 nationally rated academic departments, we found that the high school counselor rankings of private colleges were about 11% higher than the peer rankings of the same colleges. (Twenty-five of the schools are public, while 35 are private.)
The fact is, high school counselor rankings on the whole run higher than those of peer reviewers. But counselor rankings of public colleges were only 6.5% higher than peer rankings.
The main question at hand is, do these (few) counselors have more useful knowledge about national universities than peer reviewers have? Peer reviewers have a response rate of more than 40%; this much higher response rate (in absolute percentages and, almost certainly, demographically) should yield a more accurate assessment from peers. (Even more accurate would be the academic departmental rankings, but those are not included.)
Related questions are, how much marketing information do counselors receive, and do they receive a disproportionate share from private colleges? Do they tour private colleges more frequently? Peer reviewers are not without biases, either, but they are not recipients of marketing information from other colleges. Finally, do counselors rely more on…U.S. News rankings?
Again using the same data set cited above, a side-by-side comparison of peer and counselor assessments reveals the following:
–Of the 14 universities that rose in rankings at least two places, three were public universities (21.4%) while 11 (78.6%) were private universities. (The percentage of universities in the sample is 41.7% public and 58.3% private.)
–Of the 17 universities that fell in rankings at least two places, 14 (82.4%) were public while three (17.6%) were private.
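For readers who want to replicate this kind of tally, the computation is straightforward. Below is a minimal sketch in Python; the school names, sectors, and ranks are placeholders, not our actual data set.

```python
# Minimal sketch of the movement tally above. For each school we compare
# its peer-based position with its counselor-based position and count
# schools that rose or fell at least two places, broken out by sector.
# The schools and ranks here are illustrative placeholders.

sample = [
    # (school, sector, peer_rank, counselor_rank)
    ("School A", "public", 12, 15),
    ("School B", "private", 20, 16),
    ("School C", "public", 7, 8),
    ("School D", "private", 30, 24),
]

# A school "rises" when its counselor-based rank is numerically lower
# (i.e., better) than its peer-based rank by at least two places.
risers = [s for s in sample if s[2] - s[3] >= 2]
fallers = [s for s in sample if s[3] - s[2] >= 2]

for label, group in (("rose 2+ places", risers), ("fell 2+ places", fallers)):
    if group:
        publics = sum(1 for s in group if s[1] == "public")
        print(f"{label}: {len(group)} schools, "
              f"{publics / len(group):.1%} public, "
              f"{(len(group) - publics) / len(group):.1%} private")
```

Run over the actual 60-school data set, this tally produces the 14 risers and 17 fallers described above.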
Below is a table showing the side-by-side comparison. Please bear in mind that the rankings are our adjusted rankings, not the actual U.S. News rankings.
The critics of the annual–and hugely popular–U.S. News Best Colleges rankings are vocal, large in number, well-armed with data, and mostly unavailing. Here is another attempt, based on the idea that the “financial” metrics used in the rankings distort the results. If Harvard has a zillion dollars, Harvard will have smaller classes than Mammoth State University with its meager funding per student. But why give Harvard credit for the zillion dollars and the smaller classes, when the smaller classes are the “output” that really matters?
So…the adjusted rankings below use the major non-financial metrics only: Peer assessment of academic reputation; high school counselor recommendations; graduation rates; retention rates; and class sizes. No acceptance rates or test score-related metrics are used. The impact of both is reflected in the output metric of graduation rates. (A separate post will discuss the curious disparities in high school counselor recommendations.)
Each of the universities on the list is in the top 50 in the 2018 U.S. News rankings with at least 7 ranked departments or has an aggregate academic department ranking of 50 or better across a minimum of 7 departments. The departments ranked are business and engineering (undergrad); biology, chemistry, computer science, earth sciences, economics, education, English, history, math, physics, political science, psychology, and sociology (graduate level).
Therefore, even though department ranking data are not included in the adjusted rankings below, they are used as part of the eligibility requirements for inclusion.
Below are the adjusted rankings of 60 national universities, in the order of the adjusted ranking. Also shown are the U.S. News rankings for 2018 and the difference between the adjusted rankings and those of the magazine. We used data from U.S. News for the categories listed above, with the same weight assigned to each category. All categories were then standardized and aggregated, as sketched in the code below. After the first fifteen or so schools, some of the disparities are striking, especially in the last half.
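For the statistically curious, equal-weight standardization and aggregation amounts to converting each category to z-scores and summing them. Here is a minimal sketch with made-up numbers, three schools, and only three of the five categories; all names and values are illustrative placeholders.

```python
# Minimal sketch of the "standardize and aggregate" step described above.
# Each category is converted to a z-score across the schools, the z-scores
# are summed with equal weights, and schools are ranked by the total.
from statistics import mean, pstdev

# category -> value per school (illustrative placeholders)
data = {
    "School A": {"grad_rate": 94, "retention": 97, "peer_score": 4.5},
    "School B": {"grad_rate": 85, "retention": 93, "peer_score": 3.9},
    "School C": {"grad_rate": 78, "retention": 90, "peer_score": 3.5},
}
categories = ["grad_rate", "retention", "peer_score"]

def zscores(values):
    m, s = mean(values), pstdev(values)
    return [(v - m) / s for v in values]

schools = list(data)
totals = {name: 0.0 for name in schools}
for cat in categories:
    for name, z in zip(schools, zscores([data[n][cat] for n in schools])):
        totals[name] += z  # equal weight per category

# Rank by aggregate standardized score, highest first
for rank, (name, score) in enumerate(
        sorted(totals.items(), key=lambda kv: kv[1], reverse=True), 1):
    print(rank, name, round(score, 2))
```

The actual adjusted rankings apply this same procedure to all five categories across all 60 universities.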
Especially notable in the list below are the changes in major public universities.
Included here are institutions that were, at some point, ranked in the top 50 in those two categories. Some values are blank because in those years the magazine did not give individual rankings to every institution, instead listing them in large groups described as “quartiles” or “tiers.” The rankings shown for 1983 and 1985 are the ones that U.S. News published in its magazine in those same years. For all subsequent years, the rankings come from U.S. News’s separate annual publication “America’s Best Colleges”, which dates its rankings by the upcoming year.
The new rankings from Money are out, and public colleges and universities account for 27 of the top 50 best values in 2017. These rankings are likely the best college rankings overall, given their balanced approach.
As Jeffrey J. Selingo writes in the Washington Post, the earnings portion of the rankings is based in part on some very interesting new evidence: the “Chetty data.”
“That refers to Raj Chetty,” Selingo tells us, “a Stanford professor, who has led a team of economists that has received access to millions of anonymous tax records that span generations. The group has published several headline-grabbing studies recently based on the data. In findings published in January, the group tracked students from nearly every college in the country and measured their earnings more than a decade after they left campus, whether they graduated or not.”
Money does a better job of ranking colleges based on “outcomes” than Forbes does (see Outcomes farther down). This is especially the case with the multiple earnings analyses.
To see the list of top publics, please skip the methodology discussion immediately below.
The 2017 rankings include 27 factors in three categories:
Quality of education (1/3 weighting), which was calculated using:
Six-year graduation rate (30%).
Value-added graduation rate (30%). “This is the difference between a school’s actual graduation rate and its expected rate, based on the economic and academic profile of the student body (measured by the percentage of attendees receiving Pell grants, which are given to low-income students, and the average standardized test scores of incoming freshmen).” [Emphasis added.] (A rough sketch of this kind of value-added adjustment appears after the methodology list below.)
“Peer quality (10%). This is measured by the standardized test scores of entering freshman (5%), and the percentage of accepted students who enroll in that college, known as the “yield” rate (5%).” Note: using the yield rate is an improvement over the U.S. News rankings.
“Instructor quality (10%). This is measured by the student-to-faculty ratio.” Note: this is very similar to a U.S. News metric.
“Financial troubles (20%). This is a new factor added in 2017, as financial difficulties can affect the quality of education, and a growing number of schools are facing funding challenges.” Note: although this is not an “outcome” either, it is more meaningful than using data on alumni contributions, etc.
Affordability (1/3 weighting), which was calculated using:
“Net price of a degree (30%). This is the estimated amount a typical freshman starting in 2017 will pay to earn a degree, taking into account the college’s sticker price; how much the school awards in grants and scholarships; and the average time it takes students to graduate from the school, all as reported to the U.S. Department of Education…. This takes into account both the estimated average student debt upon graduation (15%) and average amount borrowed through the parent federal PLUS loan programs (5%).”
“Student loan repayment and default risk (15%).”
“Value-added student loan repayment measures (15%). These are the school’s performance on the student loan repayment and default measures after adjusting for the economic and academic profile of the student body.”
Affordability for low-income students (20%). This is based on federally collected data on the net price that students from families earning $0 to $30,000 pay.
Outcomes (1/3 weighting), which was calculated using:
“Graduates’ earnings (12.5%), as reported by alumni to PayScale.com; early career earnings within five years of graduation (7.5%), and mid-career earnings (5%), which are for those whose education stopped at a Bachelor’s degree and who graduated, typically, about 15 years ago.”
“Earnings adjusted by majors (15%). To see whether students at a particular school earn more or less than would be expected given the subjects students choose to study, we adjusted PayScale.com’s data for the mix of majors at each school; for early career earnings (10%) and mid-career earnings (5%).”
“College Scorecard 10-year earnings (10%). The earnings of federal financial aid recipients at each college as reported to the IRS 10 years after the student started at the college.”
“Estimated market value of alumni’s average job skills (10%). Based on a Brookings Institution methodology, we matched up data provided by LinkedIn of the top 25 skills reported by each school’s alumni with Burning Glass Technologies data on the market value of each listed skill.”
“Value-added earnings (12.5%). To see if a school is helping launch students to better-paying jobs than competitors that take in students with similar academic and economic backgrounds, we adjusted PayScale.com’s earnings data for the student body’s average test scores and the percentage of low-income students at each school; for early career earnings (7.5%) and mid-career earnings (5%).”
Job meaning (5%). We used the average score of each school’s alumni on PayScale.com’s survey question of “Does your work make the world a better place?”
“Socio-economic mobility index (20%).
For the first time, we included new data provided by the Equality of Opportunity Project that reveals the percentage of students each school moves from low-income backgrounds to upper-middle class jobs by the time the student is 34 years old. Finally, we used statistical techniques to turn all the data points into a single score and ranked the schools based on those scores.” [Emphasis added.]
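Money does not publish the model behind its “value-added” factors, but the descriptions above follow a familiar pattern: predict the outcome from the student body’s profile, then credit or debit each school by the gap between its actual and predicted outcome. Here is a minimal one-variable sketch; the schools, test percentiles, and graduation rates are placeholders, and the single-predictor least-squares fit is our simplification, not Money’s actual model.

```python
# Rough sketch of a "value-added" factor: regress the outcome (graduation
# rate) on the student body's profile (here, a single test-score variable),
# then score each school on the gap between actual and predicted outcomes.
from statistics import mean

# (school, avg_test_percentile, grad_rate) -- placeholder values
schools = [
    ("School A", 95, 92),
    ("School B", 80, 70),
    ("School C", 60, 62),
    ("School D", 50, 45),
]

xs = [s[1] for s in schools]
ys = [s[2] for s in schools]
xbar, ybar = mean(xs), mean(ys)
slope = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
        sum((x - xbar) ** 2 for x in xs)
intercept = ybar - slope * xbar

# Positive value-added: the school graduates more students than its
# student-body profile alone would predict.
for name, x, y in schools:
    predicted = intercept + slope * x
    print(f"{name}: value-added = {y - predicted:+.1f} points")
```

In the real rankings, the prediction would presumably use multiple inputs (Pell share as well as test scores), with the resulting residuals standardized before weighting.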
The inclusion of these metrics makes the Money rankings a hybrid of the Washington Monthly “public good” rankings, U.S. News, and Kiplinger rankings, with the socio-economic factors having a less significant impact on overall standing than they do in the Washington Monthly rankings. Still, these factors do result in two CUNY campuses’ receiving high rankings.
“The data showed, for example,” Selingo writes, “that the City University of New York propelled almost six times as many low-income students into the middle class and beyond as all eight Ivy League campuses, plus Duke, M.I.T., Stanford and Chicago, combined. The California State University system advanced three times as many.”
TOP PUBLIC UNIVERSITIES, MONEY MAGAZINE, 2017, BY NAME AND OVERALL RANK INCLUDING PRIVATE INSTITUTIONS:
CUNY Baruch College–2
Michigan–3
UC Berkeley–4
UCLA–5
UC Irvine–7
UC Davis–9
Virginia–11
Washington–13
Georgia Tech–16
Florida–18
Maryland–20
Illinois–22
Virginia Tech–23
College of New Jersey–24
UC Riverside–29
Michigan State–30
UT Austin–31
Binghamton–33
Texas A&M–34
UC Santa Barbara–36
Connecticut–37
Purdue–37 (tie)
VMI–41
Cal State Long Beach–42
CUNY Brooklyn–43
UW Madison–45
James Madison–46
Rutgers, New Brunswick–49
NC State–50
The Kiplinger Best Value College Index methodology emphasizes a “quality” side in relation to the “cost” side of a university. The quality side includes selectivity, retention, and four-year grad rates, while the cost side takes tuition, fees, merit aid, need-based aid, and post-graduation debt into account.
For the 16th straight year, UNC Chapel Hill leads as the best public value for both in-state and out-of-state (OOS) applicants.
The Southeast and Mid-Atlantic account for 10 of the top 25 best public value schools. West Coast universities in the UC system along with the University of Washington account for another half dozen in the top 25.
In the middle, so to speak, are traditionally strong publics including Michigan, UW Madison, Illinois, UT Austin, Minnesota, and Ohio State.
Acceptance rates vary widely among the top value schools, from a low of 15 and 17 percent at UC Berkeley and UCLA respectively, to a high of 66 percent at Illinois.
Other publics with relatively low acceptance rates include Michigan (26 percent); Cal Poly (31 percent); Georgia Tech (32 percent); UC Santa Barbara (33 percent); UC San Diego (34 percent); and UC Irvine and UT Austin (39 percent).
Below are the top 25 in-state public values, with the OOS ranking and Acceptance Rate listed as well.
In our latest book of honors program ratings, we listed the honors programs and colleges that received an overall five “mortarboard” rating. One component of the rating model used to determine the leading programs is prestigious scholarships–the number of Rhodes, Marshall, Truman, Goldwater, etc., awards earned by students from each university as a whole.
Honors programs at these universities usually contribute most of the winners of these awards, but not always. So while the prestigious scholarship component is worth including, we do not want it to override the 12 other rating components used in the ratings. These components are “honors only” because, unlike the scholarship component (which counts awards earned by non-honors students as well), they do not include data from the university as a whole.
Therefore, we decided to do a separate rating, one that is not included in the new book, INSIDE HONORS. The new rating uses only the 12 components listed below. Farther down, you can see whether the prestigious scholarship component had a major impact on the overall ratings of top programs.
Those 12 additional components are…
Curriculum Requirements
Number of Honors Classes
Number of Honors Classes in 15 Key Disciplines
Extent of Honors Enrollment
Average Class Size, Honors-only Sections
Overall Average Class Size, All Sections
Honors Graduation Rate-Raw
Honors Graduation Rate-Adjusted for Test Scores
Student to Staff Ratio
Type and Extent of Priority Registration
Honors Residence Halls, Amenities
Honors Residence Halls, Availability
Below is a comparison of the honors programs that received a five mortarboard OVERALL RATING (first list) and those that received the same rating for HONORS COMPONENTS ONLY (second list), each listed ALPHABETICALLY.
OVERALL FIVE MORTARBOARDS
Arizona St
Clemson
CUNY Macaulay
Georgia
Houston
Kansas
New Jersey Inst Tech
Oregon
Penn St
South Carolina
UT Austin
HONORS ONLY COMPONENTS, FIVE MORTARBOARDS
Clemson
CUNY Macaulay
Georgia
Houston
Kansas
New Jersey Inst Tech
Oregon
Penn St
South Carolina
Temple
UT Austin
It is notable how nearly identical the two lists are: Arizona State is on the OVERALL list but not the HONORS COMPONENTS list, while Temple is on the HONORS COMPONENTS list but not the OVERALL list.
We must add that Temple barely missed a five mortarboard overall rating, while ASU was similarly close to making the honors components list.
The new Wall Street Journal/Times Higher Education (WSJ/THE) rankings have some interesting features, and they are certainly worth a look.
The rankings combine national universities and liberal arts colleges into one group, and in this way resemble the Forbes rankings. Also like the Forbes rankings, they count the salaries earned by graduates as a metric: 12% of the total in the WSJ/THE rankings.
Farther down, we will list the top 100 colleges in the rankings. Only 20 of the top 100 schools are public; 31 are liberal arts colleges; and the remaining 49 are elite private universities. This is not much of a surprise, given that financial resources are a major ranking category.
Before listing the top 100, we will list another group of schools that have the best combined scores in what we consider to be the two most important umbrella categories in the rankings, accounting for 60% of the total: “Engagement” and “Output.”
Engagement (20% of total, as broken out below):
A. Student engagement: 7%. This metric is generated from the average scores per College from four questions on the student survey:
To what extent does the teaching at your university or college support CRITICAL THINKING?
To what extent did the classes you took in your college or university so far CHALLENGE YOU?
To what extent does the teaching at your university or college support REFLECTION UPON, OR MAKING CONNECTIONS AMONG, things you have learned?
To what extent does the teaching at your university or college support APPLYING YOUR LEARNING to the real world?
B. Student recommendation: 6%. This metric is generated from the average score per College from the following question on the student survey:
If a friend or family member were considering going to university, based on your experience, how likely or unlikely are you to RECOMMEND your college or university to them?
C. Interactions with teachers and faculty: 4%. This metric is generated from the average scores per College from two questions on the student survey:
To what extent do you have the opportunity to INTERACT WITH THE FACULTY and teachers at your college or university as part of your learning experience?
To what extent does your college or university provide opportunities for COLLABORATIVE LEARNING?
D. Number of accredited programs (by CIP code): 3%. This metric is the IPEDS-standardized number of Bachelor’s degree programs offered.
Output (40% of the total, as broken out below):
A. Graduation rate: 11%. This metric is the graduation rate within 150% of normal time (status as of 31 August 2014) for the cohort of full-time, first-time degree/certificate-seeking undergraduates, Bachelor’s or equivalent sub-cohort.
B. Graduate salary: 12%. This metric estimates the outcome of median earnings of students working and not enrolled 10 years after entry.
C. Loan default/repayment rates: 7%. This metric estimates the outcome of the 3-year repayment rate from College Scorecard data. The value-added component is the difference between actual and predicted (based on underlying student and College characteristics) outcomes.
D. Reputation: 10%. This metric is the number of votes obtained from the reputation survey, calculated as the number of US teaching votes from the reputation survey plus the number of US-only teaching votes from the country section of the reputation survey.
The two remaining umbrella categories measure Financial Resources, including the amount spent per student; and the Environment, including the diversity of enrolled students (or faculty) across various ethnic groups. You can find a summary of the methodology here.
Here are the 23 colleges that scored at least 17.0 (out of 20) in Engagement and at least 30.0 (out of 40.0) in Output, listed in order of their overall place in the WSJ/Times Higher Ed rankings:
Editor’s note: This post has now been updated, effective September 14, 2021, to include new U.S. News rankings for 2022. Listed below are the yearly rankings and overall average rankings of 123 national universities that were included in the first tier of the U.S. News Best Colleges from 2015 through 2022. There are 61 public and 62 private universities. The list below not only shows the average rankings over this eight-year period but also lists the number of places lost or gained by each university.
The post also has rankings for 11 other universities that we will begin tracking.
U.S. News has changed its methodology, and some of the resulting shifts are significant, especially after the top 30-35 places in the rankings.
The organization deserves considerable credit for the changes in 2020 and, even more, for those made in 2021.
The new methodology definitely mitigates some of the worst effects of the old ranking system. “For the 2021 edition, U.S. News reduced the weight of SAT/ACT standardized tests to 5% (7.75% previously) and reduced the weight of high school class standing to 2% (2.25% previously) toward schools’ overall scores. The weight of alumni giving was reduced to 3% (5% previously) toward each school’s overall rank.”
Other changes:
“Two new ranking indicators that measure graduate indebtedness were added to the rankings this year:
Graduate indebtedness total. This is the average amount of accumulated federal loan debt among the 2019 bachelor’s degree graduating class that took out federal loans (weighted 3%). For non-responders to the U.S. News financial aid survey, the U.S. Department of Education College Scorecard’s most recent cohort of institutional median graduate indebtedness was adjusted and used in its place.
Graduate indebtedness proportion. This is the percentage of graduates from the 2019 bachelor’s degree graduating class who borrowed federal loans (2%). For non-responders to the U.S. News financial aid survey, the College Scorecard’s most recent institutional cohort of proportion of undergraduates borrowing was adjusted and used in its place.
“U.S. News also calculated a new graduate indebtedness rank, which is the combination of the two indebtedness indicators for ranked schools. It provides a benchmark for how schools compare in terms of total graduate indebtedness among those with debt and the proportion of graduates with debt. For schools ranked highest, it means their recent bachelor’s degree graduates had relatively little debt and a relatively small proportion of students graduating with debt compared with other schools. This graduate indebtedness rank is available on each school’s Rankings section on usnews.com.”
“Outcomes weight increased: As a result of adding graduate indebtedness, the rankings factors that measure outcomes now account for 40% of the ranking, up from 35% last year. The outcomes rank displayed on each school’s Rankings section on usnews.com is composed of these ranking factors: graduation and retention rates; graduation rate performance; social mobility; and graduate indebtedness.”
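U.S. News describes the graduate indebtedness rank as a combination of the two indicators but does not publish the exact formula. Below is a plausible minimal sketch, assuming each indicator is standardized and the two are weighted 3:2 to mirror their stated 3% and 2% overall weights; the school names and figures are placeholders.

```python
# Illustrative sketch of combining two indebtedness indicators into one
# rank. This is our assumed approach, not U.S. News's published formula:
# standardize each indicator, weight them 3:2, and rank ascending
# (lower debt -> better rank).
from statistics import mean, pstdev

schools = {
    "School A": {"debt_total": 18000, "debt_share": 0.35},
    "School B": {"debt_total": 27000, "debt_share": 0.55},
    "School C": {"debt_total": 22000, "debt_share": 0.62},
}

def standardized(indicator):
    vals = [s[indicator] for s in schools.values()]
    m, sd = mean(vals), pstdev(vals)
    return {name: (s[indicator] - m) / sd for name, s in schools.items()}

zt = standardized("debt_total")   # average federal debt per borrower
zs = standardized("debt_share")   # share of graduates with federal loans

# Weights 0.6 and 0.4 mirror the 3% : 2% split of the two indicators.
combined = {name: 0.6 * zt[name] + 0.4 * zs[name] for name in schools}
for rank, (name, score) in enumerate(
        sorted(combined.items(), key=lambda kv: kv[1]), 1):
    print(rank, name, round(score, 2))
```

Under this sketch, lower debt totals and a smaller share of borrowers produce a lower combined score and therefore a better indebtedness rank.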
Here are the historical rankings, the average of each school across eight years, and the increase or decline of each school from 2015 through 2022. The universities are listed in order of their 2022 rankings.
The 2016 edition will have a new name–Inside Honors: Ratings and Reviews of 60 Public University Honors Programs. It is in the final proofing stage now. The goal is to publish in late September. Each edition includes a somewhat different group of honors colleges and programs, so there will be changes, even among the 40 or so programs that are reviewed in each edition.
As I have noted in previous updates, the book will take an almost microscopic view of 50 of these programs and also provide more general summary reviews of 10 additional programs. I can say now that a few more programs will receive the highest overall rating of five “mortarboards” than did so in 2014. (The final list of programs we are rating and reviewing for 2016 is below.)
The rating system makes it possible for any honors college or program, whether part of a public “elite” or not, to earn the highest rating, and it allows honors programs of all types to do so. Those receiving five mortarboards will include core-type programs with fewer than 1,000 students and large honors programs with thousands of students. And absent any intentional preference for geographical diversity, the list does in fact include programs from north, south, east, and west.
By microscopic, I mean that the rating categories have increased from 9 to 14, and so has the depth of statistical analysis. The categories are, first, the overall honors rating; curriculum requirements; the number of honors classes offered; the number of honors classes in “key” disciplines; the extent of honors participation by all members in good standing; honors-only class sizes; overall class size averages, including mixed and contract sections; honors grad rates, adjusted for admissions test scores; ratio of students to honors staff; type of priority registration; honors residence halls, amenities; honors residence halls, availability; and the record of achieving prestigious scholarships (Rhodes, Marshall, Goldwater, etc.).
Sometimes readers (and critics) ask: Why so few programs? Doesn’t U.S. News report on hundreds of colleges?
The answer is: Honors colleges and programs are complicated. Each one of the 50 rated reviews in the new edition will be 2,500-3,000 words in length, or 7-8 pages. That’s almost 400 pages, not including introductory sections. The rest of the answer is: We are not U.S. News. With myself, one assistant editor, a contract statistician, and an outsourced production firm, our ability to add programs is very limited.
The 2016 profiles are full of numbers, ratios, and averages, more than in 2014 certainly–and too many, I believe, for readers who would prefer more narrative summary and description. So, yes, it is a wonkish book, even to a greater extent than this website tends to be. But then, they are honors programs after all.
Full ratings:
Alabama Honors
Arizona Honors
Arizona State Honors
Arkansas Honors
Auburn Honors
Central Florida Honors
Clemson Honors
Colorado State Honors
Connecticut Honors
CUNY Macaulay Honors
Delaware Honors
Georgia Honors
Georgia State Honors
Houston Honors
Idaho Honors
Illinois Honors
Indiana Honors
Iowa Honors
Kansas Honors
Kentucky Honors
LSU Honors
Maryland Honors
Massachusetts Honors
Minnesota Honors
Mississippi Honors
Missouri Honors
Montana Honors
New Jersey Inst of Tech
New Mexico Honors
North Carolina Honors
Oklahoma Honors
Oklahoma State Honors
Oregon Honors
Oregon State Honors
Penn State Honors
Purdue Honors
South Carolina Honors
South Dakota Honors
Temple Honors
Tennessee Honors
Texas A&M Honors
Texas Tech Honors
UC Irvine Honors
University of Utah Honors
UT Austin Honors
Vermont Honors
Virginia Commonwealth Honors
Virginia Tech Honors
Washington Honors
Washington State Honors
Summary Reviews:
Cincinnati Honors
Florida State Honors
Michigan Honors
New Hampshire Honors
Ohio Univ Honors
Pitt Honors
Rutgers Honors
Virginia Honors
Western Michigan Honors
Wisconsin Honors