By John Willingham, Editor
Recently, Google searches have begun surfacing two new sites that claim to rank public university honors programs and honors colleges. Their “rankings” in most instances bear a close resemblance to the ratings we have produced since 2012. Aside from the likelihood of extensive (unattributed) borrowing from our copyrighted work, the fact is that most of the data necessary to rank or rate these programs is not publicly available. We are the only site or organization in the country that does have that access, gained only after many years of dialogue and collaboration with honors deans and directors across the nation. One wonders how these new rankings were developed. Or were they mostly “borrowed”?
Our collaborative process yields enormous amounts of data. For example, to calculate honors class sizes, we have to analyze about 10,000 honors classes for each edition. Much of the data required for this analysis is not available on honors sites or even on university-wide course schedules.
And still we do not “rank” programs. Typically, I have an opinion, based on data, about the best five to ten programs in the nation among those rated in a given edition. The data may show that one is “better” (a higher point total) than all the rest. And then I think about how I have weighted each of the 13 rating categories; if I were to change any of those weights, the ratings would change. All is driven by the methodology, and nobody’s methodology is perfect. It is a matter of judgment in the final analysis. It is not scientific in the truest sense, even with all the data involved. I can give you an exact figure for honors class sizes at Honors College A, but the rating proportion I assign to that exact figure is subjective.
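Here is a toy illustration of just how much the weights matter. The categories, scores, and weights below are entirely made up for the example; they are not our actual methodology:

```python
# Two fictional programs scored 1-10 on three invented categories.
# Changing only the weights reverses the "ranking."
scores = {
    "Program A": {"class_size": 9, "curriculum": 6, "housing": 7},
    "Program B": {"class_size": 6, "curriculum": 9, "housing": 7},
}

def total(program, weights):
    """Weighted point total for one program."""
    return sum(weights[cat] * val for cat, val in scores[program].items())

weights_1 = {"class_size": 3.0, "curriculum": 1.0, "housing": 1.0}
weights_2 = {"class_size": 1.0, "curriculum": 3.0, "housing": 1.0}

for w in (weights_1, weights_2):
    ranked = sorted(scores, key=lambda p: total(p, w), reverse=True)
    print(ranked, [total(p, w) for p in ranked])
# First weighting puts Program A on top (40 vs. 34);
# the second puts Program B on top (40 vs. 34).
```

The underlying scores never change; only the judgment about which categories matter most does.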
If it’s not science, don’t present it as science. Ordinal rankings present themselves as science. But just imagine how the U.S. News rankings would change if all the institutional wealth metrics were removed or if selectivity did not count.
Thanks to the cooperation of honors deans and directors across the nation, we now receive 10-20 pages of documents for each rated profile, much of it hard data on class sections and course offerings. No one else obtains this level of unique data. Even by reading every entry in a university’s online course schedule, one will not find the volume and specificity of data that we need for honors course analyses. That’s because honors programs offer mixed and contract sections that are not transparent in online course listings.
This brings us to the new rankings.
One lists “The 9 Best Honors Programs” in the nation. Here is the methodology:
“To put together our list, we evaluated the national honors college rankings from the past two years. We also evaluated honors colleges based on admissions requirements, curricular and extracurricular program offerings, emphasis on fostering an honors student community, financial aid opportunities, and unique or innovative approaches to the honors educational experience.” [Emphasis added.]
First, how does someone quantify “an emphasis on fostering an honors student community” or “innovative approaches to the honors educational experience”?
Second, I do not know of any “national honors college rankings,” although we announce the top 5-10 programs, in one alphabetical group, every other year. These programs are “top” only within the data set of rated programs for a given edition. No program is declared number one, or number three, or number ten for that data set, much less for the entire universe of honors programs. They are instead placed in a group. Our refusal to anoint any program with a specific ranking number has, in fact, caused one prominent program to stop cooperating with us.
The “9 Best” site does not hesitate to do so: “Ranked #1 among honors colleges in the United States, Barrett College has a presence on ASU’s four campuses in Phoenix, Mesa, Tempe, and Glendale, Arizona.” Although Barrett, under its longstanding Dean, Mark Jacobs, achieves excellent results year in and year out, I do not know of any recent ranking that specifically lists Barrett or any other honors program or college as number 1. It is true that Barrett has been in the highest (five mortarboard) group in all of our editions. But so have the South Carolina Honors College, Penn State’s Schreyer Honors College, the Plan II Honors Program at UT Austin, the University Honors Program at Kansas, and, since 2016, the Macaulay Honors College at CUNY. These are very different programs, ranging from extremely large (Barrett) to very small (UT Plan II).
Other strong programs are at Clemson, Delaware, Georgia, Houston, and Ole Miss. Data from Maryland, Michigan, and North Carolina is no longer available, but in one or more previous editions, all received excellent ratings.
The “9 Best” site above also lists Penn State Schreyer, Clemson, and Rutgers Honors College among the best honors colleges, and adds UT Plan II, Kansas UHP, and the Echols Scholar program at UVA. Then, in a “best bang for the buck” category, it lists CUNY Macaulay and the Alabama Honors College. (We have not included Echols since the 2014 edition because the new methodology in place since 2016 requires much more class data. Echols students can take almost any class at UVA, and it’s not possible to determine which ones those are at any given time.)
Another site lists “the top 50 honors programs and colleges,” a list that bears an uncanny resemblance to the programs we have rated over the years. It includes several programs that were not prominently mentioned until they appeared in one of our books, among them the New Jersey Institute of Technology, Temple, Colorado State, and CUNY Macaulay.
Here is the methodology behind this list:
“Below, we have compiled a list of the nation’s top honors colleges/programs. The selection was based on the following indicators of program quality.
- The selectivity of the college/university (overall)
- The selectivity of the honors program
- Average honors class size
- Number of honors classes
- Availability of honors housing
- Whether priority registration is offered to honors students
“Schools marked with an asterisk (*) rated especially high on several indicators and were ranked among the top 20 honors programs according to our methodology.”
All of the above information is in our publications. Further, the “availability” of honors housing can be calculated only if one knows both the number of honors “beds” and the number of eligible honors students. One can know the true number of honors classes only with access to full spreadsheets, not just online listings, especially listings limited to the honors homepage. The true average class size likewise relies on extremely detailed data not available from online sources. Finally, some of the test scores listed on the site are incorrect and misleading.
Yes, I realize that U.S. News has several competitors in ranking colleges and universities. And, often, many of these rankings roughly correspond, especially at the most elite brand level. But these competing ranking organizations all gather their own data and, even while applying different methodologies, refrain from unseemly borrowing.