The first two are straightforward: The university rate will always be lower than the honors program rate because of the greater selectivity and mentoring associated with honors programs. The university grad rate for honors students averages 86-88 percent, and is sometimes as high as 97 percent.
An honors completion rate goes a step beyond the honors graduation rate. The grad rate is for honors entrants, whether or not they completed all honors requirements by the time of graduation. The completion rate is the percentage of honors program entrants who not only graduated from the university but also completed all honors program requirements for at least one option. Some programs have multiple options, with the requirements for first-year entrants averaging about 30 honors credits and a threshold for transfer students of 15-18 hours or so.
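The distinction between the two rates can be sketched in a few lines of code. This is a minimal illustration with invented cohort numbers, not figures from our study:

```python
# Hypothetical cohort of first-year honors entrants (all numbers invented
# for illustration; they are not from the 2020 study data).
entrants = 500            # first-year honors program entrants
graduated = 435           # entrants who earned a university degree
completed_honors = 290    # graduates who also finished all requirements
                          # for at least one honors option

grad_rate = graduated / entrants                # honors graduation rate
completion_rate = completed_honors / entrants   # honors completion rate

print(f"Honors graduation rate: {grad_rate:.1%}")    # 87.0%
print(f"Honors completion rate: {completion_rate:.1%}")  # 58.0%
```

Note that the completion rate is always the smaller number: every honors completer is by definition a university graduate, but not the reverse.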
In our study for 2020, we have obtained honors graduation and completion rates from 31 honors colleges and programs. Below, in Table 1, we list the programs with the highest completion rates, all above the mean of 57.2 percent. In this table we also list the honors graduation rate, the highest credit-hour completion requirement for each program, and the average 2020 SAT scores for first-year entrants.
The top six programs all had honors completion rates of 70 percent or higher. This is a remarkably high number when one considers that many of these programs require an honors thesis. Many elite private colleges no longer require a thesis for graduation or for honors recognition. The top six programs, in terms of raw ordinal completion rates, are CUNY Macaulay Honors College; UIUC’s CHP Honors Program; the UT Austin Plan II Honors Program; Penn State’s Schreyer Honors College; the South Carolina Honors College; and Arizona State’s Barrett Honors College.
[table id=119 /]
In Table 2, below, we show adjusted honors completion rates for programs after the impact of university graduation and freshman retention rates is taken into account. In contrast to Table 1, this table shows the extent to which programs have exceeded expectations in light of these two factors.
We find that seven programs achieved an adjusted completion rate that exceeded the target rate by 10 or more percentage points: CUNY Macaulay Honors College; the UAB Honors College; the Kansas University Honors Program; the College of Charleston Honors College; the South Carolina Honors College; Arizona State’s Barrett Honors College; and the Washington State Honors College.
Recently, Google searches have begun listing two new sites that claim to rank public university honors programs and honors colleges. Their “rankings” in most instances bear a close resemblance to the ratings we have produced since 2012. Aside from the likelihood of extensive (unattributed) borrowing from our copyrighted work, the fact is that most of the data necessary to rank or rate these programs is not publicly available. We are the only site or organization in the country that does have access, gained only after many years of dialogue and collaboration with honors deans and directors across the nation. One wonders how these new rankings were developed. Or were they mostly “borrowed”?
Our collaborative process yields enormous amounts of data. For example, to calculate honors class sizes, we have to analyze about 10,000 honors classes for each edition. Much of the data required for this analysis is not available on honors sites or even on university-wide course schedules.
And still we do not “rank” programs. Typically, I have an opinion, based on data, about the best five to ten programs in the nation among those rated in a given edition. The data may show that one is “better” (a higher point total) than all the rest. And then I think about how I have weighted each of the 13 rating categories. If I were to change any of them, the ratings would change. All is driven by the methodology, and nobody’s methodology is perfect. It is a matter of judgment in the final analysis. It is not scientific in the truest sense, even with all the data involved. I can give you an exact figure for honors class sizes at Honors College A, but the rating proportion I assign to that exact figure is subjective.
If it’s not science, don’t present it as science. Ordinal rankings present themselves as science. But just imagine how the U.S. News rankings would change if all the institutional wealth metrics were removed or if selectivity did not count.
Thanks to the cooperation of honors deans and directors across the nation, we now receive for each rated profile 10-20 pages of documents, much of it hard data on class sections and course offerings. No one else obtains this level of unique data. Even by going online and reading every entry in the university’s course schedule one will not find the volume and specificity of data that we need for honors course analyses. That’s because honors programs offer mixed and contract sections that are not transparent in online course listings.
This brings us to the new rankings.
One lists “The 9 Best Honors Programs” in the nation. Here is the methodology:
“To put together our list, we evaluated the national honors college rankings from the past two years. We also evaluated honors colleges based on admissions requirements, curricular and extracurricular program offerings, emphasis on fostering an honors student community, financial aid opportunities, and unique or innovative approaches to the honors educational experience.” [Emphasis added.]
First, how does someone quantify “an emphasis on fostering an honors student community” or “innovative approaches to the honors educational experience”?
Second, I do not know of any “national honors college rankings,” although we announce the top 5-10 programs, in one alphabetical group, every other year. These programs are “top” only within the data set of rated programs for a given edition. No program is declared number one, or number three, or number ten for that data set, much less for the entire universe of honors programs. They are instead placed in a group. Our refusal to anoint any program with a specific ranking number has, in fact, caused one prominent program to stop cooperating with us.
The “9 Best” site does not hesitate to do so: “Ranked #1 among honors colleges in the United States, Barrett College has a presence on ASU’s four campuses in Phoenix, Mesa, Tempe, and Glendale, Arizona.” Although Barrett, under its longstanding Dean, Mark Jacobs, achieves excellent results year in and year out, I do not know of any recent ranking that specifically lists Barrett or any other honors program or college as number 1. It is true that Barrett has been in the highest (five mortarboard) group in all of our editions. But so have the South Carolina Honors College, Penn State’s Schreyer Honors College, the Plan II Honors Program at UT Austin, the University Honors Program at Kansas, and, since 2016, the Macaulay Honors College at CUNY. These are very different programs, ranging from extremely large (Barrett) to very small (UT Plan II).
Other strong programs are at Clemson, Delaware, Georgia, Houston, and Ole Miss. Data from Maryland, Michigan, and North Carolina is no longer available, but in one or more previous editions, all received excellent ratings.
The “9 Best” site above also lists Penn State Schreyer, Clemson, and Rutgers Honors College among the best honors colleges, and adds UT Plan II, Kansas UHP, and the Echols Scholar program at UVA. Then in a “best bang for the buck” category, it lists CUNY Macaulay and the Alabama Honors College. (We have not included Echols after the 2014 edition because the new methodology in place since 2016 requires much more class data. Echols students can take almost any class at UVA, and it’s not possible to determine which ones those are at any given time.)
Another site lists “the top 50 honors programs and colleges,” a list which bears an uncanny resemblance to programs we have rated over the years. The list includes several programs that were not prominently mentioned until they appeared in one of our books: New Jersey Institute of Technology, Temple, Colorado State, and CUNY Macaulay, among them.
Here is the methodology behind this list:
“Below, we have compiled a list of the nation’s top honors colleges/programs. The selection was based on the following indicators of program quality.
The selectivity of the college/university (overall)
The selectivity of the honors program
Average honors class size
Number of honors classes
Availability of honors housing
Whether priority registration is offered to honors students
“Schools marked with an asterisk (*) rated especially high on several indicators and were ranked among the top 20 honors programs according to our methodology.”
All of the above information is in our publications. Further, “availability” of honors housing can be calculated only if one knows both the number of honors “beds” and the number of eligible honors students. One can know the true number of honors classes only if there is access to full spreadsheets, not just online listings, especially those limited to the honors homepage. And the true average class size likewise relies on extremely detailed data not available from online sources. Finally, some of the test scores listed on the site are incorrect and misleading.
Yes, I realize that U.S. News has several competitors in ranking colleges and universities. And, often, many of these rankings roughly correspond, especially at the most elite brand level. But…these competing ranking organizations all gather their own data and, even while applying different methodologies, refrain from unseemly borrowing.
The 2020 edition of Inside Honors was to have included in-depth ratings of 33 programs and somewhat shorter reviews of an additional seven programs. The COVID-19 issues facing universities have delayed the next edition until October 2020 and reduced the number of programs that originally committed to participate. Most of the top-rated programs in previous editions will likewise be rated in 2020.
One positive: The new edition will include a new narrative section that summarizes each program, and each profile will be longer, averaging 3,500 words.
The 33 programs that will now receive full ratings are below:
Central Florida (UCF)
College of Charleston
South Florida (USF)
Below are the seven programs that will receive unrated reviews:
North Carolina Charlotte
South Dakota St
Yes, the title of this post is a mouthful. For years now, I have kept an updated list of the departmental rankings that U.S. News publishes so that I can add them to the biannual profiles I do of honors programs. When the 2020 rankings came out, I wanted to see whether there was any clear relationship between the departmental scores and the academic reputation scores. Then I compared the latest reputation scores with those published in 2015 to see how much had changed. Finally, the table below also includes changes in university rankings and the most recent rankings for social mobility.
(I would welcome comments on this post. Please email firstname.lastname@example.org.)
It appears that the social mobility metric has had some impact, especially if the ranking is very strong, as in the case of many UC campuses and Florida institutions. There is no clear relationship between departmental scores and academic reputation scores. Departmental rankings do have a modest relationship to the overall U.S. News rankings, but there are many inconsistencies. Academic reputation scores do seem to show some “grade inflation” since 2015; often this is the case even when the U.S. News ranking has dropped significantly.
The table below includes data for 100 public and private universities.
The cumulative rankings that I do for 15 academic disciplines require some explanation. U.S. News only ranks graduate programs for most departments. Here are the disciplines for which I have cumulative departmental rankings, using the most recent data (2018): biological sciences; business (undergrad); chemistry; computer science; earth sciences; economics; education; engineering (undergrad); English; history; mathematics; physics; political science; psychology; and sociology.
Not every university has a ranked department in each of the 15 disciplines. I averaged departmental rankings for every university that had at least six ranked departments. For universities with, say, fewer than 12 ranked departments, the overall average will look artificially favorable because only the best departments are ranked and I cannot include unranked departments. Most universities have 12-15 departments that are ranked, so the overall average will be more useful for them. And some of the universities with a small number of ranked departments are specialized, such as Georgia Tech and Caltech. For those schools, a strong result across even six or seven ranked departments is not misleading.
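The averaging rule can be sketched as follows (the schools and department rankings here are invented for illustration, not actual U.S. News data):

```python
# Average departmental rankings only for universities with at least
# six ranked departments, per the rule described above.
MIN_RANKED = 6

dept_ranks = {
    "University A": [5, 12, 8, 20, 15, 9, 11],  # 7 ranked departments
    "University B": [3, 7, 4],                   # too few to average fairly
}

cumulative = {
    school: sum(ranks) / len(ranks)
    for school, ranks in dept_ranks.items()
    if len(ranks) >= MIN_RANKED
}
print(cumulative)  # only "University A" qualifies for a cumulative average
```

A school with only three elite ranked departments (like "University B" above) is excluded, which avoids rewarding a small, unrepresentative sample.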
Universities with fewer than 10 departmental rankings: Colorado School of Mines; Georgia Tech; Miami Ohio; American; Brigham Young; Caltech; Dartmouth; Drexel; Fordham; Georgetown; and RPI.
It should be said that universities with relatively low departmental rankings can legitimately receive high rankings because of other meaningful factors, such as grad and retention rates and class size. Some excellent universities do not have an especially strong research focus or a lot of graduate programs. Dartmouth is one prominent example.
The universities below appear in rank order of their 2020 academic reputation, according to U.S. News.
At last, there is a major study that goes a long way toward answering this important question.
Dr. Art Spisak
Making good use of the increasing data now available on honors programs and their parent institutions, two honors researchers have recently published a major paper that compares honors students and non-honors students from 19 public research universities. Out of 119,000 students in the sample, 15,200 were or had been participants in an honors program.
The study is extremely helpful to parents and prospective honors students who rightly ask how an honors education differs from a non-honors education: How will participation in an honors program shape and differentiate an honors student? Will an honors education be the equivalent of an education at a more prestigious private college?
Feelings about the undergraduate experience: “In their undergraduate experience, students in the honors group reported a more positive experience, on average, than those in the non-honors group.” Both groups attended classes with similar frequency, but honors students reported greater activity in the following areas:
finding coursework so interesting that they do more work than is required;
communicating with profs outside of class;
working with faculty in activities other than coursework;
increasing effort in response to higher standards;
completing assigned reading;
attending to self-care, eating, and sleeping;
spending more time studying;
performing more community service and volunteer work;
participating in student organizations;
and, while spending about the same time in employment, finding on-campus employment more frequently than non-honors students.
Participation in “high-impact” activities: These experiences contribute to undergraduate success and satisfaction as well as to higher achievement after graduation. Some of these are restricted to upperclassmen, so the study concentrated on participation by seniors in high-impact activities, including undergraduate research, senior capstone or thesis, collaborating with a professor on a project or paper, studying abroad, or serving in a position of leadership.
“Those [students] in the honors student segment of the senior sample had markedly higher cumulative college grade point averages.” The cumulative GPA of the honors group was 3.65; for the non-honors group it was 3.31. “A grade point average of 3.31 is located at the 38th percentile in the overall distribution within the study sample, and a grade point average of 3.65 is at the 69th percentile.” The authors found that the very significant difference was “particularly impressive” given that the high school GPAs of honors and non-honors students did not vary so significantly. Honors students were also 14% more likely to have served as an officer in a campus organization.
Students in the honors group were 77% more likely to have assisted faculty in research projects, 85% more likely to have studied abroad, and 2.5 times more likely to have conducted undergraduate research under faculty guidance.
Intellectual curiosity: Honors students expressed a statistically significant but not dramatically greater degree of intellectual curiosity; however, their intellectual curiosity was aligned with the “prestige” of an academic major. The study did not measure whether this attachment to prestige reflected a desire for greater intellectual challenge or for higher salaries associated with many such majors. (Or both.) Both groups placed similar emphasis on the importance of high pay after graduation and on career fulfillment.
Diversity: The study found that African American students were only 52% as likely to be in an honors program as they are to be in the larger university sample. Latin American students were 58% as likely. These figures may be due in part to the fact that, as a group, the 19 research universities “are located in states that are somewhat more white than the nation as a whole, but most of the discrepancy can be attributed to the fact that Research 1 universities do not, in general, have enrollments that are especially representative of ethnic and racial minorities.” On the other hand, LGBQ, transgender, and gender-questioning students “appear to be slightly over-represented among honors students.”
Low-income and first-generation participation: These students “are significantly and substantially under-represented in the honors group.” Pell Grant recipients are 30% less likely to be in the honors group than in the non-honors group, and first-generation students are 40% less likely to be in the honors group.
Test scores and HSGPA: There was a difference between honors and non-honors students, but it was not dramatic. “Regardless of which test score was used, the honors group had scores that were about 10% higher, on average.” (In our ratings of honors programs, we have found that honors test scores were about 17% higher, based on actual honors scores and the mid-range of test scores in U.S. News rankings.) The average high school GPA for the honors group was .11 points higher than for the non-honors group.
The study used data from the 2018 Student Experience in the Research University (SERU) survey. Although the study used data only from Research 1 universities, which comprise just 3% of all colleges and universities in the nation, R1 universities enroll 28.5% of all undergraduates pursuing four-year degrees.
Research centered on honors education is increasingly important: An estimated 300,000-400,000 honors students are enrolled in American colleges and universities today.
“It has become a mantra in some quarters to assert that standardized tests measure wealth more than intellectual ability or academic potential, but this is not actually the case. These tests clearly assess verbal and mathematical skills, which a century of psychological science shows are not mere reflections of upbringing. Research has consistently found that ability tests like the SAT and the ACT are strongly predictive of success in college and beyond, even after accounting for a student’s socioeconomic status.”
For years, U.S. News has used test scores and selection rates as ranking data for the annual “Best Colleges” report. The publication has slightly reduced the impact of test scores in recent editions.
Below I will explain why we do not include test scores as a metric and argue that, for honors and non-honors students, other factors are more important in predicting success. (High school GPA is certainly a major factor; but since almost all honors students have high GPAs, I do not discuss the impact of GPA in this post.)
In their published scholarly work, the authors argue that test scores by themselves correlate very strongly (r = -.892) with the annual U.S. News Best Colleges rankings for national universities even though the test scores count for only 7.75 percent of the total ranking score. (The authors do not cite the impact of test scores on other ranking factors such as graduation and retention rates, which together account for 22 percent of the total ranking score.)
Our own work for the past eight years, however, shows that test scores do not have a similar correlation to quantitative assessments of honors programs. In our publications we list minimum and average admissions test scores for all programs we rate, but we do not count the scores alone as a rating factor.
Here’s why we do not use test scores as a measure: The factors that make for an excellent honors program are primarily structural. The major building blocks are the credits required for honors completion; the number of honors class sections offered, by type and academic discipline; the availability of priority registration and honors housing; the size of honors class sections; and the number of staff to assist students.
So, don’t the test scores drive the university graduation rates of honors program entrants, just as they do in elite colleges? The answer is: not so much. The correlation is r = .50.
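For readers curious how such a correlation figure is computed, here is a minimal sketch using invented numbers. It illustrates the calculation of Pearson’s r from a set of SAT averages and graduation rates; the data below is made up and is not meant to reproduce the r = .50 result from our data set:

```python
# Illustrative-only data: average SAT scores and university graduation
# rates (%) for six hypothetical honors programs.
sat = [1300, 1350, 1400, 1420, 1480, 1500]
grad_rate = [88, 83, 90, 84, 92, 86]

def pearson(xs, ys):
    """Pearson's r, computed directly from its definition."""
    mx = sum(xs) / len(xs)
    my = sum(ys) / len(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

print(f"r = {pearson(sat, grad_rate):.2f}")
```

An r near 1 would mean test scores almost fully determine graduation rates; a value around .50, as in our data, leaves plenty of room for program structure to matter.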
Admittedly, it is probably difficult for a student with, say, a 1050 SAT score to succeed in an elite college or in most honors programs. But within a fairly large range of SAT scores (~1280-1510), the opportunities for success are more often present given a conducive structure. With every biannual review of honors data, I find great pleasure in discovering outstanding honors programs that are not housed in highly-ranked and extremely selective universities. The golden nuggets of excellence in higher education are scattered much farther and wider than many would have us believe.
I am strongly opposed to the numerical ranking of colleges or their honors programs, whether or not test scores are included in the methodology. I ranked honors programs one time, in 2012, and regret doing so. Yes, I have data that allows me to numerically differentiate the total rating scores earned by honors programs. But anyone who wants to provide some kind of assessment of colleges or programs needs to do so with the assumption that their methodology is subjective and imperfect. Ordinal rankings based on distinctions of one point or fractions of a point give readers a veneer of certitude that a qualitative difference exists even if it (often) does not.
Although we do not rank honors programs, we do place them in one of five rating groups, a process that is similar to rating films on a five-star basis but based on quantitative rather than completely subjective data. The seven honors programs in the top group in 2018 (out of 41) had average SAT scores (enrolled students) ranging from 1280 to 1490, a sizable range.
Honors completion rates are something of an issue these days. An honors completion rate is the percentage of first-year honors entrants who complete all honors requirements for at least one option by the time of graduation from the university. About 42 percent of honors students do not complete honors requirements before graduation, although a very high percentage of honors entrants (87 percent) do graduate from the university.
The seven honors programs with honors completion rates of 75 percent or higher in our 2018 ratings had average SAT scores ranging from 1340 to 1510; the mean for this group was 1420. The mean SAT for the 31 (of 41) programs that provided completion rates was 1405, not much lower. And another seven programs with mean SAT scores of 1420 or higher had completion rates below 58 percent, the group mean.
The mean SAT score for all 41 rated programs was 1407; the mean SAT for the top seven programs was only one point higher at 1408.
It is clear, at least with respect to honors programs, that average SAT scores are not the best predictors of program effectiveness. What does this mean for the value of test scores nationwide, if anything?
I think it means that for students who are in the 1280 to 1500 SAT range, success depends as much or more on mentoring, smaller interdisciplinary sections, student engagement, course availability, community (including housing), and advising support than it does on test scores.
The good news here is that even for students who are not in honors programs, high levels of achievement are accessible to students who do not begin college with extremely high test scores, although non-honors students will probably have to assert themselves more in order to benefit from the strongest attributes of their university.
Editor’s Note: We hope to update this post before the end of September 2019. The list appears after the introductory section. The list was current as of September 25, 2018.
In a previous post, Based on Academic Reputation Alone, Publics Would Be Higher in U.S. News Rankings, we write that many public universities have a reputation in the academic community that is much higher than their overall ranking by U.S. News. In this post, we will summarize the reasons that prospective honors students and their parents might consider paying more attention to academic reputation than to other factors in the oft-cited rankings. The list also facilitates comparisons of public and private universities.
First, these are factors to consider if the state university’s academic reputation is much stronger than its overall ranking:
1. The overall rankings penalize public universities for their typically larger class sizes, but the average honors class size in our most recent study of honors programs is 24.9 students, much smaller than the average class size for the universities as a whole. Many of these honors classes are lower-division, where large classes are otherwise the norm. First-year honors seminars and classes for honors-only students average 17.5 students per section. Result: the relatively poor rating the whole university might receive for class size is offset for honors students.
2. The overall rankings hit some public universities hard for having relatively low retention and graduation percentages, but freshman retention rates in honors programs are in the 90% range and higher; meanwhile six-year grad rates for honors entrants average 87%–much higher than the average rates for the universities as a whole. Result: the lower rates for the universities as a whole are offset for honors students.
3. All public universities suffer in the overall rankings because U.S. News assigns ranking points for both the wealth of the university as a whole and for the impact that wealth has on professors’ salaries, smaller class sizes, etc. This is a double whammy in its consideration of inputs and outputs separately; only the outputs should be rated. Result: the outputs for class size (see above) are offset for honors students, and the wealth of the university as an input should not be considered in the first place.
4. For highly-qualified students interested in graduate or professional school, academic reputation and the ability to work with outstanding research faculty are big advantages. Honors students have enhanced opportunities to work with outstanding faculty members even in large research universities, many of which are likely to have strong departmental rankings in the student’s subject area. Result: honors students are not penalized for the research focus of public research universities; instead, they benefit from it.
5. Many wealthy private elites are generous in funding all, or most, need-based aid, but increasingly offer little or no merit aid. This means that families might receive all the need-based aid they “deserve” according to a federal or institutional calculation and still face annual college costs of $16,000 to $50,000. On the other hand, national scholars and other highly-qualified students can still receive significant merit aid at most public universities. Result: if a public university has an academic reputation equal to that of a wealthy private elite, an honors student could be better off financially and not suffer academically in a public honors program.
But…what if the academic reputation of the public university is lower than that of a private school under consideration? In this case, the public honors option should offer the following offsets:
1. The net cost advantage of the public university, including merit aid, probably needs to be significant.
2. It is extremely important to evaluate the specific components of the honors program to determine if it provides a major “value-added” advantage: is it, relatively speaking, better than the university as a whole? Often, the answer will be yes. To determine how much better, look at the academic disciplines covered by the honors program, the actual class sizes, retention and graduation rates, research opportunities, and even honors housing and perks, such as priority registration.
The rankings below are on a 5.0 scale, and there are many ties. We have included national universities with reputation rankings between 2.7 and 4.9.
Editor’s Note: This is the third and final post in our series on honors program completion rates.
In the first post, we wrote about the hybrid structure of honors programs and how that can affect honors completion rates. An honors completion rate is the percentage of honors students who complete all honors course requirements for at least one option by the time they graduate. The second post presented a tentative formula for evaluating honors completion rates.
This post has two parts. The first compares honors completion rates of main option and multiple option honors programs; the second compares completion rates of honors colleges and honors programs.
Main option programs emphasize only one curriculum completion path, usually requiring more than 30 honors credits and often an honors thesis as well. Multiple option programs offer two or more completion paths for first-year students. One option might require 24 honors credits; another might require 15-16 credits. Either of these might also require a thesis.
Many universities are now establishing honors colleges. These usually have a dean and a designated staff of advisors. They typically provide at least enough honors housing space for first-year students. Some began as honors programs and then re-formed into honors colleges. Quite a few honors colleges have significant endowments.
Honors programs do not have a dean, but are administered by a director and staff. Sometimes there are few real differences between honors colleges and programs. In general, however, honors colleges have more staff and offer more access to honors housing.
We received data from 23 honors colleges and eight honors programs, with a combined enrollment of more than 64,000 honors students. The 31 parent universities had an average U.S. News ranking of 126, ranging from the low 50s to higher than 200.
The first summary is below:
[table id=96 /]
The second summary, comparing honors colleges and honors programs, is below:
Honors completion rates, as we noted in a previous post, are a complicated issue. They represent the percentage of students who enter an honors program and then complete all honors requirements for at least one completion option by the time they graduate.
They are related to university freshman retention rates and university graduation rates, but in order to evaluate them there must be some workable baseline completion rate derived from a significant sample of programs.
Honors deans and directors at 31 public university honors programs contributed the data used to calculate the values in the next paragraph, along with extensive additional data we use in rating honors programs. The 31 programs enrolled more than 64,000 honors students in Fall 2017. At some point we might include completion rates as a metric; if we do, then this formula, or an improved version, might be used.
This tentative formula takes into account (1) the average (mean) honors completion rate for the whole data set (57.88 percent); (2) the mean university-wide freshman retention rate for the whole data set (86.81 percent); (3) the completion rate of each program; (4) the freshman retention rate for the parent university of each program; and (5) the graduation rate of each university.
The formula assumes that a desirable target honors completion rate should at least equal the midway point between the university graduation rate and the adjusted honors completion rate. (See examples below, however, for programs that have honors completion rates that exceed the university graduation rate.) The formula can easily be changed to set lower or higher target levels by increasing or reducing the divisor.
H = the mean honors completion rate for the data set;
F = the mean freshman retention rate for the data set;
P = the program completion rate;
C = the mean completion rate adjusted to each university’s freshman retention rate (.67 * R);
R = the freshman retention rate of each parent university;
G = the graduation rate of each parent university;
T = the estimated target completion rate after the formula is applied. T = (G + C) /2. This is an estimate of what the minimum completion rate should be, given the university’s freshman retention rate and graduation rate, and the mean completion rate and mean freshman retention rate for this data set. Other data sets would of course have different data, but the formula could still be applied.
The completion rates of ten programs exceeded the graduation rates of their parent universities.
Here is the formula, where P = 61%; R = 92%; G = 83%:
First step = (H/F), or 57.88 / 86.81. The result is .67. This is a constant for this data set.
Second step is to adjust the completion rate in relation to the university freshman retention rate = .67 *R, or .67 *92. The result is 61.64 (C), a bit higher than the actual program completion rate of 61.0 (P), because of the relatively high freshman retention rate.
Third step is to adjust the completion rate C in relation to the university graduation rate in order to calculate the target completion rate. T = (G + C) /2, or (83 + 61.64) /2 = 72.32 (T).
Fourth step is to calculate P – T, which would be 61.00 – 72.32 = –11.32. This step calculates the extent to which the program completion rate varies from the estimated target rate. The program is performing below the estimated target rate. The relatively high university graduation rate is the main reason.
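For readers who want to reproduce the four steps, here is a minimal sketch in Python. The variable names follow the definitions above, and the .67 constant is specific to this data set:

```python
# Data-set constants: mean honors completion rate (H) and
# mean university freshman retention rate (F), both in percent
H = 57.88
F = 86.81
k = round(H / F, 2)  # step 1: .67, a constant for this data set

def adjusted_rates(P, R, G, k=0.67):
    """Return (C, T, P - T) for one program.

    P = program completion rate, R = university freshman retention rate,
    G = university graduation rate; all values are percentages.
    """
    C = k * R           # step 2: completion rate adjusted to retention
    T = (G + C) / 2     # step 3: estimated target completion rate
    return C, T, P - T  # step 4: variance from the target

# Worked example from the text: P = 61%, R = 92%, G = 83%
C, T, gap = adjusted_rates(61.0, 92.0, 83.0)
# C = 61.64, T = 72.32, gap = -11.32 (below the estimated target)
```

Changing the divisor in the `T` line from 2 to a larger or smaller value raises or lowers the target, as noted above.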
Honors program A had a program completion rate (P) of 84%, a freshman retention rate (R) of 88%, and a university graduation rate (G) of 73%. The C rate would be .67 * 88, or 58.96. The T calculation would be (G + C) / 2, or (73 + 58.96) / 2 = 65.98 (T). Now calculate P – T (84 – 65.98) = +18.02. This program is performing far above its estimated target rate.
Honors program B had the same program completion rate (P) of 84% but a much higher freshman retention rate (R) of 95%, and a university graduation rate (G) of 81%. The C value would be .67 * 95, or 63.65, and T would be (G + C) / 2, or (81 + 63.65) / 2 = 72.325. When we calculate P – T (84 – 72.325), the result is +11.675. This program is performing well above its estimated target rate, but even with the same completion rate as Program A, the impact of higher graduation and freshman retention rates for Program B causes its relative performance rating to be lower than Program A’s. In other words, the expectations were higher for Program B. Both programs are exceptional in that their honors completion rates exceed their university graduation rates.
Honors program D had a program completion rate (P) of 40%, a freshman retention rate (R) of 82%, and a university graduation rate (G) of 53%. C would be .67 * 82, or 54.94. T would be (G + C) / 2, or (53 + 54.94) / 2 = 53.97. Calculating P – T, the result is 40 – 53.97, or –13.97. Program D is significantly underperforming based on the formula.
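Assuming the same .67 constant for this data set, the three examples can be checked with a few lines of Python:

```python
# gap = P - T, where C = .67 * R and T = (G + C) / 2 (all values in percent)
def gap(P, R, G):
    return P - (G + 0.67 * R) / 2

print(round(gap(84, 88, 73), 3))  # Program A: 18.02
print(round(gap(84, 95, 81), 3))  # Program B: 11.675
print(round(gap(40, 82, 53), 3))  # Program D: -13.97
```

A positive gap means the program exceeds its estimated target; a negative gap means it falls short of it.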
Especially notable in the list below are the changes in major public universities.
Included here are institutions that were, at some point, ranked in the top 50 in those two categories. Some values are blank because in those years the magazine did not give individual rankings to every institution, instead listing them in large groups described as “quartiles” or “tiers.” The rankings shown for 1983 and 1985 are the ones that U.S. News published in its magazine in those same years. For all subsequent years, the rankings come from U.S. News’s separate annual publication “America’s Best Colleges,” which assigns rankings for the upcoming year.