U.S. News 2020: Dept Rank vs Academic Rep vs Overall Rank Plus Social Mobility

The post is by editor John Willingham.

Yes, the title of this post is a mouthful. For years now, I have kept an updated list of the departmental rankings that U.S. News publishes so that I can add them to the biannual profiles I do of honors programs. When the 2020 rankings came out, I wanted to see whether there was any clear relationship between the departmental scores and the academic reputation scores. Then I compared the latest reputation scores with those published in 2015 to see how much had changed. Finally, the table below also includes changes in university rankings and the most recent rankings for social mobility.

(I would welcome comments on this post. Please email editor@publicuniversityhonors.com.)

It appears that the social mobility metric has had some impact, especially where a university's social mobility ranking is very strong, as in the case of many UC campuses and Florida institutions. There is no clear relationship between departmental scores and academic reputation scores. Departmental rankings do have a modest relationship to the overall U.S. News rankings, but there are many inconsistencies. Academic reputation scores do seem to show some “grade inflation” since 2015; often this is the case even when the U.S. News ranking has dropped significantly.

The table below includes data for 100 public and private universities.

The cumulative rankings that I do for 15 academic disciplines require some explanation. For most departments, U.S. News ranks only graduate programs. Here are the disciplines for which I have cumulative departmental rankings, using the most recent data (2018): biological sciences; business (undergrad); chemistry; computer science; earth sciences; economics; education; engineering (undergrad); English; history; mathematics; physics; political science; psychology; and sociology.

Not every university has a ranked department in each of the 15 disciplines. I averaged departmental rankings for every university that had at least six ranked departments. For universities with, say, fewer than 12 ranked departments, the average will look artificially strong, because only the best departments are ranked and I cannot include unranked departments. Most universities have 12-15 ranked departments, so the overall average is more useful for them. Some of the universities with a small number of ranked departments are specialized, such as Georgia Tech and Caltech; for those schools, a strong result based on only six or seven ranked departments is not misleading.
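For readers who want to see the averaging rule spelled out, here is a minimal sketch in Python. The department ranks in the example are hypothetical placeholders, not actual U.S. News figures; only the at-least-six-ranked-departments rule comes from the method described above.

```python
# Minimal sketch of the cumulative departmental ranking described above.
# The ranks below are hypothetical placeholders; only the rule of requiring
# at least six ranked departments reflects the method used in this post.

def average_dept_rank(dept_ranks, min_ranked=6):
    """Mean rank across a university's ranked departments,
    or None if fewer than `min_ranked` departments are ranked."""
    ranked = [rank for rank in dept_ranks.values() if rank is not None]
    if len(ranked) < min_ranked:
        return None
    return sum(ranked) / len(ranked)

# Example: a university ranked in 7 of the 15 disciplines (None = unranked).
example = {
    "biological sciences": 12, "business": None, "chemistry": 8,
    "computer science": 5, "earth sciences": None, "economics": 20,
    "education": None, "engineering": 3, "English": None, "history": None,
    "mathematics": 10, "physics": 7, "political science": None,
    "psychology": None, "sociology": None,
}
print(round(average_dept_rank(example), 2))  # 9.29
```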

Universities with fewer than 10 departmental rankings: Colorado School of Mines; Georgia Tech; Miami Ohio; American; Brigham Young; Caltech; Dartmouth; Drexel; Fordham; Georgetown; and RPI.

It should be said that universities with relatively low departmental rankings can legitimately receive high overall rankings because of other meaningful factors, such as graduation and retention rates and class size. Some excellent universities do not have an especially strong research focus or a large number of graduate programs. Dartmouth is one prominent example.

UNIVERSITY NAME | Avg Dept Rank, 15 Disciplines (2018) | Dept Rank (Ordinal) | Rep Score 2020 | Rep Rank 2020 | Rep Score 2015 | Rep Dif (2020 v 2015) | U.S. News Rank 2020 | Rank Dif (2015-2020) | 2020 Rank, Soc Mobility
Harvard5.7164.914.9020186
Stanford1.9314.914.9061241
MIT2.7324.914.9034241
Princeton5.3854.914.80.110186
Yale10.9294.854.8030285
UC Berkeley3.224.764.7022-270
Columbia10.2384.764.60.131138
Caltech4.7144.764.60.112-2345
Johns Hopkins21.93194.764.50.2102241
Chicago11.67114.664.606-2335
Cornell13.79134.664.50.117-2224
Penn16.73154.664.40.262241
Duke20.23174.5134.40.110-2254
Brown27.62284.5134.40.1142224
Michigan9.474.5134.40.1254291
Northwestern17.86164.5134.30.294251
Dartmouth51.38574.4174.20.212-1303
UCLA10.8694.3184.20.120313
Carnegie Mellon27.73294.3184.20.1250303
Georgia Tech33.7374.3184.20.1297224
Vanderbilt35.57404.3184.10.217-1291
Virginia27.4274.2224.3-0.128-5328
Rice31.92334.22240.2172204
Georgetown53.75614.22240.224-3241
Notre Dame45.43474.2223.90.315-1322
North Carolina23.79214.1264.10291165
UW Madison12.93124.1264.10461297
WUSTL32.29344.12640.119-5381
Emory45.82494.12640.121-1200
UT Austin14.47144.12640.1485134
NYU25234313.80.2293115
Illinois20.07173.9324.1-0.248-6186
Washington 22.2203.9324-162-14176
USC35.27393.9323.9022-3147
UC Davis28.14303.9323.80.139-19
UC San Diego25.93243.9323.80.137021
William & Mary69363.8373.70.140-7354
Ohio St26.4253.8373.70.1540254
Purdue40.27413.8373.60.2575270
Tufts73.8783.8373.60.229-2328
UC Irvine32.53353.8373.60.23663
Florida48.67523.8373.60.2341434
Penn State27.27263.7433.60.157-9348
Maryland28.8313.7433.60.164-2322
Minnesota24.2223.7433.60.170-1251
Boston College50.27543.7433.60.137-6270
Texas A&M41.6423.7433.60.170-296
Indiana29.93323.7433.60.179-3303
Case Western72.91773.7433.50.240-2214
Boston Univ48.67523.7433.50.2402270
Colorado 33.2363.7433.50.2104-16359
Virginia Tech52.31603.7433.40.374-3322
Wake Forest98.75933.6533.50.1270360
Brandeis63.92683.6533.50.140-5138
UC Santa Barbara35.21383.6533.50.13469
Arizona43443.6533.50.11173195
Georgia 63653.6533.40.25013159
Tulane90.77893.6533.40.24013365
Pitt45.4463.6533.40.2575335
George Washington76.92833.5603.5070-19322
Iowa50.27543.5603.5084-13335
Michigan St42.13433.5603.50841241
RPI62623.5603.40.1402270
Rochester52593.5603.40.1294159
Col School of Mines74.83793.5603.30.2844303
U of Miami85.69873.5603.20.357-9270
Northeastern67.85723.5603.20.3402254
Rutgers43.87453.4683.4062859
Syracuse69.33753.4683.30.154490
Oregon51.43573.4683.30.11042214
Kansas63.87673.4683.30.1130-24377
UMass Amherst48.57513.4683.20.26412186
Arizona St45.67483.4683.20.211712147
Clemson89.6883.4683.20.270-8348
Lehigh106.67983.3763.3050-10270
Stony Brook46.46503.3763.20.191-324
Iowa St50.27543.3763.20.1121-15270
Connecticut69.47763.3763.10.264-6265
Auburn94.36923.3763.10.2107-4380
Tennessee76.77813.3763.10.21042138
SMU109.6993.37630.364-6360
Florida St68.8733.37630.3573880
Missouri76.87813.2833.3-0.1139-40354
Baylor103.09943.2833.2079-8297
American105.83963.2833.10.177-6176
Delaware76.54803.2833.10.191-15360
Miami Oh94.11913.2833.10.191-15369
NC State67.09703.2833.10.18411224
Nebraska67.33713.2833.10.1139-40303
Brigham Young80.22843.28330.277-15291
Utah60.87633.28330.210425186
Fordham105.83963.1923.2-0.174-16351
UC Riverside64.33693.1923.1091221
Alabama124.911003.19230.1153-65377
UC Santa Cruz59.71623.19230.18412
Drexel105953.19230.197-2270
Oklahoma83.4853.19230.1132-26328
Washington St84.5863.19230.1166-28176
George Mason93.67903.19230.1153-25125
UIC63.53663100301321714
MEAN SCORES/RANKS49.9083503.77247.713.6610.10256.96-2.51229.38


What Are the Differences Between an Honors and a Non-Honors Undergraduate Education?

At last, there is a major study that goes a long way toward answering this important question.

Dr. Art Spisak

Making good use of the increasing data now available on honors programs and their parent institutions, two honors researchers have recently published a major paper that compares honors students and non-honors students from 19 public research universities. Of the 119,000 total students in the sample, 15,200 were or had been participants in an honors program.

The study is extremely helpful to parents and prospective honors students who rightly ask how an honors education differs from a non-honors education: How will participation in an honors program shape and differentiate an honors student? Will an honors education be the equivalent of an education at a more prestigious private college?

The authors of the study are Dr. Andrew Cognard-Black of St. Mary’s College of Maryland and Dr. Art Spisak, Director of the University of Iowa Honors Program and former president of the National Collegiate Honors Council (NCHC). The title of their paper, published in the Journal of the National Collegiate Honors Council, is “Honors and Non-Honors Students in Public Research Universities in the United States.”

Dr. Andrew Cognard-Black

Here are the major findings:

Feelings about the undergraduate experience: “In their undergraduate experience, students in the honors group reported a more positive experience, on average, than those in the non-honors group.” Both groups attended classes with similar frequency, but honors students reported greater activity in the following areas:

  1. finding coursework so interesting that they do more work than is required;
  2. communicating with profs outside of class;
  3. working with faculty in activities other than coursework;
  4. increasing effort in response to higher standards;
  5. completing assigned reading;
  6. attending to self care, eating, and sleeping;
  7. spending more time studying;
  8. performing more community service and volunteer work;
  9. participating in student organizations;
  10. and, while spending about the same time in employment, finding on-campus employment more frequently than non-honors students.

Participation in “high-impact” activities: These experiences contribute to undergraduate success and satisfaction as well as to higher achievement after graduation. Some of these are restricted to upperclassmen, so the study concentrated on participation by seniors in high-impact activities, including undergraduate research, senior capstone or thesis, collaborating with a professor on a project or paper, studying abroad, or serving in a position of leadership.

“Those [students] in the honors student segment of the senior sample had markedly higher cumulative college grade point averages.” The cumulative GPA of the honors group was 3.65; for the non-honors group it was 3.31. “A grade point average of 3.31 is located at the 38th percentile in the overall distribution within the study sample, and a grade point average of 3.65 is at the 69th percentile.” The authors found this sizable difference “particularly impressive” given that the high school GPAs of honors and non-honors students did not differ nearly as much. Honors students were also 14% more likely to have served as an officer in a campus organization.

Students in the honors group were 77% more likely to have assisted faculty in research projects, 85% more likely to have studied abroad, and 2.5 times more likely to have conducted undergraduate research under faculty guidance.

Intellectual curiosity: Honors students expressed a statistically significant but not dramatically greater degree of intellectual curiosity; however, their intellectual curiosity was aligned with the “prestige” of an academic major. The study did not measure whether this attachment to prestige reflected a desire for greater intellectual challenge or for higher salaries associated with many such majors. (Or both.) Both groups placed similar emphasis on the importance of high pay after graduation and on career fulfillment.

Diversity: The study found that African American students were only 52% as likely to be in an honors program as they are to be in the larger university sample. Latin American students were 58% as likely. These figures may be due in part to the fact that, as a group, the 19 research universities “are located in states that are somewhat more white than the nation as a whole, but most of the discrepancy can be attributed to the fact that Research 1 universities do not, in general, have enrollments that are especially representative of ethnic and racial minorities.” On the other hand, LGBQ, transgender, and gender-questioning students “appear to be slightly over-represented among honors students.”

Low-income and first-generation participation: These students “are significantly and substantially under-represented in the honors group.” Pell Grant recipients are 30% less likely to be in the honors group than in the non-honors group, and first-generation students are 40% less likely to be in the honors group.

Test scores and HSGPA: There was a difference between honors and non-honors students, but it was not dramatic. “Regardless of which test score was used, the honors group had scores that were about 10% higher, on average.” (In our ratings of honors programs, we have found that honors test scores were about 17% higher, based on actual honors scores and the mid-range of test scores in U.S. News rankings.) The average high school GPA for the honors group was .11 points higher than for the non-honors group.

The study used data from the 2018 Student Experience in the Research University (SERU) survey. Although the study drew only on Research 1 universities, which make up just 3% of all colleges and universities in the nation, R1 universities enroll 28.5% of all undergraduates pursuing four-year degrees.

Research centered on honors education is increasingly important: An estimated 300,000-400,000 honors students are enrolled in American colleges and universities today.


Here’s Why We Don’t Use Test Scores in Rating Honors Programs

The following post is from site editor John Willingham.

In the aftermath of the “Varsity Blues” college admissions scandal that included cheating on entrance exams, three social scientists recently weighed in on the continued importance of those same examinations, arguing that “No one likes the SAT” but “It’s still the fairest thing about admissions.”

“It has become a mantra in some quarters to assert that standardized tests measure wealth more than intellectual ability or academic potential, but this is not actually the case. These tests clearly assess verbal and mathematical skills, which a century of psychological science shows are not mere reflections of upbringing. Research has consistently found that ability tests like the SAT and the ACT are strongly predictive of success in college and beyond, even after accounting for a student’s socioeconomic status.”

For years, U.S. News has used test scores and selection rates as ranking data for the annual “Best Colleges” report. The publication has slightly reduced the impact of test scores in recent editions.

Below I will explain why we do not include test scores as a metric and argue that, for honors and non-honors students, other factors are more important in predicting success. (High school GPA is certainly a major factor; but since almost all honors students have high GPAs, I do not discuss the impact of GPA in this post.)

In their published scholarly work, the authors argue that test scores by themselves correlate very strongly (r = -.892) with the annual U.S. News Best Colleges rankings for national universities, even though the test scores count for only 7.75 percent of the total ranking score. (The authors do not cite the impact of test scores on other ranking factors, such as graduation and retention rates, which together account for 22 percent of the total ranking score.)

Our own work for the past eight years, however, shows that test scores do not have a similar correlation to quantitative assessments of honors programs. In our publications we list minimum and average admissions test scores for all programs we rate, but we do not count the scores alone as a rating factor.

Here’s why we do not use test scores as a measure: The factors that make for an excellent honors program are primarily structural. The major building blocks are the credits required for honors completion; the number of honors class sections offered, by type and academic discipline; the availability of priority registration and honors housing; the size of honors class sections; and the number of staff to assist students.

So, don’t test scores drive the university graduation rates of honors program entrants, just as they do at elite colleges? Not so much; the correlation is r = .50.
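For anyone who wants to check a figure like the r = .50 correlation above against their own data, here is a minimal sketch. The scores and graduation rates below are made-up placeholders, not the program data cited in this post.

```python
# Minimal sketch: Pearson correlation between average test scores and the
# university graduation rates of honors entrants. The values are illustrative
# placeholders only, not the data behind the r = .50 figure cited above.
import statistics

def pearson_r(xs, ys):
    mean_x, mean_y = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = sum((x - mean_x) ** 2 for x in xs) ** 0.5
    sd_y = sum((y - mean_y) ** 2 for y in ys) ** 0.5
    return cov / (sd_x * sd_y)

sat_means = [1290, 1340, 1380, 1410, 1450, 1490]  # hypothetical program means
grad_rates = [82, 88, 85, 91, 89, 93]             # hypothetical grad rates (%)
print(round(pearson_r(sat_means, grad_rates), 2))
```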

Admittedly, it is probably difficult for a student with, say, a 1050 SAT score to succeed in an elite college or in most honors programs. But within a fairly large range of SAT scores (~1280-1510), the opportunities for success are more often present given a conducive structure. With every biannual review of honors data, I find great pleasure in discovering outstanding honors programs that are not housed in highly-ranked and extremely selective universities. The golden nuggets of excellence in higher education are scattered much farther and wider than many would have us believe.

I am strongly opposed to the numerical ranking of colleges or their honors programs, whether or not test scores are included in the methodology. I ranked honors programs one time, in 2012, and regret doing so. Yes, I have data that allow me to differentiate numerically the total rating scores earned by honors programs. But anyone who wants to provide some kind of assessment of colleges or programs needs to do so with the assumption that their methodology is subjective and imperfect. Ordinal rankings based on distinctions of one point or fractions of a point give readers a veneer of certitude that a qualitative difference exists even when it (often) does not.

Although we do not rank honors programs, we do place them in one of five rating groups, a process that is similar to rating films on a five-star basis but based on quantitative rather than completely subjective data. The seven honors programs in the top group in 2018 (out of 41) had average SAT scores (enrolled students) ranging from 1280 to 1490, a sizable range.

Honors completion rates are something of an issue these days. An honors completion rate is the percentage of first year honors entrants who complete at least one honors program graduation requirement by the time of graduation from the university. About 42 percent of honors students do not complete honors requirements before graduation, although a very high percentage of honors entrants (87 percent) do graduate from the university.

The seven honors programs with honors completion rates of 75 percent or higher in our 2018 ratings had average SAT scores ranging from 1340 to 1510; the mean for this group was 1420. The mean SAT for the 31 (of 41) programs that provided completion rates was 1405, not much lower. And another seven programs with mean SAT scores of 1420 or higher had completion rates below 58 percent, the group mean.

The mean SAT score for all 41 rated programs was 1407; the mean SAT for the top seven programs was only one point higher at 1408.

It is clear, at least with respect to honors programs, that average SAT scores are not the best predictors of program effectiveness. What does this mean for the value of test scores nationwide, if anything?

I think it means that for students in the 1280 to 1500 SAT range, success depends as much on mentoring, smaller interdisciplinary sections, student engagement, course availability, community (including housing), and advising support as it does on test scores, if not more.

The good news is that high levels of achievement are accessible even to students who are not in honors programs and who do not begin college with extremely high test scores, although non-honors students will probably have to assert themselves more in order to benefit from the strongest attributes of their university.


Academic Reputation Rankings for 155 National Universities (and What That Means for Honors)

Editor’s Note: We hope to update this post before the end of September 2019. The list appears after the introductory section. The list was current as of September 25, 2018.

In a previous post, Based on Academic Reputation Alone, Publics Would Be Higher in U.S. News Rankings, we write that many public universities have a reputation in the academic community that is much higher than their overall ranking by U.S. News. In this post, we will summarize the reasons that prospective honors students and their parents might consider paying more attention to academic reputation than to other factors in the oft-cited rankings. The list also facilitates comparisons of public and private universities.

First, these are factors to consider if the state university’s academic reputation is much stronger than its overall ranking:

1. The overall rankings penalize public universities for their typically larger class sizes, but the average honors class size in our most recent study of honors programs is 24.9 students, much smaller than the average class size for the universities as a whole. Many of these honors classes are at the lower-division level, where large classes are otherwise the norm. First-year honors seminars and honors-only classes average 17.5 students per section. Result: the relatively poor rating the whole university might receive for class size is offset for honors students.

2. The overall rankings hit some public universities hard for having relatively low retention and graduation percentages, but freshman retention rates in honors programs are in the 90% range and higher, and six-year graduation rates for honors entrants average 87%, much higher than the average rates for the universities as a whole. Result: the lower rates for the universities as a whole are offset for honors students.

3. All public universities suffer in the overall rankings because U.S. News assigns ranking points for both the wealth of the university as a whole and for the impact that wealth has on professors’ salaries, smaller class sizes, etc.  This is a double whammy in its consideration of inputs and outputs separately; only the outputs should be rated.  Result: the outputs for class size (see above) are offset for honors students, and the wealth of the university as an input should not be considered in the first place.

4. For highly-qualified students interested in graduate or professional school, academic reputation and the ability to work with outstanding research faculty are big advantages. Honors students have enhanced opportunities to work with outstanding faculty members even in large research universities, many of which are likely to have strong departmental rankings in the student’s subject area.  Result: honors students are not penalized for the research focus of public research universities; instead, they benefit from it.

5. Many wealthy private elites are generous in funding all, or most, need-based aid, but increasingly offer little or no merit aid. This means that families might receive all the need-based aid they “deserve” according to a federal or institutional calculation and still face annual college costs of $16,000 to $50,000. On the other hand, national scholars and other highly-qualified students can still receive significant merit aid at most public universities. Result: if a public university has an academic reputation equal to that of a wealthy private elite, an honors student could be better off financially and not suffer academically in a public honors program.

But…what if the academic reputation of the public university is lower than that of a private school under consideration?  In this case, the public honors option should offer the following offsets:

1. The net cost advantage of the public university, including merit aid, probably needs to be significant.

2. It is extremely important to evaluate the specific components of the honors program to determine whether it provides a major “value-added” advantage: is it, relatively speaking, better than the university as a whole? Often, the answer will be yes. To determine how much better, look at the academic disciplines covered by the honors program, the actual class sizes, retention and graduation rates, research opportunities, and even honors housing and perks, such as priority registration.

The scores below are on a 5.0 scale, and there are many ties. We have included national universities with reputation scores between 2.7 and 4.9.

University Acad Rep Ranking
Princeton 4.9 1
Harvard 4.9 1
Stanford 4.9 1
MIT 4.9 1
Yale 4.7 5
Columbia 4.7 5
Caltech 4.7 5
UC Berkeley 4.7 5
Chicago 4.6 9
Johns Hopkins 4.6 9
Cornell 4.6 9
Penn 4.5 12
Duke 4.5 12
Northwestern 4.4 14
Brown 4.4 14
Michigan 4.4 14
Dartmouth 4.3 17
Carnegie Mellon 4.3 17
UCLA 4.3 17
Georgia Tech 4.3 17
Vanderbilt 4.2 21
Virginia 4.2 21
Washington Univ 4.1 23
Rice 4.1 23
Notre Dame 4.1 23
Emory 4.1 23
Georgetown 4.1 23
North Carolina 4.1 23
UT Austin 4.1 23
USC 4 30
UW Madison 4 30
NYU 3.9 32
UC Davis 3.9 32
Illinois 3.9 32
Washington 3.9 32
William & Mary 3.8 36
UC San Diego 3.8 36
Ohio St 3.8 36
Purdue 3.8 36
Tufts 3.7 40
Case Western 3.7 40
UC Irvine 3.7 40
Penn State 3.7 40
Florida 3.7 40
Maryland 3.7 40
Minnesota 3.7 40
Wake Forest 3.6 47
Boston College 3.6 47
Brandeis 3.6 47
Boston Univ 3.6 47
UC Santa Barbara 3.6 47
Georgia 3.6 47
Texas A&M 3.6 47
Indiana 3.6 47
Colorado 3.6 47
Arizona 3.6 47
RPI 3.5 57
Tulane 3.5 57
George Washington 3.5 57
Pitt 3.5 57
Virginia Tech 3.5 57
Iowa 3.5 57
Michigan St 3.5 57
Rochester 3.4 64
U of Miami 3.4 64
Northeastern 3.4 64
Rutgers 3.4 64
Col School of Mines 3.4 64
UMass Amherst 3.4 64
Arizona St 3.4 64
Pepperdine 3.3 71
Syracuse 3.3 71
RIT 3.3 71
Connecticut 3.3 71
Clemson 3.3 71
Auburn 3.3 71
Stony Brook 3.3 71
Iowa St 3.3 71
Oregon 3.3 71
Kansas 3.3 71
Lehigh 3.2 81
Villanova 3.2 81
SMU 3.2 81
American 3.2 81
Delaware 3.2 81
Miami Oh 3.2 81
Alabama 3.2 81
Florida St 3.2 81
NC State 3.2 81
Missouri 3.2 81
Tennessee 3.2 81
Fordham 3.1 92
Brigham Young 3.1 92
Baylor 3.1 92
Pacific 3.1 92
Drexel 3.1 92
UC Santa Cruz 3.1 92
Oklahoma 3.1 92
Nebraska 3.1 92
South Carolina 3.1 92
UC Riverside 3.1 92
Kentucky 3.1 92
George Mason 3.1 92
Utah 3.1 92
WPI 3 105
Marquette 3 105
Loyola Chicago 3 105
Howard 3 105
Binghamton 3 105
Vermont 3 105
UI Chicago 3 105
Univ at Buffalo 3 105
Colorado St 3 105
Temple 3 105
Kansas St 3 105
Clark 2.9 116
Denver 2.9 116
San Diego 2.9 116
DePaul 2.9 116
St. Louis 2.9 116
New Hampshire 2.9 116
Arkansas 2.9 116
Mississippi 2.9 116
San Diego St 2.9 116
Seton Hall 2.9 116
LSU 2.9 116
Yeshiva 2.8 127
Stevens Inst Tech 2.8 127
New School 2.8 127
Hofstra 2.8 127
TCU 2.8 127
St. John’s 2.8 127
Illinois Tech 2.8 127
Texas Tech 2.8 127
UAB 2.8 127
USF 2.8 127
VCU 2.8 127
UT Dallas 2.8 127
New Mexico 2.8 127
Univ at Albany 2.8 127
UMBC 2.8 127
Cincinnati 2.8 127
URI 2.8 127
Tulsa 2.7 145
Catholic 2.7 145
Clarkson 2.7 145
Michigan Tech 2.7 145
UCF 2.7 145
Georgia State 2.7 145
NJIT 2.7 145
Idaho 2.7 145
UNC Charlotte 2.7 145
UC Merced 2.7 145
Hawaii Manoa 2.7 145

Honors Completion Rates: A Statistical Summary

Editor’s Note: This is the third and final post in our series on honors program completion rates.

In the first post, we wrote about the hybrid structure of honors programs and how that can affect honors completion rates. An honors completion rate is the percentage of honors students who complete all honors course requirements for at least one option by the time they graduate. The second post presented a tentative formula for evaluating honors completion rates.

This post has two parts. The first compares the honors completion rates of main option and multiple option honors programs; the second compares the completion rates of honors colleges and honors programs.

Main option programs emphasize only one curriculum completion path, usually requiring more than 30 honors credits and often an honors thesis as well. Multiple option programs offer two or more completion paths for first-year students. One option might require 24 honors credits; another might require 15-16 credits. Either of these might also require a thesis.

Many universities are now establishing honors colleges. These usually have a dean and a designated staff of advisors. They typically provide at least enough honors housing space for first-year students. Some began as honors programs and then re-formed into honors colleges. Quite a few honors colleges have significant endowments.

Honors programs do not have a dean, but are administered by a director and staff. Sometimes there are few real differences between honors colleges and programs. In general, however, honors colleges have more staff and offer more access to honors housing.

We received data from 23 honors colleges and eight honors programs, with a combined enrollment of more than 64,000 honors students. The 31 parent universities had an average U.S. News ranking of 126, ranging from the low 50s to higher than 200.

The first summary is below:

PART ONE: SUMMARY STATISTICS
MAIN OPTION PROGRAMS VS MULTIPLE OPTION PROGRAMS

MEASURE | ALL PROGRAMS | MAIN OPTION | MULTI OPTION
NO. OF PROGRAMS | 31 | 15 | 16
NO. OF HONORS STUDENTS | 64287 | 27688 | 36599
AVG PROGRAM SIZE | 2073.8 | 1845.9 | 2287.4
COMPLETION RATE (%) | 57.9 | 67.8 | 48.6
UNIVERSITY GRAD RATE (%) | 67.2 | 68.7 | 65.6
UNIV GRAD RATE MINUS COMPLETION RATE | 9.3 | 0.9 | 17.0
HONORS GRAD RATE (%) | 86.9 | 88.7 | 85.2
HONORS GRAD RATE MINUS COMPLETION RATE | 29.0 | 20.9 | 36.6
HONORS GRAD RATE MINUS UNIV GRAD RATE | 19.7 | 20.0 | 19.6
FRESHMAN RETENTION (%) | 86.7 | 87.7 | 85.6
TEST SCORES ADJ TO SAT | 1405.6 | 1416.9 | 1395.0
CURRICULUM REQUIREMENT AVG (CREDITS) | 27.0 | 31.8 | 22.1
AVG CLASS SIZE | 24.2 | 24.0 | 24.4
THESIS OPTION Y/N | 27/4 | 11/4 | 16/0
THESIS REQ ALL OPTIONS Y/N | 14/31 | 10/5 | 4/16
DORM ROOMS / FR & SOPH | 0.53 | 0.57 | 0.48
HON CLASS SEATS / HON STUDENTS | 1.29 | 1.49 | 1.11
APPLY SEP TO HONORS Y/N | 23/8 | 12/3 | 11/5

The second summary, comparing honors colleges and honors programs, is below:

PART TWO: SUMMARY STATISTICS
HONORS COLLEGES VS HONORS PROGRAMS

MEASURE | HONORS COLLEGES | HONORS PROGRAMS
NO. OF PROGRAMS | 23 | 8
NO. OF HONORS STUDENTS | 52771 | 11516
AVG PROGRAM SIZE | 2294.39 | 1439.5
COMPLETION RATE (%) | 54.8 | 66.7
UNIVERSITY GRAD RATE (%) | 64.7 | 74.5
UNIV GRAD RATE MINUS COMPLETION RATE | 9.9 | 7.8
HONORS GRAD RATE (%) | 85.5 | 91.0
HONORS GRAD RATE MINUS COMPLETION RATE | 30.7 | 24.3
HONORS GRAD RATE MINUS UNIV GRAD RATE | 20.8 | 16.5
FRESHMAN RETENTION (%) | 85.6 | 89.5
TEST SCORES ADJ TO SAT | 1394.6 | 1437.3
CURRICULUM REQUIREMENT AVG (CREDITS) | 26.0 | 29.75
AVG CLASS SIZE | 25.0 | 22.0
MAIN OPTION PROGRAMS | 10 | 6
MULTIPLE OPTION PROGRAMS | 13 | 2
THESIS OPTION Y/N | 20/23 | 6/8
THESIS REQ ALL OPTIONS Y/N | 10/23 | 3/8
DORM ROOMS / FR & SOPH | .55 | .46
HON CLASS SEATS / HON STUDENTS | 1.25 | 1.44
APPLY SEP TO HONORS Y/N | 17/23 | 5/8

Here is a Formula for Evaluating Honors Completion Rates

Honors completion rates, as we noted in a previous post, are a complicated issue. They represent the percentage of students who enter an honors program and then complete all honors requirements for at least one completion option by the time they graduate.

They are related to university freshman retention rates and university graduation rates, but in order to evaluate them there must be some workable baseline completion rate derived from a significant sample of programs.

Honors deans and directors at 31 public university honors programs contributed the data used to calculate the values in the next paragraph, along with extensive additional data we use in rating honors programs. The 31 programs enrolled more than 64,000 honors students in Fall 2017. At some point we might include completion rates as a metric; if we do, then this formula, or an improved version, might be used.

This tentative formula takes into account (1) the average (mean) honors completion rate for the whole data set (57.88 percent); (2) the mean university-wide freshman retention rate for the whole data set (86.81 percent); (3) the completion rate of each program; (4) the freshman retention rate for the parent university of each program; and (5) the graduation rate of each university.

The formula assumes that a desirable target honors completion rate should at least equal the midway point between the university graduation rate and the adjusted honors completion rate. (See examples below, however, for programs that have honors completion rates that exceed the university graduation rate.) The formula can easily be changed to include lower or higher target levels by increasing or reducing the divisor.

H = the mean honors completion rate for the data set;

F = the mean freshman retention rate for the data set;

P = the program completion rate;

C = the completion rate adjusted to the university freshman retention rate, computed as (H/F) * R (about .67 * R for this data set);

R = the freshman retention rate of each parent university;

G = the graduation rate of each parent university;

T = the estimated target completion rate after the formula is applied. T = (G + C) /2. This is an estimate of what the minimum completion rate should be, given the university’s freshman retention rate and graduation rate, and the mean completion rate and mean freshman retention rate for this data set. Other data sets would of course have different data, but the formula could still be applied.

The completion rates of ten programs exceeded the graduation rates of their parent universities.

Here is the formula, where P = 61%; R = 92%; G = 83%:

First step is to compute H/F, or 57.88 / 86.81. The result is .67. This is a constant for this data set.

Second step is to adjust the completion rate in relation to the university freshman retention rate: C = .67 * R, or .67 * 92. The result is 61.64 (C), a bit higher than the actual program completion rate of 61.0 (P), because of the relatively high freshman retention rate.

Third step is to adjust the completion rate C in relation to the university graduation rate in order to calculate the target completion rate. T = (G + C) /2, or (83 + 61.64) /2 = 72.32 (T).

Fourth step is to calculate P – T, which would be 61.00 – 72.32 = –11.32. This step calculates the extent to which the program completion rate varies from the estimated target rate. The program is performing below the estimated target rate. The relatively high university graduation rate is the main reason.

More examples:

Honors program A had a program completion rate (P) of 84%, a freshman retention rate (R) of 88%, and a university graduation rate (G) of 73%. The C rate would be .67 * 88, or 58.96. The T calculation would be (G + C) / 2, or (73 + 58.96) / 2 = 65.98 (T). Now calculate P – T (84 – 65.98) = +18.02. This program is performing far above its estimated target rate.

Honors program B had the same program completion rate (P) of 84% but a much higher freshman retention rate (R) of 95%, and a university graduation rate (G) of 81%. The C value would be .67 * 95, or 63.65, and T would be (G + C) / 2, or (81 + 63.65) / 2 = 72.325. When we calculate P – T (84 – 72.325), the result is +11.675. This program is performing well above its estimated target rate, but even with the same completion rate as Program A, the higher graduation and freshman retention rates for Program B cause its relative performance rating to be lower than Program A's. In other words, the expectations were higher for Program B. Both programs are exceptional in that their honors completion rates exceed their university graduation rates.

Honors program D had a program completion rate (P) of 40%, a freshman retention rate (R) of 82%, and a university graduation rate (G) of 53%. C would be .67 * 82, or 54.94. T would be (G + C) / 2, or (53 + 54.94) / 2 = 53.97. Calculating P – T, the result is 40 – 53.97, or -13.97. Program D is significantly underperforming based on the formula.
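To pull the four steps together, here is a minimal sketch of the formula in code. The function name is mine; H and F are the data-set means given above, and the constant is rounded to .67 exactly as in the worked examples.

```python
# Minimal sketch of the target completion rate formula described above.
# H and F are the data-set means cited in the post; the constant is rounded
# to .67 to match the worked examples. The function name is illustrative.

H = 57.88  # mean honors completion rate for the data set (%)
F = 86.81  # mean freshman retention rate for the data set (%)

def completion_gap(P, R, G):
    """Return (T, P - T): the estimated target completion rate and how far
    the program's actual completion rate falls above or below it."""
    constant = round(H / F, 2)   # .67 for this data set
    C = constant * R             # completion rate adjusted to freshman retention
    T = (G + C) / 2              # target: midway between grad rate and C
    return T, P - T

# Worked example from above: P = 61%, R = 92%, G = 83%
T, gap = completion_gap(61, 92, 83)
print(round(T, 2), round(gap, 2))   # 72.32 -11.32

# Program A from the examples above: P = 84%, R = 88%, G = 73%
T, gap = completion_gap(84, 88, 73)
print(round(T, 2), round(gap, 2))   # 65.98 18.02
```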


U.S. News Rankings for 57 Leading Universities, 1983–2007

Below are the U.S. News rankings from 1983 through 2007 for 57 leading national universities. For additional U.S. News rankings, please see U.S. News Rankings, 2008 through 2015, and Average U.S. News Rankings for 129 National Universities, 2011 to 2018.

Especially notable in the list below are the changes in major public universities.

Included here are institutions that were, at some point, ranked in the top 50 in those two categories. Some values are blank because in those years the magazine did not give individual rankings to every institution, instead listing them in large groups described as “quartiles” or “tiers.” The rankings shown for 1983 and 1985 are the ones that U.S. News published in its magazine in those same years. For all subsequent years, the rankings come from U.S. News’s separate annual publication “America’s Best Colleges”, which applies rankings for the upcoming year.

Here is the list:

Year 83 85 88 89 90 91 92 93 94 95 96 97 98 99 00 01 02 03 04 05 06 07
Stanford University 1 1 1 6 6 2 3 4 6 5 4 6 5 4 6 6 5 4 5 5 5 4
Harvard University 2 2 2 4 3 1 1 1 1 1 1 3 1 1 2 2 2 2 1 1 1 2
Yale University 3 2 3 1 1 3 2 3 3 3 2 1 3 1 4 2 2 2 3 3 3 3
Princeton University 4 4 4 2 2 4 4 2 2 2 2 2 1 1 4 1 1 1 1 1 1 1
University of California at Berkeley 5 7 5 24 13 13 16 16 19 23 26 27 23 22 20 20 20 20 21 21 20 21
University of Chicago 6 5 8 10 9 11 10 9 9 10 11 12 14 14 13 10 9 12 13 14 15 9
University of Michigan at Ann Arbor 7 8 25 17 21 22 24 23 21 24 24 23 25 25 25 25 25 25 22 25 24
Cornell University 8 11 14 11 9 12 11 10 15 13 14 14 6 11 10 14 14 14 14 13 12
University of Illinois at Urbana-Champaign 8 20 45 50 45 42 34 41 36 38 40 37 42 41
Massachusetts Institute of Technology 10 11 5 7 6 6 5 4 4 5 5 6 4 3 5 5 4 4 5 7 4
Dartmouth College 10 10 6 7 8 8 8 7 8 8 7 7 7 10 11 9 9 9 9 9 9 9
California Institute of Technology 12 21 3 4 5 4 5 5 7 7 9 9 9 1 4 4 4 5 8 7 4
Carnegie Mellon University 13 22 24 19 24 24 23 28 23 25 23 23 22 21 23 22 22 21
University of Wisconsin at Madison 13 23 32 41 38 36 34 35 32 31 32 32 34 34
Case Western Reserve University 35 38 37 34 34 38 38 37 37 35 37 38
Tulane University 38 36 34 36 44 45 46 43 44 43 43 44
University of California at Irvine 48 37 41 36 49 41 41 45 45 43 40 44
Rensselaer Polytechnic Institute 39 48 49 49 48 47 48 46 43 42
University of Washington 50 42 44 45 45 47 45 46 45 42
University of Rochester 25 29 30 31 29 32 33 36 36 35 37 34 34
University of California at San Diego 43 34 33 32 32 31 31 31 32 35 32 38
Georgia Institute of Technology 42 48 41 46 40 35 41 38 37 41 37 38
Yeshiva University 45 48 42 44 45 41 40 40 46 45 44
Pennsylvania State University at University Park 41 45 44 40 44 46 45 48 50 48 47
Worcester Polytechnic Institute 48 55 55 53 64
Rutgers University at New Brunswick 45 60 58 60 60
Texas A&M University at College Station 48 48 67 62 60 60
Pepperdine University 49 48 47 51 52 55 54
Syracuse University 49 44 40 47 55 52 50 52
George Washington University 46 50 51 52 53 52
University of Florida 47 49 48 50 50 47
University of California at Santa Barbara 46 47 47 44 45 48 47 45 45 45 47
University of California at Davis 40 40 41 44 42 41 41 43 43 42 48 47
University of Texas at Austin 25 44 49 48 47 53 46 52 47
New York University 36 35 34 35 34 33 32 35 35 32 37 34
Boston College 37 38 38 36 39 38 38 40 40 37 40 34
Emory University 25 22 21 25 16 17 19 9 16 18 18 18 18 18 20 20 18
Vanderbilt University 24 19 25 20 18 22 20 19 20 20 22 21 21 19 18 18 18
Rice University 14 9 10 16 15 12 14 12 16 16 17 18 14 13 12 15 16 17 17 17
Johns Hopkins University 16 11 14 15 11 15 15 22 10 15 14 14 7 15 16 15 14 14 13 16
Brown University 7 10 13 15 12 17 18 12 11 9 8 9 10 14 15 16 17 17 13 15 15
Northwestern University 17 16 19 23 14 13 13 14 13 9 9 10 14 13 12 10 11 11 12 14
Washington University in St. Louis 23 19 22 24 18 20 18 20 20 17 17 16 17 15 14 12 9 11 11 12
Columbia University 18 8 11 10 9 10 11 9 15 11 9 10 10 10 9 10 11 9 9 9
Duke University 6 7 12 5 7 7 7 7 6 6 4 3 6 7 8 8 4 5 5 5 8
University of Notre Dame 18 23 25 19 18 17 19 18 19 19 19 18 19 18 18 20
Georgetown University 17 25 19 19 17 17 25 21 23 21 20 23 23 22 24 23 25 23 23
Lehigh University 33 32 34 36 34 38 38 40 37 37 32 33
Brandeis University 30 29 28 31 31 31 34 31 32 32 34 31
College of William and Mary 22 34 33 32 33 29 30 30 30 31 31 31 31
Wake Forest University 31 25 28 29 28 28 26 25 28 27 27 30
Tufts University 25 22 23 25 29 29 28 28 27 28 27 27
University of Southern California 44 43 41 41 42 35 34 31 30 30 30 27
University of North Carolina at Chapel Hill 9 11 23 18 20 25 27 25 27 24 27 25 28 28 29 29 27 27
University of California at Los Angeles 21 16 17 23 23 22 28 31 28 25 25 25 26 25 26 25 25 26
University of Virginia 15 20 21 18 21 22 21 17 19 21 21 22 22 20 24 23 21 22 23 24
University of Pennsylvania 19 15 20 13 13 14 16 12 11 13 7 6 7 6 5 4 5 4 4 7

Top 25 Universities for Silicon Valley Hires: 17 Are Public

The website Quartz just published a list of the universities that place the highest number of grads at tech firms in Silicon Valley.

“The most coveted jobs are in Silicon Valley, and most selective US universities are members of the Ivy League. So it stands to reason that tech giants like Apple, Google, Amazon and Facebook would scoop up best and brightest from those bastions of power and privilege.

“Think again. None of the eight Ivy League schools—Harvard, Yale, Princeton, Brown, Columbia, Cornell, Dartmouth and the University of Pennsylvania—cracked the top 10 on a list of the universities sending the most graduates to tech firms, according to an analysis by HiringSolved, an online recruiting company. The company used data from more than 10,000 public profiles for tech workers hired or promoted into new positions in 2016 and the first two months of 2017.”

Editor’s note: The HiringSolved link also lists the 10 specific skills most in demand as of 2017, with changes from 2016. For example, the top four skills for entry-level placement in 2017 are Python, C++, Java, and algorithms. The top job titles for entry-level placement in 2017 are Software Engineering Intern, Software Engineer, Business Development Consultant, and Research Intern.

Now let it be said that the 17 public universities in the top 25 are generally much larger than the private institutions on the list, so the sheer volume of highly-trained tech grads from the publics is much larger.

But the final message from Quartz was this:

“If the list tells us anything, it’s that admission to an elite university isn’t a prerequisite for a career in Silicon Valley, and what you know is more important than where you learn it.” [Emphasis added.]

Here are the top 25 universities for Silicon Valley tech placement, in numerical order:

UC Berkeley
Stanford
Carnegie Mellon
USC
UT Austin
Georgia Tech
Illinois
San Jose State
UC San Diego
Arizona State
Michigan
UCLA
NC State
Cal Poly
Cornell
Waterloo (Canada)
Texas A&M
Washington
Purdue
MIT
Santa Clara
Univ of Phoenix*
UC Santa Barbara
UC Davis
Penn State

*Hypothesis: hands-on experience and later degrees?

Money Magazine Best Values 2017: CUNY Baruch, Michigan, UC’s, UVA Lead Publics

The new rankings from Money are out, and public colleges and universities account for 27 of the top 50 best values in 2017. These rankings are likely the best college rankings overall, given their balanced approach.

As Jeffrey J. Selingo writes in the Washington Post, the earnings portion of the rankings is based in part on some very interesting new evidence: the “Chetty data.”

“That refers to Raj Chetty,” Selingo tells us, “a Stanford professor, who has led a team of economists that has received access to millions of anonymous tax records that span generations. The group has published several headline-grabbing studies recently based on the data. In findings published in January, the group tracked students from nearly every college in the country and measured their earnings more than a decade after they left campus, whether they graduated or not.”

Money does a better job of ranking colleges based on “outcomes” than Forbes does (see Outcomes farther down). This is especially the case with the multiple earnings analyses.

To see the list of top publics, please skip the methodology discussion immediately below.


The 2017 rankings include 27 factors in three categories:

Quality of education (1/3 weighting), which was calculated using:

Six-year graduation rate (30%).

Value-added graduation rate (30%). “This is the difference between a school’s actual graduation rate and its expected rate, based on the economic and academic profile of the student body (measured by the percentage of attendees receiving Pell grants, which are given to low-income students, and the average standardized test scores of incoming freshmen).” [Emphasis added.]

“Peer quality (10%). This is measured by the standardized test scores of entering freshman (5%), and the percentage of accepted students who enroll in that college, known as the “yield” rate (5%).” Note: using the yield rate is an improvement over the U.S. News rankings.

“Instructor quality (10%). This is measured by the student-to-faculty ratio.” Note: this is very similar to a U.S. News metric.

“Financial troubles (20%). This is a new factor added in 2017, as financial difficulties can affect the quality of education, and a growing number of schools are facing funding challenges.” Note: although this is not an “outcome” either, it is more meaningful than using data on alumni contributions, etc.

Affordability (1/3 weighting), which was calculated using:

“Net price of a degree (30%). This is the estimated amount a typical freshman starting in 2017 will pay to earn a degree, taking into account the college’s sticker price; how much the school awards in grants and scholarships; and the average time it takes students to graduate from the school, all as reported to the U.S. Department of Education….This takes into account both the estimated average student debt upon graduation (15%) and average amount borrowed through the parent federal PLUS loan programs (5%).

“Student loan repayment and default risk (15%).

“Value-added student loan repayment measures (15%). These are the school’s performance on the student loan repayment and default measures after adjusting for the economic and academic profile of the student body.

Affordability for low-income students (20%). This is based on federally collected data on the net price that students from families earning $0 to $30,000 pay.

Outcomes (1/3 weighting), which was calculated using:

“Graduates’ earnings (12.5%), as reported by alumni to PayScale.com; early career earnings within five years of graduation (7.5%), and mid-career earnings, which are for those whose education stopped at a Bachelor’s degree and graduated, typically, about 15 years ago. (5%).

“Earnings adjusted by majors (15%). To see whether students at a particular school earn more or less than would be expected given the subjects students choose to study, we adjusted PayScale.com’s data for the mix of majors at each school; for early career earnings (10%) and mid-career earnings (5%).

“College Scorecard 10-year earnings (10%). The earnings of federal financial aid recipients at each college as reported to the IRS 10 years after the student started at the college.

“Estimated market value of alumni’s average job skills (10%). Based on a Brookings Institution methodology, we matched up data provided by LinkedIn of the top 25 skills reported by each school’s alumni with Burning Glass Technologies data on the market value each listed skill.

“Value-added earnings (12.5%). To see if a school is helping launch students to better-paying jobs than competitors that take in students with similar academic and economic backgrounds, we adjusted PayScale.com’s earnings data for the student body’s average test scores and the percentage of low-income students at each school; for early career earnings (7.5%) and mid-career earnings (5%).

Job meaning (5%). We used the average score of each school’s alumni on PayScale.com’s survey question of “Does your work make the world a better place?”

“Socio-economic mobility index (20%).

For the first time, we included new data provided by the Equality of Opportunity Project that reveals the percentage of students each school moves from low-income backgrounds to upper-middle-class jobs by the time the student is 34 years old. Finally, we used statistical techniques to turn all the data points into a single score and ranked the schools based on those scores.” [Emphasis added.]
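As an illustration only: Money says it used “statistical techniques” to combine the factors into a single score, and the exact procedure is not described here. A common approach is to standardize each factor and take a weighted sum, as in the sketch below; the school names, factor values, and the z-score step are assumptions, not Money's published method. Only the one-third weighting of the three categories comes from the methodology above.

```python
# Illustrative sketch of a weighted composite score of the kind described
# above. Money's actual procedure is not specified in this post; the z-score
# standardization, school names, and factor values here are assumptions.
import statistics

weights = {"quality": 1 / 3, "affordability": 1 / 3, "outcomes": 1 / 3}

schools = {
    "School A": {"quality": 78, "affordability": 65, "outcomes": 81},
    "School B": {"quality": 84, "affordability": 58, "outcomes": 77},
    "School C": {"quality": 70, "affordability": 72, "outcomes": 69},
}

def composite_scores(schools, weights):
    """Standardize each factor across schools, then combine with the weights."""
    z = {}
    for factor in weights:
        values = [s[factor] for s in schools.values()]
        mean, sd = statistics.mean(values), statistics.pstdev(values)
        z[factor] = {name: (s[factor] - mean) / sd for name, s in schools.items()}
    return {name: sum(weights[f] * z[f][name] for f in weights) for name in schools}

# Rank the schools by composite score, highest first.
ranking = sorted(composite_scores(schools, weights).items(),
                 key=lambda item: item[1], reverse=True)
for place, (name, score) in enumerate(ranking, start=1):
    print(place, name, round(score, 3))
```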

The inclusion of these metrics makes the Money rankings a hybrid of the Washington Monthly “public good” rankings, the U.S. News rankings, and the Kiplinger rankings, with the socio-economic factors having a less significant impact on overall standing than they do in the Washington Monthly rankings. Still, these factors do result in two CUNY campuses’ receiving high rankings.

“The data showed, for example,” Selingo writes, “that the City University of New York propelled almost six times as many low-income students into the middle class and beyond as all eight Ivy League campuses, plus Duke, M.I.T., Stanford and Chicago, combined. The California State University system advanced three times as many.”

TOP PUBLIC UNIVERSITIES, MONEY MAGAZINE, 2017, BY NAME AND OVERALL RANK INCLUDING PRIVATE INSTITUTIONS:

CUNY Baruch College–2
Michigan–3
UC Berkeley–4
UCLA–5
UC Irvine–7
UC Davis–9
Virginia–11
Washington–13
Georgia Tech–16
Florida–18
Maryland–20
Illinois–22
Virginia Tech–23
College of New Jersey–24
UC Riverside–29
Michigan State–30
UT Austin–31
Binghamton–33
Texas A&M–34
UC Santa Barbara–36
Connecticut–37
Purdue–37 (tie)
VMI–41
Cal State Long Beach–42
CUNY Brooklyn–43
UW Madison–45
James Madison–46
Rutgers, New Brunswick–49
NC State–50


Top Honors Programs, Honors Components Only

So, what do we mean by “honors components only”?

In our latest book of honors program ratings, we listed the honors programs and colleges that received an overall five “mortarboard” rating. One component of the rating model used to determine the leading programs is prestigious scholarships: the number of Rhodes, Marshall, Truman, Goldwater, etc., awards earned by students from each university as a whole.

In most cases, honors programs at these universities contribute most of the winners of these awards, but not in all cases. So while the prestigious scholarship component is worth including, we do not want it to override the 12 other rating components used in the ratings. These components are “honors only” because they do not include awards earned by non-honors students of the university as a whole.

Therefore, we decided to do a separate rating, one that is not included in the new book, INSIDE HONORS. The new rating uses only the 12 components listed below. Farther down, you can see whether the prestigious scholarship component had a major impact on the overall ratings of top programs.

Those 12 additional components are…

  • Curriculum Requirements
  • Number of Honors Classes
  • Number of Honors Classes in 15 Key Disciplines
  • Extent of Honors Enrollment
  • Average Class Size, Honors-only Sections
  • Overall Average Class Size, All Sections
  • Honors Graduation Rate-Raw
  • Honors Graduation Rate-Adjusted for Test Scores
  • Student to Staff Ratio
  • Type and Extent of Priority Registration
  • Honors Residence Halls, Amenities
  • Honors Residence Halls, Availability

Below is a comparison of the honors programs that received a five mortarboard OVERALL RATING (left side) and those that receive the same rating for HONORS COMPONENTS ONLY (right side), all listed ALPHABETICALLY.

OVERALL FIVE MORTARBOARDS | HONORS COMPONENTS ONLY, FIVE MORTARBOARDS
Arizona St | Clemson
Clemson | CUNY Macaulay
CUNY Macaulay | Georgia
Georgia | Houston
Houston | Kansas
Kansas | New Jersey Inst Tech
New Jersey Inst Tech | Oregon
Oregon | Penn St
Penn St | South Carolina
South Carolina | Temple
UT Austin | UT Austin

It is notable that the two lists are nearly identical: Arizona State is on the OVERALL list but not on the HONORS COMPONENTS list, while Temple is on the HONORS COMPONENTS list but not on the OVERALL list.

We must add that Temple barely missed a five mortarboard overall rating, while ASU was similarly close to making the honors components list.