Southeast, West Coast Colleges: Top Public Values in Kiplinger Report

The Kiplinger Best Value College Index methodology emphasizes a “quality” side in relation to the “cost” side of a university. The quality side includes selectivity, retention, and four-year grad rates, while the cost side takes tuition, fees, merit aid, need-based aid, and post-graduation debt into account.
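Kiplinger does not publish an exact formula, but the two-sided structure can be illustrated with a short sketch. Everything below–the component names, the weights, and the sample scores–is hypothetical, not Kiplinger's actual model.

```python
# Illustrative sketch of a quality-vs-cost value index in the spirit of the
# Kiplinger methodology. Component names, weights, and the sample scores are
# hypothetical; Kiplinger does not publish its exact formula.

def value_index(school, quality_weight=0.55, cost_weight=0.45):
    """Combine normalized quality and cost components into one 0-100 score."""
    # Quality side: selectivity, retention, four-year grad rate (each 0-100).
    quality = (school["selectivity"] + school["retention"] + school["grad_rate_4yr"]) / 3
    # Cost side: scored so that higher is better (low net price, generous aid,
    # low post-graduation debt all push the score up).
    cost = (school["net_price_score"] + school["aid_score"] + school["low_debt_score"]) / 3
    return quality_weight * quality + cost_weight * cost

# Hypothetical component scores for a strong public value:
sample = {"selectivity": 85, "retention": 97, "grad_rate_4yr": 84,
          "net_price_score": 90, "aid_score": 88, "low_debt_score": 92}
print(round(value_index(sample), 1))
```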

For the 16th straight year, UNC Chapel Hill leads as the best public value for both in-state and out-of-state (OOS) applicants.

The Southeast and Mid-Atlantic account for 10 of the top 25 best public value schools. West Coast universities in the UC system, along with the University of Washington, account for another half dozen in the top 25.

In the middle, so to speak, are traditionally strong publics including Michigan, UW Madison, Illinois, UT Austin, Minnesota, and Ohio State.

Acceptance rates vary widely among the top value schools, from a low of 15 and 17 percent at UC Berkeley and UCLA respectively, to a high of 66 percent at Illinois.

Other publics with relatively low acceptance rates include Michigan (26 percent); Cal Poly (31 percent); Georgia Tech (32 percent); UC Santa Barbara (33 percent); UC San Diego (34 percent); and UC Irvine and UT Austin (39 percent each).

Below are the top 25 in-state public values, with the OOS ranking and Acceptance Rate listed as well.

University In-State Rank OOS Rank Accept Rate (%)
UNC Chapel Hill 1 1 30
Virginia 2 2 30
UC Berkeley 3 7 15
William and Mary 4 6 34
Michigan 5 13 26
UCLA 6 14 17
Florida 7 3 48
Maryland 8 10 45
Georgia Tech 9 15 32
Georgia 10 11 53
UW Madison 11 18 49
Washington 12 24 53
UT Austin 13 26 39
UC Santa Barbara 14 28 33
Binghamton 15 8 42
Illinois 16 20 66
UC San Diego 17 31 34
NC State 18 9 50
New College Florida 19 21 61
Minnesota 20 4 45
Cal Poly 21 17 31
Ohio State 22 19 49
UC Irvine 23 44 39
Clemson 24 29 51
Miami Ohio 25 33 65

Top Honors Programs, Honors Components Only

So, what do we mean by “honors components only”?

In our latest book of honors program ratings, we listed the honors programs and colleges that received an overall five “mortarboard” rating. One component of the rating model used to determine the leading programs is prestigious scholarships–the number of Rhodes, Marshall, Truman, Goldwater, etc., awards earned by students from each university as a whole.

In most cases, though not all, the honors programs at these universities produce the majority of these award winners. So while the prestigious scholarship component is worth including, we do not want it to override the 12 other rating components. Those 12 components are “honors only” because they do not depend on awards earned by non-honors students across the university as a whole.

Therefore, we decided to do a separate rating, one that is not included in the new book, INSIDE HONORS. The new rating uses only the 12 components listed below. Farther down, you can see whether the prestigious scholarship component had a major impact on the overall ratings of top programs.

Those 12 components are:

  • Curriculum Requirements
  • Number of Honors Classes
  • Number of Honors Classes in 15 Key Disciplines
  • Extent of Honors Enrollment
  • Average Class Size, Honors-only Sections
  • Overall Average Class Size, All Sections
  • Honors Graduation Rate-Raw
  • Honors Graduation Rate-Adjusted for Test Scores
  • Student to Staff Ratio
  • Type and Extent of Priority Registration
  • Honors Residence Halls, Amenities
  • Honors Residence Halls, Availability

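The book does not publish its rating formula, but as a rough sketch of how an “honors components only” rating could work, the code below averages 12 normalized component scores and buckets the result into a 1-5 mortarboard rating. The thresholds and sample scores are invented for illustration.

```python
# Hypothetical sketch: average 12 normalized component scores (each 0-1) and
# bucket the result into a 1-5 "mortarboard" rating. The bucket thresholds are
# invented for illustration; the book's actual rating model is not published.

def mortarboards(component_scores):
    """Return a 1-5 rating from 12 honors-only component scores in [0, 1]."""
    if len(component_scores) != 12:
        raise ValueError("expected 12 honors-only component scores")
    avg = sum(component_scores) / 12
    # Each threshold the average clears bumps the rating up by one.
    thresholds = [0.2, 0.4, 0.6, 0.8]
    return 1 + sum(avg > t for t in thresholds)

print(mortarboards([0.9] * 12))  # strong program -> 5
print(mortarboards([0.5] * 12))  # middling program -> 3
```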
Below is a comparison of the honors programs that received a five mortarboard OVERALL RATING and those that received the same rating for HONORS COMPONENTS ONLY, both listed ALPHABETICALLY.

OVERALL FIVE MORTARBOARDS:

Arizona St
Clemson
CUNY Macaulay
Georgia
Houston
Kansas
New Jersey Inst Tech
Oregon
Penn St
South Carolina
UT Austin

HONORS COMPONENTS ONLY, FIVE MORTARBOARDS:

Clemson
CUNY Macaulay
Georgia
Houston
Kansas
New Jersey Inst Tech
Oregon
Penn St
South Carolina
Temple
UT Austin

Notably, the two lists are nearly identical: Arizona State appears only on the OVERALL list, while Temple appears only on the HONORS COMPONENTS list.

We must add that Temple barely missed a five mortarboard overall rating, while ASU was similarly close to making the honors components list.

New U.S. College Rankings: Wall St Journal Partners with Times Higher Ed

Whether we need it or not, there is a new ranking on the scene, the Wall Street Journal/Times Higher Education College Rankings 2017.

There are some interesting features, and the rankings are certainly worth a look.

The rankings combine national universities and liberal arts colleges into one group, and in this way resemble the Forbes rankings. Also like Forbes, they count graduates’ salaries as a metric, worth 12% of the total in the WSJ/THE rankings.

Farther down, we will list the top 100 colleges in the rankings. Only 20 of the top 100 schools are public; 31 are liberal arts colleges; and the remaining 49 are elite private universities. This is not much of a surprise, given that financial resources are a major ranking category.

Before listing the top 100, we will list another group of schools that have the best combined scores in what we consider to be the two most important umbrella categories in the rankings, accounting for 60% of the total: “Engagement” and “Output.”

Engagement (20% of total, as broken out below):

A. Student engagement: 7%. This metric is generated from the average scores per College from four questions on the student survey:

  1. To what extent does the teaching at your university or college support CRITICAL THINKING?
  2. To what extent did the classes you took in your college or university so far CHALLENGE YOU?
  3. To what extent does the teaching at your university or college support REFLECTION UPON, OR MAKING CONNECTIONS AMONG, things you have learned?
  4. To what extent does the teaching at your university or college support APPLYING YOUR LEARNING to the real world?

B. Student recommendation: 6%. This metric is generated from the average score per College from the following question on the student survey:

  1. If a friend or family member were considering going to university, based on your experience, how likely or unlikely are you to RECOMMEND your college or university to them?

C. Interactions with teachers and faculty: 4%. This metric is generated from the average scores per College from two questions on the student survey:

  1. To what extent do you have the opportunity to INTERACT WITH THE FACULTY and teachers at your college or university as part of your learning experience?
  2. To what extent does your college or university provide opportunities for COLLABORATIVE LEARNING?

D. Number of accredited programs (by CIP code): 3%. This metric is the IPEDS-standardized number of Bachelor’s degree programs offered.

Output (40% of the total, as broken out below):

A. Graduation rate: 11%. This metric is the graduation rate within 150% of normal time, as of 31 August 2014, for the cohort of full-time, first-time degree/certificate-seeking undergraduates (Bachelor’s or equivalent sub-cohort).

B. Graduate salary: 12%. This metric is the median earnings of students who are working and not enrolled 10 years after entry.

C. Loan default/repayment rates: 7%. This metric is the 3-year repayment rate from College Scorecard data. The value-added component is the difference between actual outcomes and outcomes predicted from underlying student and College characteristics.

D. Reputation: 10%. This metric is calculated from the reputation survey as the number of US teaching votes plus the number of US-only teaching votes from the country section of the survey.

The two remaining umbrella categories measure Financial Resources, including the amount spent per student; and the Environment, including the diversity of enrolled students (or faculty) across various ethnic groups. You can find a summary of the methodology here.

Here are the 23 colleges that scored at least 17.0 (out of 20) in Engagement and at least 30.0 (out of 40.0) in Output, listed in order of their overall place in the WSJ/TimesHigherEd rankings:

Stanford–Ranking 1; Engagement 17.4; Output 39.4

Penn–Ranking 4; Engagement 17.6; Output 39.0

Duke–Ranking 7; Engagement 17.2; Output 39.3

Cornell–Ranking 9; Engagement 17.3; Output 38.2

WUSTL–Ranking 11; Engagement 17.5; Output 38.6

Northwestern–Ranking 13; Engagement 17.1; Output 37.8

Carnegie Mellon–Ranking 19; Engagement 17.2; Output 37.0

Brown–Ranking 20; Engagement 17.5; Output 35.7

Vanderbilt–Ranking 21; Engagement 17.2; Output 38.8

Michigan–Ranking 24; Engagement 17.4; Output 37.2

Notre Dame–Ranking 25; Engagement 17.4; Output 37.0

Swarthmore–Ranking 34; Engagement 17.7; Output 31.0

Smith–Ranking 35; Engagement 17.1; Output 31.3

Univ of Miami–Ranking 37; Engagement 17.5; Output 30.8

Purdue–Ranking 37; Engagement 17.2; Output 34.1

UC Davis–Ranking 43; Engagement 17.1; Output 33.8

Illinois–Ranking 48; Engagement 17.1; Output 35.6

UT Austin–Ranking 51; Engagement 17.3; Output 33.3

Florida–Ranking 56; Engagement 17.1; Output 35.6

Pitt–Ranking 59; Engagement 17.0; Output 32.0

Michigan State–Ranking 63; Engagement 17.7; Output 32.9

Wisconsin–Ranking 67; Engagement 17.2; Output 33.5

Texas A&M–Ranking 81; Engagement 17.6; Output 31.7
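The selection above–at least 17.0 in Engagement and at least 30.0 in Output, ordered by overall rank–can be sketched as a simple filter and sort. MIT’s Engagement score below is an assumed value, included only to show a school being filtered out; the other rows come from the list above.

```python
# Sketch of the selection above: keep colleges with Engagement >= 17.0 (of 20)
# and Output >= 30.0 (of 40), then list them by overall WSJ/THE rank.
# (name, overall rank, Engagement, Output); MIT's Engagement is an assumption.

schools = [
    ("Stanford", 1, 17.4, 39.4),
    ("MIT", 2, 16.8, 39.0),  # Engagement below the 17.0 cutoff: filtered out
    ("Penn", 4, 17.6, 39.0),
    ("Swarthmore", 34, 17.7, 31.0),
    ("Texas A&M", 81, 17.6, 31.7),
]

qualifiers = sorted(
    (s for s in schools if s[2] >= 17.0 and s[3] >= 30.0),
    key=lambda s: s[1],  # order by overall ranking
)
for name, rank, engagement, output in qualifiers:
    print(f"{rank:>3}  {name}  Engagement {engagement}  Output {output}")
```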

Below are the top 100 colleges in the new rankings:

1. Stanford
2. MIT
3. Columbia
4. Penn
5. Yale
6. Harvard
7. Duke
8. Princeton
9. Cornell
10. Caltech
11. Johns Hopkins
11. WUSTL
13. Northwestern
13. Chicago
15. USC
16. Dartmouth
17. Emory
18. Rice
19. Carnegie Mellon
20. Brown
21. Vanderbilt
22. Williams
23. Amherst
24. Michigan
25. Notre Dame
26. UCLA
27. Tufts
28. Pomona
29. Georgetown
30. North Carolina
30. Wellesley
32. Case Western
33. NYU
34. Swarthmore
35. Smith
36. Middlebury
37. UC Berkeley
37. Carleton
37. Haverford
37. Univ of Miami
37. Purdue
42. Boston University
43. UC Davis
44. Bowdoin
45. Wesleyan
46. Claremont McKenna
47. Bryn Mawr
48. Illinois
49. UC San Diego
50. Lehigh
51. Georgia Tech
51. UT Austin
53. Bucknell
54. Colgate
54. Wake Forest
56. Virginia
56. Florida
58. Rochester
59. Pitt
60. Hamilton
61. Washington
62. Oberlin
63. Boston College
63. Michigan State
65. Trinity College (Conn.)
66. Colby
67. George Washington
67. Macalester
67. Wisconsin
70. WPI
71. Ohio State
72. Northeastern
73. Lafayette
73. Trinity (TX)
75. Tulane
75. Vassar
77. Davidson
78. Grinnell
78. RPI
80. Barnard
80. Texas A&M
82. Drexel
83. Denison
84. Occidental
84. Richmond
86. SMU
87. Howard
88. Holy Cross
89. Brandeis
90. Denver
91. De Pauw
92. Rose-Hulman
93. William and Mary
94. Kenyon
95. Bentley
96. Connecticut College
96. Penn State
96. Scripps College
99. Stevens Inst Tech
100. Maryland

Average U.S. News Rankings for 123 Universities: 2013-2020

Editor’s note: This post has now been updated, effective September 9, 2019, to include new U.S. News rankings for 2020.  Listed below are the yearly rankings and overall average rankings of 123 national universities that were included in the first tier of the U.S. News Best Colleges from 2013 through 2020. There are 61 public and 62 private universities. The list below not only shows the average rankings over this eight-year period but also lists the number of places lost or gained by each university.

U.S. News has changed its methodology, and there are some significant changes, especially after the top 30-35 places in the rankings. There were major gains for Florida, Florida State, Georgia, Georgia Tech, and most UC campuses.

Beginning in the 2019 edition, U.S. News “factored a school’s success at promoting social mobility by graduating students who received federal Pell Grants (those typically coming from households whose family incomes are less than $50,000 annually, though most Pell Grant money goes to students with a total family income below $20,000).”

This has shaken up the rankings quite a bit, and the trend will continue. Previously, a school’s wealth drove the rankings, without regard to the number of low-income students enrolled. Now school wealth can have a different impact by enabling institutions with more resources to “afford” the enrollment of more students who cannot pay full tuition. Some universities that lack big endowments formerly raised their rankings by enrolling students with higher test scores, even if merit aid was necessary. Now that model might not be as effective, since many of those students did not receive Pell grants.

While we appreciate the massive amount of data that the U.S. News rankings provide on class sizes, grad rates, retention rates, and even selectivity, on the whole the rankings fail to evaluate efficiency (the number of students who receive a high-quality education at a relatively low cost) and should not use selectivity and wealth as metrics.

Here are the historical rankings, each school's average ranking across the eight years, and its rise or decline from 2013 through 2020. The universities are listed in order of their average ranking.
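The two derived columns in the table that follows can be reproduced in a few lines: the average is the mean of the eight yearly ranks, and the change is the 2013 rank minus the 2020 rank (positive means the school climbed). Here is a sketch using Harvard's row:

```python
# Sketch of the two derived columns: the average of the eight yearly ranks,
# and "Chg" as places gained or lost from 2013 to 2020 (2013 rank minus
# 2020 rank, so a positive number is an improvement).

def summarize(yearly_ranks):
    """Return (average rank, places gained) for a list of yearly ranks."""
    avg = sum(yearly_ranks) / len(yearly_ranks)
    chg = yearly_ranks[0] - yearly_ranks[-1]
    return avg, chg

harvard = [1, 2, 2, 2, 2, 2, 2, 2]  # U.S. News ranks, 2013-2020
avg, chg = summarize(harvard)
print(avg, chg)  # 1.875 -1
```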

U.S. News rank by year (2013-2020), average rank, and change from 2013 to 2020:

University 2013 2014 2015 2016 2017 2018 2019 2020 Avg Chg
Princeton 1 1 1 1 1 1 1 1 1 0
Harvard 1 2 2 2 2 2 2 2 1.875 -1
Yale 3 3 3 3 3 3 3 3 3 0
Columbia 4 4 4 4 5 5 3 3 4 1
Chicago 4 5 4 4 3 3 3 6 4 -2
Stanford 6 5 4 4 5 5 7 6 5.25 0
MIT 6 7 7 7 7 5 3 3 5.625 3
Penn 8 7 8 9 8 8 8 6 7.75 2
Duke 8 7 8 8 8 9 8 10 8.25 -2
Caltech 10 10 10 10 12 10 12 12 10.75 -2
Johns Hopkins 13 12 12 10 10 11 10 10 11 3
Northwestern 12 12 13 12 12 11 10 9 11.375 3
Dartmouth 10 10 11 12 11 11 12 12 11.125 -2
Brown 15 14 16 14 14 14 14 14 14.375 1
Cornell 15 16 15 15 15 14 16 17 15.375 -2
Vanderbilt 17 17 16 15 15 14 14 17 15.625 0
Washington Univ 14 14 14 15 19 18 19 19 16.5 -5
Rice 17 18 19 18 15 14 16 17 16.75 0
Notre Dame 17 18 16 18 15 18 18 15 16.875 2
Emory 20 20 21 21 20 21 22 21 20.75 -1
UC Berkeley 21 20 20 20 20 21 22 22 20.75 -1
Georgetown 21 20 21 21 20 20 22 24 21.125 -3
UCLA 24 23 23 23 24 21 19 20 22.125 4
USC 24 23 25 23 23 21 22 22 22.875 2
Carnegie Mellon 23 23 25 23 24 25 25 25 24.125 -2
Virginia 24 23 23 26 24 25 25 28 24.75 -4
Wake Forest 27 23 27 27 27 27 27 27 26.5 0
Tufts 28 28 27 27 27 29 27 29 27.75 -1
Michigan 29 28 29 29 27 28 27 25 27.75 4
North Carolina 30 30 30 30 30 30 30 29 29.875 1
NYU 32 32 32 32 36 30 30 29 31.625 3
Rochester 33 32 33 33 32 34 33 29 32.375 4
Boston College 31 31 31 30 31 32 38 37 32.625 -6
William & Mary 33 32 33 34 32 32 38 40 34.25 -7
Georgia Tech 36 36 36 36 34 34 35 29 34.5 7
Brandeis 33 32 35 34 34 34 35 40 34.625 -7
UC Santa Barbara 41 41 40 37 37 37 30 34 37.125 7
Case Western 37 37 38 37 37 37 42 40 38.125 -3
UC San Diego 38 39 37 39 44 42 41 37 39.625 1
UC Davis 38 39 38 41 44 46 38 39 40.375 -1
UC Irvine 44 49 42 39 39 42 33 36 40.5 8
Boston Univ 51 41 42 41 39 37 42 40 41.625 11
RPI 41 41 42 41 39 42 49 50 43.125 -9
UW Madison 41 41 47 41 44 46 49 46 44.375 -5
Northeastern 56 49 42 47 39 40 44 40 44.625 16
Lehigh 38 41 40 47 44 46 53 50 44.875 -12
Florida 54 49 48 47 50 42 35 34 44.875 20
Illinois 46 41 42 41 44 52 46 48 45 -2
Tulane 51 52 54 41 39 40 44 40 45.125 11
U of Miami 44 47 48 51 44 46 53 57 48.75 -13
Penn State 46 37 48 47 50 52 59 57 49.5 -11
Pepperdine 54 57 54 52 50 46 46 50 51.125 4
UT Austin 46 52 53 52 56 56 49 48 51.5 -2
Washington 46 52 48 52 54 56 59 62 53.625 -16
Ohio St 56 52 54 52 54 54 56 54 54 2
Georgia 63 60 62 61 56 54 46 50 56.5 13
George Washington 51 52 54 57 56 56 63 70 57.375 -19
Syracuse 58 62 58 61 60 61 53 54 58.375 4
Connecticut 63 57 58 57 60 56 63 64 59.75 -1
SMU 58 60 58 61 56 61 63 64 60.125 -6
Purdue 65 68 62 61 60 56 56 57 60.625 8
Maryland 58 62 62 57 60 61 63 64 60.875 -6
WPI 65 62 68 57 60 61 59 64 62 1
Fordham 58 57 58 66 60 61 70 74 63 -16
Pitt 58 62 62 66 68 68 70 57 63.875 1
Clemson 68 62 62 61 66 67 66 70 65.25 -2
Brigham Young 68 62 62 66 68 61 63 77 65.875 -9
Yeshiva 46 47 48 52 66 94 80 97 66.25 -51
Rutgers 68 69 70 72 70 69 56 62 67 6
Texas A&M 65 69 68 70 74 69 66 70 68.875 -5
Minnesota 68 69 71 69 71 69 76 70 70.375 -2
Virginia Tech 72 69 71 70 74 69 76 74 71.875 -2
American 77 75 71 72 71 69 78 77 73.75 0
Stevens Inst Tech 75 82 76 75 71 69 70 74 74 1
Baylor 77 75 71 72 71 75 78 79 74.75 -2
Clark 83 75 76 75 74 81 66 91 77.625 -8
UMass Amherst 97 91 76 75 74 75 70 64 77.75 33
Iowa 72 73 71 82 82 78 89 84 78.875 -12
Michigan St 72 73 85 75 82 81 85 84 79.625 -12
Delaware 75 75 76 75 79 81 89 91 80.125 -16
UC Santa Cruz 77 86 85 82 79 81 70 84 80.5 -7
Col School of Mines 77 91 88 75 82 75 80 84 81.5 -7
Indiana 83 75 76 75 86 90 89 79 81.625 4
Miami Oh 89 75 76 82 79 78 89 91 82.375 -2
Marquette 83 75 76 86 86 90 89 84 83.625 -1
TCU 92 82 76 82 82 78 80 97 83.625 -5
Florida St 97 91 95 96 92 81 70 57 84.875 40
Binghamton 89 97 88 89 86 87 80 79 86.875 10
Denver 83 91 88 86 86 87 96 97 89.25 -14
Stony Brook 92 82 88 89 96 97 80 91 89.375 1
San Diego 92 91 95 89 86 90 85 91 89.875 1
NC State 106 101 95 89 92 81 80 84 91 22
Colorado 97 86 88 89 92 90 96 104 92.75 -7
Tulsa 83 86 88 86 86 87 106 121 92.875 -38
Vermont 92 82 85 89 92 97 96 121 94.25 -29
Drexel 83 97 95 99 96 94 102 97 95.375 -14
St. Louis 92 101 99 96 96 94 106 97 97.625 -5
Univ at Buffalo 106 109 103 99 99 97 89 79 97.625 27
Loyola Chicago 106 101 106 99 99 103 89 104 100.875 2
Auburn 89 91 103 102 99 103 115 107 101.125 -18
Tennessee 101 101 106 103 103 103 115 104 104.5 -3
Alabama 77 86 88 96 103 110 129 153 105.25 -76
Oregon 115 109 106 103 103 103 102 104 105.625 11
New Hampshire 106 97 99 103 107 103 106 125 105.75 -19
Illinois Tech 113 109 116 108 103 103 96 117 108.125 -4
UC Riverside 101 112 113 121 118 124 85 91 108.125 10
South Carolina 115 112 113 108 107 103 106 104 108.5 11
Oklahoma 101 101 106 108 111 97 124 132 110 -31
Iowa St 101 101 106 108 111 115 119 121 110.25 -20
Pacific 106 112 116 108 111 110 106 125 111.75 -19
Missouri 97 97 99 103 111 120 129 139 111.875 -42
Nebraska 101 101 99 103 111 124 129 139 113.375 -38
Kansas 106 101 106 115 118 115 129 130 115 -24
Dayton 115 112 103 108 111 124 127 132 116.5 -17
Clarkson 115 121 121 115 129 124 102 117 118 -2
Arizona 120 119 121 121 124 124 106 117 119 3
Howard 120 142 145 135 124 110 89 104 121.125 16
Catholic 120 121 116 123 124 120 129 139 124 -19
Michigan Tech 120 117 116 123 118 124 136 147 125.125 -27
Arizona St 139 142 129 129 129 115 115 117 126.875 22
Kentucky 125 119 129 129 133 133 147 132 130.875 -7
Colorado St 134 121 121 127 129 124 140 166 132.75 -32
Arkansas 134 128 135 129 135 133 147 153 136.75 -19

Update No. 3: The 2016 Edition Is Coming Soon, with Important Changes

By John Willingham, Editor

The 2016 edition will have a new name: Inside Honors: Ratings and Reviews of 60 Public University Honors Programs. It is in the final proofing stage now. The goal is to publish in late September. Each edition includes a somewhat different group of honors colleges and programs, so there will be changes, even among the 40 or so programs that are reviewed in each edition.

As I have noted in previous updates, the book will take an almost microscopic view of 50 of these programs and also provide more general summary reviews of 10 additional programs. I can say now that there will be a few more programs that will receive the highest overall rating of five “mortarboards” than there were in 2014. (The final list of programs we are rating and reviewing for 2016 is below.)

The rating system makes it possible for any honors college or program, whether a part of a public “elite” or not, to earn the highest rating. Similarly, the ratings allow all types of honors programs to earn the highest rating. Those receiving five mortarboards will include core-type programs with fewer than 1,000 students and large honors programs with thousands of students. And absent any intentional preference for geographical diversity, the list does in fact include programs from north, south, east, and west.

By microscopic, I mean that the rating categories have increased from 9 to 14, and so has the depth of statistical analysis. The categories are:

  • Overall honors rating
  • Curriculum requirements
  • Number of honors classes offered
  • Number of honors classes in “key” disciplines
  • Extent of honors participation by all members in good standing
  • Honors-only class sizes
  • Overall class size averages, including mixed and contract sections
  • Honors grad rates, adjusted for admissions test scores
  • Ratio of students to honors staff
  • Type of priority registration
  • Honors residence halls, amenities
  • Honors residence halls, availability
  • Record of achieving prestigious scholarships (Rhodes, Marshall, Goldwater, etc.)

Sometimes readers (and critics) ask: Why so few programs? Doesn’t U.S. News report on hundreds of colleges?

The answer is: honors colleges and programs are complicated. Each of the 50 rated reviews in the new edition is 2,500-3,000 words, or 7-8 pages, in length. That’s almost 400 pages, not including introductory sections. The rest of the answer is: we are not U.S. News. With myself, one assistant editor, a contract statistician, and an outsourced production firm, our ability to add programs is very limited.

The 2016 profiles are full of numbers, ratios, and averages, more than in 2014 certainly–and too many, I believe, for readers who would prefer more narrative summary and description. So, yes, it is a wonkish book, even to a greater extent than this website tends to be. But then, they are honors programs after all.

Full ratings:

Alabama Honors
Arizona Honors
Arizona State Honors
Arkansas Honors
Auburn Honors
Central Florida Honors
Clemson Honors
Colorado State Honors
Connecticut Honors
CUNY Macaulay Honors
Delaware Honors
Georgia Honors
Georgia State Honors
Houston Honors
Idaho Honors
Illinois Honors
Indiana Honors
Iowa Honors
Kansas Honors
Kentucky Honors
LSU Honors
Maryland Honors
Massachusetts Honors
Minnesota Honors
Mississippi Honors
Missouri Honors
Montana Honors
New Jersey Inst of Tech
New Mexico Honors
North Carolina Honors
Oklahoma Honors
Oklahoma State Honors
Oregon Honors
Oregon State Honors
Penn State Honors
Purdue Honors
South Carolina Honors
South Dakota Honors
Temple Honors
Tennessee Honors
Texas A&M Honors
Texas Tech Honors
UC Irvine Honors
University of Utah Honors
UT Austin Honors
Vermont Honors
Virginia Commonwealth Honors
Virginia Tech Honors
Washington Honors
Washington State Honors

Summary Reviews:

Cincinnati Honors
Florida State Honors
Michigan Honors
New Hampshire Honors
Ohio Univ Honors
Pitt Honors
Rutgers Honors
Virginia Honors
Western Michigan Honors
Wisconsin Honors

University of Texas Chancellor Opposes Top 10 Percent Admission Rule

As a former Navy SEAL and admiral in command of all U.S. Special Operations forces, UT System Chancellor Bill McRaven spent three decades serving and leading the most elite military forces in the world. He has made it clear that he now wants the state flagship to join the best of the best among the nation’s public universities.

But after seeing the system flagship turn away thousands of the state’s elite students because they did not make the top 10 percent (actually 7 percent at UT Austin this year) in the graduating classes of the state’s most competitive high schools, the chancellor sees the automatic admission rule as a major obstacle to keeping the brightest students in Texas–and at UT Austin.

“Candidly, right now what is holding us back is the 10 percent rule,” McRaven told state higher ed leaders recently.

The motives of the Top 10 proponents are certainly worthy–to increase the enrollment of high-achieving minority students at UT Austin. But what makes the rule (sort of) work is that it is predicated on the fact that many of the state’s high schools remain almost entirely segregated. Sometimes this is because an entire region is heavily Latino (the Rio Grande Valley); but elsewhere the segregation in urban centers is based on race and income.

Many of these high schools are among the least competitive in the state. Graduating in the top 7 percent of a high school that offers no AP or honors sections and that has low mean test scores is far different from reaching the top 7 percent of a graduating class of 800 students that has 70 National Merit Scholars.

What can happen to suburban students at very competitive schools is that an unweighted high school GPA of 3.9 (high school rank top 11 percent) and an SAT score of 1440 might not make the cut at UT Austin. Three-fourths of the school’s admits are from the top 7 percent pool; the other 25 percent of admits face a pool that is as competitive as many of the nation’s most selective private colleges.

And, McRaven would say, too many of these students are going out of state, where it costs them more and where they might remain rather than return to Texas. Moreover, the chancellor believes the rule is part of the reason that UT Austin, despite having a stellar faculty, is not rated as highly as it should be among the nation’s public universities.

“Candidly, I think we need to take a hard look at some of the ways that we address higher education, particularly at our flagship program. Your flagship, your number one university in the state of Texas is ranked 52nd on the U.S. News & World Report. To me that’s unacceptable. A lot of things drive that. The 10 percent rule drives that,” he told higher ed leaders.

While he did not specify exactly how the rule contributes to lower rankings, the graduation rate metric used by U.S. News might be lower for UT Austin in part because of the relatively lower standards in many poor and mostly segregated high schools. (It is possible that the chancellor also sees the large size of UT Austin as another issue.)

If the chancellor can find a way to maintain or improve minority enrollment and do away with the Top 10 rule, he might prevail. If the U.S. Supreme Court does not scrap the university’s current holistic admissions policy for students outside the top 7 percent, he might have a better chance; otherwise, his task will be as difficult as many he faced as a military leader.

“[The Top 10 Rule] is a very very sensitive topic,” State Rep. Robert Alonzo told McRaven. “It is a topic that we have discussed at length from all different aspects, and I would hope that we have put it to rest for a while.”

McRaven was undeterred. “I am a new chancellor, so I am going to take that opportunity to re-open that look again,” he said. “Because my charge is to make us the very best, and I think there are some obstacles to doing that.”

Alonzo replied: “Well, I accept the challenge, sir.”

Stay tuned, for this could be a big battle indeed.

Poets & Quants Composite MBA Rankings 2015 List 25 Public Programs in Top 50

The annual composite MBA ranking compiled by John A. Byrne at Poets & Quants combines rankings from the “five most influential rankings and weighs each of them by the soundness of their methodologies” in order to yield “a more credible list of the best MBA programs.”

We like Poets & Quants and Byrne’s rankings and try to write about them each year. The rankings he combines into the composite list are those from U.S. News, Forbes, Bloomberg, the Financial Times, and the Economist.

Here are the public MBA programs listed in the top 50 for 2015, and their composite rank:

8–UC Berkeley Haas

12–Virginia Darden

13–Michigan Ross

14–UCLA Anderson

17–North Carolina Kenan-Flagler

18–UT Austin McCombs

21–Indiana Kelley

22–Washington Foster

25–Michigan State Broad

29–Minnesota Carlson

31–Ohio State Fisher

32–Wisconsin

33–Penn State Smeal

34–Georgia Tech

35–Maryland Smith

36–Arizona State Carey

37–Iowa Tippie

40–Pitt Katz

41–Texas A&M Mays

44–Purdue Krannert

45–Illinois

46–Florida Hough

47–UC Irvine Merage

48–Georgia Terry

50–Temple Fox

Do Elite Colleges Really Offer Better Courses? Probably Yes, in Some Ways

Is it actually worth it, in terms of quality classroom learning, to land a place at an elite college or university? This is a question that many families with highly-talented students ask themselves. If their answer is yes, the result is likely to be a concerted, frenzied effort to mold the students in a way that gives them at least a modest chance of admission to such schools. (Of course, for better or worse, the question is often framed as “Is it worth it, in terms of career success, to land a place…”).

Regarding the differences in the quality of classes among all levels of institutions, new research provides some insights. The researchers lean toward minimizing the relationship between academic prestige and quality of instruction–but it appears that some of their own research suggests just the opposite.

In an article titled Are Elite College Courses Better?, Doug Lederman, editor and co-founder of Inside Higher Ed, provides an excellent, mostly neutral summary of the recent research that suggests course quality in a relatively broad range of institutions does not vary as much as the prestige of a given school might suggest.

“Researchers at Teachers College of Columbia University and at Yeshiva University… believe they are developing a legitimate way to compare the educational quality of courses across institutions,” Lederman writes, “and their initial analysis, they say, ‘raises questions about the value of higher-prestige institutions in terms of their teaching quality.'”

The researchers suggest that the drive to enhance prestige based on rankings and selectivity has led to “signaling”–branding and perceptions–that is increasingly divorced from the actual quality of classroom instruction. The laudable aim of the researchers is to turn the conversation away from college rankings and the metrics that drive them, and toward measurements of effective, challenging instruction.

Trained faculty observers visited nine colleges and 600 classes. Three of the nine had high prestige; two had medium prestige; and four had low prestige. The schools were both public and private, with differing research and teaching emphases. We should note that there was no list of which schools were in each category, so we do not know exactly how the researchers defined “elite.” It appears likely, however, that many leading public research universities would be considered elite.

“Teaching quality was defined as instruction that displayed the instructor’s subject matter knowledge, drew out students’ prior knowledge and prodded students to wrestle with new ideas, while academic rigor was judged on the ‘cognitive complexity’ and the ‘level of standards and expectations’ of the course work,” Lederman writes.

“But they found that on only one of the five measures, cognitive complexity of the course work, did the elite colleges in the study outperform the non-elite institutions.”

First, we note that highly-qualified honors students at almost all colleges, including many less prestigious public universities, are far more likely to encounter more “cognitive complexity” in their honors courses. Whether this results from having more depth or breadth in actual assignments, from taking harder courses early on, or from engaging in more challenging interactions with similarly smart students and the best faculty, the learning experience in honors embraces complexity.

We also have to agree with one of the longest and most thoughtful comments posted on Lederman’s article, by one “catorenasci”:

“Well, is [more cognitive complexity] a surprise to anyone? After all…on average the students at elite colleges and universities (private or public) have demonstrated higher cognitive ability than the students at less prestigious colleges and universities. Which means that the faculty can teach at a level of greater cognitive complexity without losing (many) students.”

The full comment from “catorenasci” also seems to be on the mark when it comes to improved instruction in all other measured areas on the part of colleges with less prestige, regardless of honors affiliation.

“As for the level of ‘teaching quality’ based on faculty knowledge, given the job market today, it should hardly be surprising that it has equaled out since there are many top quality candidates for even less prestigious positions and overall, I would suspect that the ‘quality’ of the PhD’s of faculty at less elite schools is much closer to that of elite schools than it was during the ’50s and ’60s when higher education was expanding rapidly and jobs were plentiful.

“The transformational aspect should not be surprising either: assuming faculty are competent and dedicated, with less able students they will work harder to draw out what they know and build on it. And, it will be more likely that students will experience significant growth as the faculty do this.”

We Vote for US News Global Rankings vs Times Higher Ed World Rankings

The annual Times Higher Education World University Rankings have had the strongest presence in the ranking “world” since 2004, but here’s one vote for the U.S. News Best Global Universities rankings being better even though they have been around only two years. Both are useful because they measure the prestige and research impact of hundreds of universities around the world at a time when there is much more international cooperation–and competition–among institutions.

It is rare for us to applaud the U.S. News rankings because there are many serious issues with the annual “Best Colleges” publication. It over-emphasizes the financial resources of colleges and their selectivity, to the detriment of most public universities.

But when it comes to world rankings, U.S. News drops the focus on financial metrics in favor of academic reputation and research metrics, including the use of regional reputation surveys that help to offset the Eurocentric bias of the Times Higher Ed rankings.

For example, the Times Higher Ed rankings list 42 European universities among the top 100 in the world, while U.S. News lists 31. The main reason is probably that the Times rankings do include financial metrics and do not factor in the additional regional reputation data.

Below is a table showing the U.S. public universities ranked among the top 100 in the world by U.S. News alongside the rankings of the same universities by Times Higher Ed. An additional column shows the average ranking of each school when both ranking systems are used. The average ranking of leading U.S. public universities by U.S. News is 44 out of 100; the average Times Higher Ed ranking of the same schools is 82.

University US News Global Times Higher Ed Average
UC Berkeley 3 13 8
UCLA 8 16 12
Washington 11 32 21.5
Michigan 17 21 19
UC San Diego 19 39 29
UC Santa Barbara 24 39 31.5
Wisconsin 26 50 38
North Carolina 27 63 45
Minnesota 29 65 47
UT Austin 30 46 38
Ohio St 34 90 62
UC Davis 39 44 41.5
Maryland 41 117 79
Illinois 43 36 39.5
Colorado 46 127 86.5
Pitt 47 79 63
UC Santa Cruz 48 144 96
Florida 53 120 86.5
Penn State 57 75 66
Rutgers 60 123 91.5
UC Irvine 61 106 83.5
Georgia Tech 64 41 52.5
Arizona 67 163 115
Purdue 72 113 92.5
Michigan St 82 99 90.5
Texas A&M 88 193 140.5
Virginia 94 147 120.5
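The “Average” column above is simply the mean of a school’s two world rankings. A minimal sketch of that calculation in Python, using three schools transcribed from the table:

```python
# Recompute the "Average" column: the mean of a school's U.S. News Global
# and Times Higher Ed world rankings (sample rows from the table above).
rankings = {
    "UC Berkeley": (3, 13),
    "Washington": (11, 32),
    "Virginia": (94, 147),
}

averages = {school: (usn + the) / 2 for school, (usn, the) in rankings.items()}
print(averages)
```

This reproduces 8, 21.5, and 120.5, matching the table’s Average column for those schools.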

Expert on International Higher Ed: “Un-Excellence” Initiatives Aimed at Public Universities Harm U.S. Standing in the World

Editor’s note: This post was updated on October 1, 2016, after the release of the Times Higher Ed Rankings for 2016.

It is likely that Philip G. Altbach, a research professor and the founding director of the Center for International Higher Education at Boston College, has the sharpest eye of anyone in America when it comes to seeing how well U.S. universities compare with rapidly improving institutions throughout the world. What he sees is not good.

U.S. public universities are losing ground to foreign institutions, most notably in Europe and Scandinavia.

(Below the following text are three tables. The first compares the Times Higher Ed world rankings of U.S. universities and those throughout the world for the years 2011 and 2016. The second shows the decline in rankings for U.S. public universities over the same years. The third shows the rise in rankings for U.S. private universities.)

Altbach cites the work of colleague Jamil Salmi, who found that there are at least 36 “excellence initiatives” around the world “that have pumped billions of dollars into the top universities in these countries — with resulting improvements in quality, research productivity, and emerging improvements in the rankings of these universities. Even in cash-strapped Russia, the ‘5-100’ initiative is providing $70 million to each of 15 selected universities to help them improve and compete globally. [Emphasis added.]

“At the same time, American higher education is significantly damaging its top universities through continuous budget cuts by state governments. One might call this an American ‘unExcellence initiative,’ as the world’s leading higher education system systematically damages its top research universities. Current developments are bad enough in a national context, but in a globalized world, policies in one country will inevitably have implications elsewhere. Thus, American disinvestment, coming at the same time as significant investment elsewhere, will magnify the decline of a great higher education system.”

One reason: the bad publicity about funding cuts, along with high-profile political grandstanding, has received far too much attention throughout the world academic community. For example, UT Austin endured years of highly publicized attacks by former Governor Perry during his second term, and UW Madison has been hurt by similar actions on the part of Governor Scott Walker.

Unless state legislatures move toward excellence and restore pre-Great Recession funding levels, the result, Altbach warns, will be “a revolution in global higher education” that will “create space for others at the top of the rankings. It would also be extraordinarily damaging for American higher education and for America’s competitiveness in the world.”

The average ranking for the 23 U.S. publics among the top 100 in the world in 2011 was 39th, while in 2016 it was 49th–and only 15 publics were among the top 100. Meanwhile, leading U.S. private universities have seen both their average reputation rankings and overall rankings rise since 2011.

To illustrate Professor Altbach’s point, we have generated the tables below.

This table compares the Times Higher Ed world rankings of U.S. universities and those throughout the world for the years 2011 and 2016.

Top 100 Overall 2011 (Public) 2016 (Public) Gain/Loss
U.S. 53 (23) 39 (15) -14
U.K./Ireland 16 16 0
Europe/Scand. 12 25 +13
Asia 10 9 -1
Australia 5 6 +1
Canada 4 4 0

The following table shows the decline in rankings for U.S. public universities for the years 2011 and 2016. UC Davis is the only school to rise, while most dropped significantly.

Times Higher Ed Rankings Rep 2011 Ranking 2011 Rep 2016 Ranking 2016
UC Berkeley 4 8 6 13
UCLA 12 11 13 16
Michigan 13 15 19 21
Illinois 21 33 30 36
Wisconsin 25 43 38 50
Washington 26 23 33 32
UC San Diego 30 32 41 39
UT Austin 31 29 46 46
UC Davis 38 54 44 44
Georgia Tech 39 27 49 41
UC Santa Barbara 55 29 65 39
North Carolina 41 30 65 63
Minnesota 43 52 75 65
Purdue 47 106 75 113
Ohio State 55 66 75 90
Pitt 55 64 75 79
Averages 33 39 47 49
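The Averages row can be checked directly against the ranking columns. A quick sketch in Python, using the overall-ranking columns for the 16 publics listed in the table above (the table rounds the column means to whole numbers):

```python
# Overall Times Higher Ed rankings (2011 and 2016) of the 16 U.S. publics
# in the table above; the Averages row is the rounded mean of each column.
rank_2011 = [8, 11, 15, 33, 43, 23, 32, 29, 54, 27, 29, 30, 52, 106, 66, 64]
rank_2016 = [13, 16, 21, 36, 50, 32, 39, 46, 44, 41, 39, 63, 65, 113, 90, 79]

avg_2011 = round(sum(rank_2011) / len(rank_2011))
avg_2016 = round(sum(rank_2016) / len(rank_2016))
print(avg_2011, avg_2016)
```

This yields 39 and 49, matching the Averages row and the ten-place average decline cited in the text.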

This last table shows the rise in rankings for leading U.S. private universities for the years 2011 and 2016.

Times Higher Ed Rankings Rep 2011 Ranking 2011 Rep 2016 Ranking 2016
Harvard 1 1 1 6
MIT 2 3 4 5
Stanford 5 4 5 3
Princeton 7 5 7 7
Yale 9 10 8 11
Caltech 10 2 9 1
Johns Hopkins 14 13 18 12
Chicago 15 12 11 10
Cornell 16 14 20 18
Penn 22 19 23 17
Columbia 23 18 10 15
Carnegie Mellon 28 20 28 22
Duke 36 24 34 20
Northwestern 40 25 47 25
NYU 55 60 20 30
Averages 19 15 16 14