Honors Completion Rates: A Statistical Summary

Editor’s Note: This is the third and final post in our series on honors program completion rates.

In the first post, we wrote about the hybrid structure of honors programs and how that can affect honors completion rates. An honors completion rate is the percentage of honors students who complete all honors course requirements for at least one option by the time they graduate. The second post presented a tentative formula for evaluating honors completion rates.

This post has two parts. The first compares honors completion rates of main option and multiple option honors programs; the second compares completion rates of honors colleges and honors programs.

Main option programs emphasize only one curriculum completion path, usually requiring more than 30 honors credits and often an honors thesis as well. Multiple option programs offer two or more completion paths for first-year students. One option might require 24 honors credits; another might require 15-16 credits. Either of these might also require a thesis.

Many universities are now establishing honors colleges. These usually have a dean and a designated staff of advisors. They typically provide at least enough honors housing space for first-year students. Some began as honors programs and then re-formed into honors colleges. Quite a few honors colleges have significant endowments.

Honors programs do not have a dean, but are administered by a director and staff. Sometimes there are few real differences between honors colleges and programs. In general, however, honors colleges have more staff and offer more access to honors housing.

We received data from 23 honors colleges and eight honors programs, having a combined enrollment of more than 64,000 honors students. The 31 parent universities had an average U.S. News ranking of 126, ranging from the low 50s to higher than 200.

The first summary is below:

PART ONE: SUMMARY STATISTICS
MAIN OPTION PROGRAMS VS. MULTIPLE OPTION PROGRAMS

MEASURE                                  ALL PROGRAMS   MAIN OPTION   MULTI OPTION
No. of programs                          31             15            16
No. of honors students                   64,287         27,688        36,599
Avg program size                         2,073.8        1,845.9       2,287.4
Completion rate (%)                      57.9           67.8          48.6
University grad rate (%)                 67.2           68.7          65.6
Univ grad rate minus completion rate     9.3            0.9           17.0
Honors grad rate (%)                     86.9           88.7          85.2
Honors grad rate minus completion rate   29.0           20.9          36.6
Honors grad rate minus univ grad rate    19.7           20.0          19.6
Freshman retention (%)                   86.7           87.7          85.6
Test scores (adjusted to SAT)            1405.6         1416.9        1395.0
Avg curriculum requirement (credits)     27.0           31.8          22.1
Avg class size                           24.2           24.0          24.4
Thesis option (yes, of total)            27 of 31       11 of 15      16 of 16
Thesis required, all options (yes)       14 of 31       10 of 15      4 of 16
Dorm rooms per fresh/soph student        0.53           0.57          0.48
Honors class seats per honors student    1.29           1.49          1.11
Separate honors application (yes)        23 of 31       12 of 15      11 of 16

The second summary, comparing honors colleges and honors programs, is below:

PART TWO: SUMMARY STATISTICS
HONORS COLLEGES VS. HONORS PROGRAMS

MEASURE                                  HONORS COLLEGES   HONORS PROGRAMS
No. of programs                          23                8
No. of honors students                   52,771            11,516
Avg program size                         2,294.4           1,439.5
Completion rate (%)                      54.8              66.7
University grad rate (%)                 64.7              74.5
Univ grad rate minus completion rate     9.9               7.8
Honors grad rate (%)                     85.5              91.0
Honors grad rate minus completion rate   30.7              24.3
Honors grad rate minus univ grad rate    20.8              16.5
Freshman retention (%)                   85.6              89.5
Test scores (adjusted to SAT)            1394.6            1437.3
Avg curriculum requirement (credits)     26.0              29.75
Avg class size                           25.0              22.0
Main option programs                     10                6
Multiple option programs                 13                2
Thesis option (yes, of total)            20 of 23          6 of 8
Thesis required, all options (yes)       10 of 23          3 of 8
Dorm rooms per fresh/soph student        0.55              0.46
Honors class seats per honors student    1.25              1.44
Separate honors application (yes)        17 of 23          5 of 8


Here is a Formula for Evaluating Honors Completion Rates

Honors completion rates, as we noted in a previous post, are a complicated issue. They represent the percentage of students who enter an honors program and then complete all honors requirements for at least one completion option by the time they graduate.

They are related to university freshman retention rates and university graduation rates, but in order to evaluate them there must be some workable baseline completion rate derived from a significant sample of programs.

Honors deans and directors at 31 public university honors programs contributed the data used to calculate the values in the next paragraph, along with extensive additional data we use in rating honors programs. The 31 programs enrolled more than 64,000 honors students in Fall 2017. At some point we might include completion rates as a metric; if we do, then this formula, or an improved version, might be used.

This tentative formula takes into account (1) the average (mean) honors completion rate for the whole data set (57.88 percent); (2) the mean university-wide freshman retention rate for the whole data set (86.81 percent); (3) the completion rate of each program; (4) the freshman retention rate for the parent university of each program; and (5) the graduation rate of each university.

The formula assumes that a desirable target honors completion rate should at least equal the midway point between the university graduation rate and the adjusted honors completion rate. (See examples below, however, for programs that have honors completion rates that exceed the university graduation rate.) The formula can easily be changed to include lower or higher target levels by increasing or reducing the divisor.

H = the mean honors completion rate for the data set;

F = the mean freshman retention rate for the data set;

P = the program completion rate;

C = the mean completion rate adjusted to each university’s freshman retention rate, i.e., (H/F)*R (here, .67*R);

R = the freshman retention rate of each parent university;

G = the graduation rate of each parent university;

T = the estimated target completion rate after the formula is applied. T = (G + C) /2. This is an estimate of what the minimum completion rate should be, given the university’s freshman retention rate and graduation rate, and the mean completion rate and mean freshman retention rate for this data set. Other data sets would of course have different data, but the formula could still be applied.

The completion rates of ten programs exceeded the graduation rates of their parent universities.

Here is the formula, where P = 61%; R = 92%; G = 83%:

First step = (H/F), or 57.88 / 86.81. The result is .67. This is a constant for this data set.

Second step is to adjust the completion rate in relation to the university freshman retention rate = .67 *R, or .67 *92. The result is 61.64 (C), a bit higher than the actual program completion rate of 61.0 (P), because of the relatively high freshman retention rate.

Third step is to adjust the completion rate C in relation to the university graduation rate in order to calculate the target completion rate. T = (G + C) /2, or (83 + 61.64) /2 = 72.32 (T).

Fourth step is to calculate P – T, which would be 61.00 – 72.32 = –11.32. This step calculates the extent to which the program completion rate varies from the estimated target rate. The program is performing below the estimated target rate. The relatively high university graduation rate is the main reason.

More examples:

Honors program A had a program completion rate (P) of 84%, a freshman retention rate (R) of 88%, and a university graduation rate (G) of 73%. The C rate would be .67*88, or 58.96. The T calculation would be (G + C) /2, or (73 + 58.96) /2 = 65.98 (T). Now calculate P – T (84 – 65.98) = +18.02. This program is performing far above its estimated target rate.

Honors program B had the same program completion rate (P) of 84% but a much higher freshman retention rate (R) of 95%, and a university graduation rate (G) of 81%. The C value would be .67*95, or 63.65, and T would be (G + C) /2, or (81 + 63.65) /2 = 72.325. When we calculate P – T (84 – 72.325), the result is +11.675. This program is performing well above its estimated target rate, but even with the same completion rate as Program A, the higher graduation and freshman retention rates for Program B cause its relative performance rating to be lower than Program A’s. In other words, the expectations were higher for Program B. Both programs are exceptional in that their honors completion rates exceed their university graduation rates.

Honors program D had a program completion rate (P) of 40%, a freshman retention rate (R) of 82%, and a university graduation rate (G) of 53%. C would be .67*82, or 54.94. T would be (G + C) /2, or (53 + 54.94) /2 = 53.97. Calculating P – T, the result is 40 – 53.97, or –13.97. Program D is significantly underperforming based on the formula.
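The four steps above can be collected into a short function. This is only a sketch (the function name is ours, not from the post); it uses the rounded .67 constant for this data set, as the post does, and reproduces the worked examples:

```python
def completion_gap(P, R, G, H=57.88, F=86.81):
    """Return (C, T, P - T) for one honors program.

    P = program completion rate (%)
    R = university freshman retention rate (%)
    G = university graduation rate (%)
    H, F = data-set means for completion and freshman retention
    """
    k = round(H / F, 2)   # step 1: constant for the data set (.67 here)
    C = k * R             # step 2: completion rate adjusted to retention
    T = (G + C) / 2       # step 3: estimated target completion rate
    return C, T, P - T    # step 4: gap between program and target

# The worked example from the post: P = 61, R = 92, G = 83
C, T, gap = completion_gap(61, 92, 83)   # C = 61.64, T = 72.32, gap = -11.32
```

As the post notes, raising or lowering the target level is just a matter of changing the divisor in the T step.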

 

 


U.S. News Rankings for 57 Leading Universities, 1983–2007

Below are the U.S. News rankings from 1983 through 2007 for 57 leading national universities. For additional U.S. News rankings, please see U.S. News Rankings, 2008 through 2015, and Average U.S. News Rankings for 129 National Universities, 2011 to 2018.

Especially notable in the list below are the changes in major public universities.

Included here are institutions that were, at some point, ranked in the top 50 among national universities. Some values are blank because in those years the magazine did not give individual rankings to every institution, instead listing them in large groups described as “quartiles” or “tiers.” The rankings shown for 1983 and 1985 are the ones that U.S. News published in its magazine in those same years. For all subsequent years, the rankings come from U.S. News’s separate annual publication “America’s Best Colleges,” which assigns rankings for the upcoming year.

Here is the list:

 Year 83 85 88 89 90 91 92 93 94 95 96 97 98 99 00 01 02 03 04 05 06 07
Stanford University 1 1 1 6 6 2 3 4 6 5 4 6 5 4 6 6 5 4 5 5 5 4
Harvard University 2 2 2 4 3 1 1 1 1 1 1 3 1 1 2 2 2 2 1 1 1 2
Yale University 3 2 3 1 1 3 2 3 3 3 2 1 3 1 4 2 2 2 3 3 3 3
Princeton University 4 4 4 2 2 4 4 2 2 2 2 2 1 1 4 1 1 1 1 1 1 1
University of California at Berkeley 5 7 5 24 13 13 16 16 19 23 26 27 23 22 20 20 20 20 21 21 20 21
University of Chicago 6 5 8 10 9 11 10 9 9 10 11 12 14 14 13 10 9 12 13 14 15 9
University of Michigan at Ann Arbor 7 8 25 17 21 22 24 23 21 24 24 23 25 25 25 25 25 25 22 25 24
Cornell University 8 11 14 11 9 12 11 10 15 13 14 14 6 11 10 14 14 14 14 13 12
University of Illinois at Urbana-Champaign 8 20 45 50 45 42 34 41 36 38 40 37 42 41
Massachusetts Institute of Technology 10 11 5 7 6 6 5 4 4 5 5 6 4 3 5 5 4 4 5 7 4
Dartmouth College 10 10 6 7 8 8 8 7 8 8 7 7 7 10 11 9 9 9 9 9 9 9
California Institute of Technology 12 21 3 4 5 4 5 5 7 7 9 9 9 1 4 4 4 5 8 7 4
Carnegie Mellon University 13 22 24 19 24 24 23 28 23 25 23 23 22 21 23 22 22 21
University of Wisconsin at Madison 13 23 32 41 38 36 34 35 32 31 32 32 34 34
Case Western Reserve University 35 38 37 34 34 38 38 37 37 35 37 38
Tulane University 38 36 34 36 44 45 46 43 44 43 43 44
University of California at Irvine 48 37 41 36 49 41 41 45 45 43 40 44
Rensselaer Polytechnic Institute 39 48 49 49 48 47 48 46 43 42
University of Washington 50 42 44 45 45 47 45 46 45 42
University of Rochester 25 29 30 31 29 32 33 36 36 35 37 34 34
University of California at San Diego 43 34 33 32 32 31 31 31 32 35 32 38
Georgia Institute of Technology 42 48 41 46 40 35 41 38 37 41 37 38
Yeshiva University 45 48 42 44 45 41 40 40 46 45 44
Pennsylvania State University at University Park 41 45 44 40 44 46 45 48 50 48 47
Worcester Polytechnic Institute 48 55 55 53 64
Rutgers University at New Brunswick 45 60 58 60 60
Texas A&M University at College Station 48 48 67 62 60 60
Pepperdine University 49 48 47 51 52 55 54
Syracuse University 49 44 40 47 55 52 50 52
George Washington University 46 50 51 52 53 52
University of Florida 47 49 48 50 50 47
University of California at Santa Barbara 46 47 47 44 45 48 47 45 45 45 47
University of California at Davis 40 40 41 44 42 41 41 43 43 42 48 47
University of Texas at Austin 25 44 49 48 47 53 46 52 47
New York University 36 35 34 35 34 33 32 35 35 32 37 34
Boston College 37 38 38 36 39 38 38 40 40 37 40 34
Emory University 25 22 21 25 16 17 19 9 16 18 18 18 18 18 20 20 18
Vanderbilt University 24 19 25 20 18 22 20 19 20 20 22 21 21 19 18 18 18
Rice University 14 9 10 16 15 12 14 12 16 16 17 18 14 13 12 15 16 17 17 17
Johns Hopkins University 16 11 14 15 11 15 15 22 10 15 14 14 7 15 16 15 14 14 13 16
Brown University 7 10 13 15 12 17 18 12 11 9 8 9 10 14 15 16 17 17 13 15 15
Northwestern University 17 16 19 23 14 13 13 14 13 9 9 10 14 13 12 10 11 11 12 14
Washington University in St. Louis 23 19 22 24 18 20 18 20 20 17 17 16 17 15 14 12 9 11 11 12
Columbia University 18 8 11 10 9 10 11 9 15 11 9 10 10 10 9 10 11 9 9 9
Duke University 6 7 12 5 7 7 7 7 6 6 4 3 6 7 8 8 4 5 5 5 8
University of Notre Dame 18 23 25 19 18 17 19 18 19 19 19 18 19 18 18 20
Georgetown University 17 25 19 19 17 17 25 21 23 21 20 23 23 22 24 23 25 23 23
Lehigh University 33 32 34 36 34 38 38 40 37 37 32 33
Brandeis University 30 29 28 31 31 31 34 31 32 32 34 31
College of William and Mary 22 34 33 32 33 29 30 30 30 31 31 31 31
Wake Forest University 31 25 28 29 28 28 26 25 28 27 27 30
Tufts University 25 22 23 25 29 29 28 28 27 28 27 27
University of Southern California 44 43 41 41 42 35 34 31 30 30 30 27
University of North Carolina at Chapel Hill 9 11 23 18 20 25 27 25 27 24 27 25 28 28 29 29 27 27
University of California at Los Angeles 21 16 17 23 23 22 28 31 28 25 25 25 26 25 26 25 25 26
University of Virginia 15 20 21 18 21 22 21 17 19 21 21 22 22 20 24 23 21 22 23 24
University of Pennsylvania 19 15 20 13 13 14 16 12 11 13 7 6 7 6 5 4 5 4 4 7

Top 25 Universities for Silicon Valley Hires: 17 Are Public

The website Quartz just published a list of the universities that place the highest number of grads at tech firms in Silicon Valley.

“The most coveted jobs are in Silicon Valley, and most selective US universities are members of the Ivy League. So it stands to reason that tech giants like Apple, Google, Amazon and Facebook would scoop up best and brightest from those bastions of power and privilege.

“Think again. None of the eight Ivy League schools—Harvard, Yale, Princeton, Brown, Columbia, Cornell, Dartmouth and the University of Pennsylvania—cracked the top 10 on a list of the universities sending the most graduates to tech firms, according to an analysis by HiringSolved, an online recruiting company. The company used data from more than 10,000 public profiles for tech workers hired or promoted into new positions in 2016 and the first two months of 2017.”

Editor’s note: The HiringSolved link also lists the 10 specific skills most in demand as of 2017, with changes from 2016. For example, the top four skills for entry-level placement in 2017 are Python, C++, Java, and algorithms. The top job titles for entry-level placement in 2017 are Software Engineering Intern, Software Engineer, Business Development Consultant, and Research Intern.

Now let it be said that the 17 public universities in the top 25 are generally much larger than the private institutions on the list, so the sheer volume of highly-trained tech grads from the publics is much larger.

But the final message from Quartz was this:

“If the list tells us anything, it’s that admission to an elite university isn’t a prerequisite for a career in Silicon Valley, and what you know is more important than where you learn it.” [Emphasis added.]

Here are the top 25 universities for Silicon Valley tech placement, in rank order:

UC Berkeley
Stanford
Carnegie Mellon
USC
UT Austin
Georgia Tech
Illinois
San Jose State
UC San Diego
Arizona State
Michigan
UCLA
NC State
Cal Poly
Cornell
Waterloo (Canada)
Texas A&M
Washington
Purdue
MIT
Santa Clara
Univ of Phoenix*
UC Santa Barbara
UC Davis
Penn State

*Hypothesis: hands-on experience and later degrees?

Money Magazine Best Values 2017: CUNY Baruch, Michigan, UC’s, UVA Lead Publics

The new rankings from Money are out, and public colleges and universities account for 27 of the top 50 best values in 2017. These rankings are likely the best college rankings overall, given their balanced approach.

As Jeffrey J. Selingo writes in the Washington Post, the earnings portion of the rankings is based in part on some very interesting new evidence: the “Chetty data.”

“That refers to Raj Chetty,” Selingo tells us, “a Stanford professor, who has led a team of economists that has received access to millions of anonymous tax records that span generations. The group has published several headline-grabbing studies recently based on the data. In findings published in January, the group tracked students from nearly every college in the country and measured their earnings more than a decade after they left campus, whether they graduated or not.

Money does a better job of ranking colleges based on “outcomes” than Forbes does (see Outcomes farther down). This is especially the case with the multiple earnings analyses.

To see the list of top publics, please skip the methodology discussion immediately below.

 

The 2017 rankings include 27 factors in three categories:

Quality of education (1/3 weighting), which was calculated using:

Six-year graduation rate (30%).

Value-added graduation rate (30%). “This is the difference between a school’s actual graduation rate and its expected rate, based on the economic and academic profile of the student body (measured by the percentage of attendees receiving Pell grants, which are given to low-income students, and the average standardized test scores of incoming freshmen).” [Emphasis added.]

“Peer quality (10%). This is measured by the standardized test scores of entering freshman (5%), and the percentage of accepted students who enroll in that college, known as the “yield” rate (5%).” Note: using the yield rate is an improvement over the U.S. News rankings.

“Instructor quality (10%). This measured by the student-to-faculty ratio.” Note: this is very similar to a U.S. News metric.

“Financial troubles (20%). This is a new factor added in 2017, as financial difficulties can affect the quality of education, and a growing number of schools are facing funding challenges.” Note: although this is not an “outcome” either, it is more meaningful than using data on alumni contributions, etc.

Affordability (1/3 weighting), which was calculated using:

“Net price of a degree (30%). This is the estimated amount a typical freshman starting in 2017 will pay to earn a degree, taking into account the college’s sticker price; how much the school awards in grants and scholarships; and the average time it takes students to graduate from the school, all as reported to the U.S. Department of Education….This takes into account both the estimated average student debt upon graduation (15%) and average amount borrowed through the parent federal PLUS loan programs (5%).

“Student loan repayment and default risk (15%).

“Value-added student loan repayment measures (15%). These are the school’s performance on the student loan repayment and default measures after adjusting for the economic and academic profile of the student body.

Affordability for low-income students (20%). This is based on federally collected data on the net price that students from families earning $0 to $30,000 pay.

Outcomes (1/3 weighting), which was calculated using:

“Graduates’ earnings (12.5%), as reported by alumni to PayScale.com; early career earnings within five years of graduation (7.5%), and mid-career earnings, which are for those whose education stopped at a Bachelor’s degree and graduated, typically, about 15 years ago. (5%).

“Earnings adjusted by majors (15%). To see whether students at a particular school earn more or less than would be expected given the subjects students choose to study, we adjusted PayScale.com’s data for the mix of majors at each school; for early career earnings (10%) and mid-career earnings (5%).

“College Scorecard 10-year earnings (10%). The earnings of federal financial aid recipients at each college as reported to the IRS 10 years after the student started at the college.

“Estimated market value of alumni’s average job skills (10%). Based on a Brookings Institution methodology, we matched up data provided by LinkedIn of the top 25 skills reported by each school’s alumni with Burning Glass Technologies data on the market value each listed skill.

“Value-added earnings (12.5%). To see if a school is helping launch students to better-paying jobs than competitors that take in students with similar academic and economic backgrounds, we adjusted PayScale.com’s earnings data for the student body’s average test scores and the percentage of low-income students at each school; for early career earnings (7.5%) and mid-career earnings (5%).

Job meaning (5%). We used the average score of each school’s alumni on PayScale.com’s survey question of “Does your work make the world a better place?”

“Socio-economic mobility index (20%).

For the first time, we included new data provided by the Equality of Opportunity Project that reveals the percentage of students each school moves from low-income backgrounds to upper-middle-class jobs by the time the student is 34 years old. Finally, we used statistical techniques to turn all the data points into a single score and ranked the schools based on those scores.” [Emphasis added.]
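Money does not publish the exact statistical procedure it uses to turn the data points into a single score. A standard approach for this kind of aggregation is a weighted sum of standardized (z-scored) metrics; the sketch below assumes that approach, and the function name and sample data are purely illustrative:

```python
from statistics import mean, stdev

def combined_scores(metrics, weights):
    """Combine per-school metrics into a single score per school.

    metrics: {metric_name: {school: value}} -- higher assumed better
    weights: {metric_name: weight}
    Each metric is z-scored across schools, then weighted and summed.
    """
    schools = list(next(iter(metrics.values())))
    scores = {s: 0.0 for s in schools}
    for name, values in metrics.items():
        mu, sigma = mean(values.values()), stdev(values.values())
        for s in schools:
            scores[s] += weights[name] * (values[s] - mu) / sigma
    return scores

# Illustrative inputs only -- not Money's actual data
metrics = {
    "grad_rate": {"A": 90, "B": 70, "C": 80},
    "earnings":  {"A": 60000, "B": 40000, "C": 50000},
}
scores = combined_scores(metrics, {"grad_rate": 0.5, "earnings": 0.5})
ranking = sorted(scores, key=scores.get, reverse=True)   # ['A', 'C', 'B']
```

Metrics where lower is better (e.g., net price) would be negated before z-scoring under this scheme.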

The inclusion of these metrics makes the Money rankings a hybrid of the Washington Monthly “public good” rankings, U.S. News, and Kiplinger rankings, though the socio-economic factors have a less significant impact on overall standing than they do in the Washington Monthly rankings. Still, these factors do result in two CUNY campuses’ receiving high rankings.

“The data showed, for example,” Selingo writes, “that the City University of New York propelled almost six times as many low-income students into the middle class and beyond as all eight Ivy League campuses, plus Duke, M.I.T., Stanford and Chicago, combined. The California State University system advanced three times as many.”

TOP PUBLIC UNIVERSITIES, MONEY MAGAZINE, 2017, BY NAME AND OVERALL RANK INCLUDING PRIVATE INSTITUTIONS:

CUNY Baruch College–2
Michigan–3
UC Berkeley–4
UCLA–5
UC Irvine–7
UC Davis–9
Virginia–11
Washington–13
Georgia Tech–16
Florida–18
Maryland–20
Illinois–22
Virginia Tech–23
College of New Jersey–24
UC Riverside–29
Michigan State–30
UT Austin–31
Binghamton–33
Texas A&M–34
UC Santa Barbara–36
Connecticut–37
Purdue–37 (tie)
VMI–41
Cal State Long Beach–42
CUNY Brooklyn–43
UW Madison–45
James Madison–46
Rutgers, New Brunswick–49
NC State–50

 

Top Honors Programs, Honors Components Only

So, what do we mean by “honors components only”?

In our latest book of honors program ratings, we listed the honors programs and colleges that received an overall five “mortarboard” rating. One component of the rating model used to determine the leading programs is prestigious scholarships–the number of Rhodes, Marshall, Truman, Goldwater, etc., awards earned by students from each university as a whole.

In most cases, honors programs at these universities contribute most of the winners of these awards, but not in all cases. So while the prestigious scholarship component is worth including, we do not want it to override the 12 other rating components used in the ratings. These components are “honors only” because they do not include awards earned by non-honors students of the university as a whole.

Therefore, we decided to do a separate rating, one that is not included in the new book, INSIDE HONORS. The new rating uses only the 12 components listed below. Farther down,  you can see whether the prestigious scholarship component had a major impact on the overall ratings of top programs.

Those 12 additional components are…

  • Curriculum Requirements
  • Number of Honors Classes
  • Number of Honors Classes in 15 Key Disciplines
  • Extent of Honors Enrollment
  • Average Class Size, Honors-only Sections
  • Overall Average Class Size, All Sections
  • Honors Graduation Rate-Raw
  • Honors Graduation Rate-Adjusted for Test Scores
  • Student to Staff Ratio
  • Type and Extent of Priority Registration
  • Honors Residence Halls, Amenities
  • Honors Residence Halls, Availability

Below is a comparison of the honors programs that received a five mortarboard OVERALL RATING (left side) and those that receive the same rating for HONORS COMPONENTS ONLY (right side), all listed ALPHABETICALLY.

OVERALL FIVE MORTARBOARDS     HONORS COMPONENTS ONLY, FIVE MORTARBOARDS
Arizona St                    Clemson
Clemson                       CUNY Macaulay
CUNY Macaulay                 Georgia
Georgia                       Houston
Houston                       Kansas
Kansas                        New Jersey Inst Tech
New Jersey Inst Tech          Oregon
Oregon                        Penn St
Penn St                       South Carolina
South Carolina                Temple
UT Austin                     UT Austin

It is notable that the two lists are nearly identical: Arizona State appears only on the OVERALL list, while Temple appears only on the HONORS COMPONENTS list.

We must add that Temple barely missed a five mortarboard overall rating, while ASU was similarly close to making the honors components list.

Update No. 3: The 2016 Edition Is Coming Soon, with Important Changes

By John Willingham, Editor

The 2016 edition will have a new name– Inside Honors: Ratings and Reviews of 60 Public University Honors Programs. It is in the final proofing stage now. The goal is to publish in late September. Each edition includes a somewhat different group of honors colleges and programs, so there will be changes, even among the 40 or so programs that are reviewed in each edition.

As I have noted in previous updates, the book will take an almost microscopic view of 50 of these programs and also provide more general summary reviews of 10 additional programs. I can say now that there will be a few more programs that will receive the highest overall rating of five “mortarboards” than there were in 2014. (The final list of programs we are rating and reviewing for 2016 is below.)

The rating system makes it possible for any honors college or program, whether a part of a public “elite” or not, to earn the highest rating. Similarly, the ratings allow all types of honors programs to earn the highest rating. Those receiving five mortarboards will include core-type programs with fewer than 1,000 students and large honors programs with thousands of students. And absent any intentional preference for geographical diversity, the list does in fact include programs from north, south, east, and west.

By microscopic, I mean that the rating categories have increased from 9 to 14, and so has the depth of statistical analysis. The categories are, first, the overall honors rating; curriculum requirements; the number of honors classes offered; the number of honors classes in “key” disciplines; the extent of honors participation by all members in good standing; honors-only class sizes; overall class size averages, including mixed and contract sections; honors grad rates, adjusted for admissions test scores; ratio of students to honors staff; type of priority registration; honors residence halls, amenities; honors residence halls, availability; and the record of achieving prestigious scholarships (Rhodes, Marshall, Goldwater, etc.).

Sometimes readers (and critics) ask: Why so few programs? Doesn’t U.S. News report on hundreds of colleges?

The answer is: Honors colleges and programs are complicated. Each one of the 50 rated reviews in the new edition will be 2,500-3,000 words in length, or 7-8 pages. That’s almost 400 pages, not including introductory sections. The rest of the answer is: We are not U.S. News. With myself, one assistant editor, a contract statistician, and an outsourced production firm, our ability to add programs is very limited.

The 2016 profiles are full of numbers, ratios, and averages, more than in 2014 certainly–and too many, I believe, for readers who would prefer more narrative summary and description. So, yes, it is a wonkish book, even to a greater extent than this website tends to be. But then, they are honors programs after all.

Full ratings:

Alabama Honors
Arizona Honors
Arizona State Honors
Arkansas Honors
Auburn Honors
Central Florida Honors
Clemson Honors
Colorado State Honors
Connecticut Honors
CUNY Macaulay Honors
Delaware Honors
Georgia Honors
Georgia State Honors
Houston Honors
Idaho Honors
Illinois Honors
Indiana Honors
Iowa Honors
Kansas Honors
Kentucky Honors
LSU Honors
Maryland Honors
Massachusetts Honors
Minnesota Honors
Mississippi Honors
Missouri Honors
Montana Honors
New Jersey Inst of Tech
New Mexico Honors
North Carolina Honors
Oklahoma Honors
Oklahoma State Honors
Oregon Honors
Oregon State Honors
Penn State Honors
Purdue Honors
South Carolina Honors
South Dakota Honors
Temple Honors
Tennessee Honors
Texas A&M Honors
Texas Tech Honors
UC Irvine Honors
University of Utah Honors
UT Austin Honors
Vermont Honors
Virginia Commonwealth Honors
Virginia Tech Honors
Washington Honors
Washington State Honors

Summary Reviews:

Cincinnati Honors
Florida State Honors
Michigan Honors
New Hampshire Honors
Ohio Univ Honors
Pitt Honors
Rutgers Honors
Virginia Honors
Western Michigan Honors
Wisconsin Honors

 

Update: 2016 Edition of Honors Ratings and Reviews

By John Willingham, Editor

After three months of analyzing data, we are almost at the point of rating at least 50 honors programs, writing their profiles, and adding another 10 or so summary reviews (unrated).

What I can say now is that there will be some significant changes–and some surprises. We are running behind schedule, but I still hope for publication by late September.

Here’s why. The 2014 edition was a great improvement over the 2012 book. In 2012, I was so focused on the importance of honors curriculum and completion requirements, along with the glitz of prestigious scholarships (Rhodes, Marshall, Goldwater, etc.) that the first effort failed to drill deeply into the complexities of honors programs.

The 2014 edition moved the ball forward–about halfway downfield, or more–because I was able to obtain more information from honors deans and directors. I also studied class section data online and derived a lot of useful information about honors-only classes, including average class sizes and a general idea of the disciplines offered.

For the 2016 edition, I knew going in that I needed far more detailed information from the programs themselves to develop precise measures for all class sections (including mixed and contract sections). Fortunately, I have been working with that much better information. The result is that instead of listing the number of honors classes in, say, math, the 2016 edition will report how many sections there are in relation to the total number of honors students.

This approach will have a dramatic impact in some cases. For example, a program with 4 honors math sections might have looked good in the 2014 edition; but if that program has 1,400 enrolled honors students, 4 sections do not look very strong.
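The per-capita comparison can be sketched in a few lines. The 4-section, 1,400-student figures come from the example above; the 500-student program is made up here purely for contrast, and the function name is ours:

```python
def sections_per_1000(n_sections, n_students):
    """Honors sections offered per 1,000 enrolled honors students."""
    return 1000 * n_sections / n_students

large = sections_per_1000(4, 1400)   # about 2.9 sections per 1,000 students
small = sections_per_1000(4, 500)    # 8.0 -- the same 4 sections look far stronger
```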

Another difference will be in the rating for honors class size. In 2014, the most accurate ratings were for honors-only class sizes. But the fact is that many programs offer much of their honors credit via mixed and contract sections. Accurately measuring the class sizes for these sections is extremely difficult when using only the online data. Indeed, there is no information about contract sections online at all. Approximately 60 percent of programs allow credit for honors contracts (basically, doing extra work in a regular section for honors credit). A few use contracts extensively. The new edition will list the average size of contract and mixed sections (honors and non-honors students in the same class).

Finally, another major difference that will have an impact in 2016 is that the rating for honors housing will have a new dimension: one-third of the rating will now be based on the availability of housing space, in addition to the amenities and dorm layout.

So stay tuned!

Florida, Maryland, and Washington Will Soon Use Only the New Coalition App

Three prominent public universities–Florida, Maryland, and Washington–will begin using the application process developed by the Coalition for Access, Affordability, and Success (CAAS), a recently formed consortium of more than 90 leading public and private colleges and universities.

Our guess is that the three schools will opt for the new process in summer 2016. (Note: the University of Washington never previously used the Common App.)

Note: A list of all public universities that were CAAS members as of March 9, 2016, appears below.

According to a Scott Jaschik article in Inside Higher Ed, member schools “are creating a platform for new online portfolios for high school students. The idea is to encourage ninth graders [to] begin thinking more deeply about what they are learning or accomplishing in high school, to create new ways for college admissions officers, community organizations and others to coach them, and to help them emerge in their senior years with a body of work that can be used to help identify appropriate colleges and apply to them. Organizers of the new effort hope it will minimize some of the disadvantages faced by high school students without access to well-staffed guidance offices or private counselors.”

To qualify for membership in the CAAS, as of now, a school must have a six-year graduation rate of 70 percent or higher. Several prominent public universities that qualify have not yet joined, among them all of the University of California institutions, UT Austin, and UW Madison.

Jaschik writes that the UC campuses have not joined because of concerns about the ability of community college transfers to use the process effectively. UC schools have strong and highly successful articulation agreements with the state’s community colleges.

UT Austin questions the fairness of the new process, at least in its initial form. “Associate director of admissions Michael Orr said UT did not apply to the coalition because of criticisms of the programs, including the coalition’s failure to consult with high school counselors,” according to Jameson Pitts, writing for the Daily Texan. 

“The argument within the community … has been that there is a concern that students with means will be the ones that will be able to take advantage of that opportunity the most,” Orr said. He did not rule out the possibility of joining the Coalition if concerns about fairness can be resolved.

Several voices in the higher ed community have opposed the Coalition, saying that students are already over-focused on preparing for college admission and that the new approach will favor more privileged students.

Our question is this: If the new process is designed to help students who cannot afford college counselors and lack effective guidance in their schools, how will the students find out about the process in the first place and learn to use it to good effect?

Whatever the possible shortcomings may be, the CAAS has so far gained the membership of the 36 public universities listed below. It is important to note that only Florida, Maryland, and Washington have decided to use the CAAS process exclusively. The other schools listed below will, as of this date, accept either the Common App or the CAAS process.

Clemson
College of New Jersey
Connecticut
Florida
Georgia
Georgia Tech
Illinois
Illinois St
Indiana
Iowa
James Madison
Mary Washington
Maryland
Miami Ohio
Michigan
Michigan St
Minnesota
Missouri
New Hampshire
North Carolina
North Carolina State
Ohio St
Penn State
Pitt
Purdue
Rutgers
South Carolina
SUNY Binghamton
SUNY Buffalo
SUNY Geneseo
Texas A&M
Vermont
Virginia
Virginia Tech
Washington
William and Mary

Poets & Quants Composite MBA Rankings 2015 List 24 Public Programs in Top 50

The annual composite MBA ranking compiled by John A. Byrne at Poets & Quants combines rankings from the “five most influential rankings and weighs each of them by the soundness of their methodologies” in order to yield “a more credible list of the best MBA programs.”

We like Poets & Quants and Byrne’s rankings and try to write about them each year. The rankings he combines into the composite list are those from U.S. News, Forbes, Bloomberg, the Financial Times, and the Economist.

Here are the public MBA programs listed in the top 50 for 2015, and their composite rank:

8–UC Berkeley Haas

12–Virginia Darden

13–Michigan Ross

14–UCLA Anderson

17–North Carolina Kenan-Flagler

18–UT Austin McCombs

21–Indiana Kelley

22–Washington Foster

25–Michigan State Broad

29–Minnesota Carlson

31–Ohio State Fisher

32–Wisconsin

33–Penn State Smeal

34–Georgia Tech

35–Maryland Smith

36–Arizona State Carey

37–Iowa Tippie

40–Pitt Katz

41–Texas A&M Mays

44–Purdue Krannert

45–Illinois

46–Florida Hough

47–UC Irvine Merage

48–Georgia Terry

50–Temple Fox