How Alive Are the Liberal Arts in Honors Programs?

The short answer: very alive.

After an extended period during which more and more students have felt the need–regardless of personal interest and aptitude–to major in business, engineering, or computer-related fields, the liberal arts, especially the humanities, have faced declining enrollment.

The impact that this trend has had on personal growth and enlightened participation in civic life is evident, given the tone and outcome of the presidential election.

In the meantime, several prominent public universities have endured attacks on their humanities departments and commitment to learning for learning’s sake, most notably UT Austin, Florida universities, and, very recently, UW Madison. Most states have dramatically reduced financial support for their universities; some regents have used the real or manufactured budget crisis as a pretext for attacking non-vocational disciplines.

But the liberal arts, and, yes, the core humanities that are essential to the liberal arts, have survived in public honors colleges and programs. Some students express resentment that, in order to be in an honors program, they must take a series of interdisciplinary seminars and electives in the humanities. Under pressure from parents or highly focused on their chosen vocational discipline, they want “to get on with it” and reach a point where they can start making real money and pay back those student loans.

This is understandable. But honors educators know that almost every bright student is in many ways unformed and searching for paths of meaning in their lives. One course in history, or philosophy, or literature, or maybe in religious studies or film, can guide a student toward a lifetime of serious inquiry, self-reflection, and greater compassion for others. These and other courses in the liberal arts reinforce the application of informed judgment to facts that are often contradictory or in flux.

A consensus is emerging that, for many students, “We don’t need more STEM majors. We need more STEM majors with liberal arts training.” Indeed, this is one of the two or three major advantages of honors programs. STEM majors who otherwise would take few liberal arts courses (and an extremely small number of humanities classes) must take them as members of a university-wide honors college or program.

But one other major–business–could likely benefit even more from greater exposure to the liberal arts and, again, to the humanities.

Recent research indicates that gains in “critical thinking,” measured after adjusting for entrance test scores, are greatest for students in the liberal arts. Engineering and technology students have high raw entrance test scores and strong critical thinking ability, but after adjusting for the effect of those high test scores, their critical thinking skills are relatively lower.

Business majors do not receive high raw or adjusted scores in critical thinking. Given that a plurality of bachelor’s degrees are awarded in business subjects, this is a matter of significant concern.

English is the discipline most frequently offered by honors programs. This is so because many of the required English classes have a heavy writing component, often associated with the study of rhetoric. In these classes the humanities and vocational mastery come together, for the most successful and most fulfilled professionals often have outstanding communication skills and a heightened sensitivity to the thoughts and needs of others.

So what are the “liberal arts”? The answer to this question varies, but here we will include the following disciplines, all of which are traditional core offerings in liberal arts colleges (humanities, natural sciences, and social sciences):

Humanities: English, history, philosophy, fine arts, foreign languages, religious studies, film, classics. Sciences: math, biology, chemistry, physics, geology. Social Sciences: sociology, anthropology, gender studies, psychology, communications, political science, economics, and geography.

(One can see that many of these can be, and often are, “vocational” in themselves.)

Using the above as our “liberal arts,” we used data gathered for our most recent book, Inside Honors, which included 4,460 honors sections. Of these, we found that 59% were in the liberal arts, not counting interdisciplinary seminars, which accounted for another 26% of sections. Most of these seminars had a humanities focus, so about 85% of honors sections were in the liberal arts.
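
As a rough sketch of the arithmetic behind these figures (the counts below are derived from the stated percentages, not taken directly from the book’s data set, and the combined figure treats all of the mostly humanities-focused seminars as liberal arts):

```python
# Rough arithmetic behind the liberal arts share of honors sections.
# Counts are approximations derived from the percentages cited above,
# not exact figures from the Inside Honors data set.

total_sections = 4460

liberal_arts = round(0.59 * total_sections)   # discipline-based liberal arts sections
seminars = round(0.26 * total_sections)       # interdisciplinary seminars, mostly humanities-focused

# Treating the (mostly humanities-focused) seminars as liberal arts as well
# gives the approximate combined share cited above.
combined_share = (liberal_arts + seminars) / total_sections

print(f"Liberal arts sections (approx.): {liberal_arts}")
print(f"Interdisciplinary seminars (approx.): {seminars}")
print(f"Combined share: {combined_share:.0%}")  # about 85%
```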

By discipline, English had the highest percentage of sections, even when sections in business, engineering, and technology are included. Math and business disciplines combined had about the same number of sections as English.

The STEM disciplines are strongly represented, however, accounting for 25% of honors sections. (But the science and math sections counted here are also part of the overall liberal arts group.)

Engineering and technology, considered separately, make up  8% of honors sections. Admittedly, the “regular” courses in these disciplines are usually rigorous enough in themselves.

Not all of the humanities are strongly represented, however, with classics, film, and religious studies combined counting for only 1.4% of honors sections. In fairness, the classics do feature prominently in many interdisciplinary seminars.


Average U.S. News Rankings for 123 Universities: 2012-2019

Updated September 10, 2018, to include new U.S. News rankings for 2019.  Listed below are the yearly rankings and overall average rankings of 123 national universities that were included in the first tier of the U.S. News Best Colleges from 2012 through 2019. There are 61 public and 62 private universities. The list below not only shows the average rankings over this eight-year period but also lists the number of places lost or gained by each university.

U.S. News has changed its methodology, and there are some significant changes, especially after the top 30-35 places in the rankings. Major gains went to Florida, Florida State, Georgia, and most UC campuses.

“New this year, we factored a school’s success at promoting social mobility by graduating students who received federal Pell Grants (those typically coming from households whose family incomes are less than $50,000 annually, though most Pell Grant money goes to students with a total family income below $20,000).” This has shaken up the rankings quite a bit.

More on the methodology changes in a future update.

As a group, the private universities have had an average increase in the rankings of 0.18 places, while the public universities have had an average decline of 3.6 places, demonstrating what we have observed in the past–public universities are, in general, not on an upward trajectory in the rankings. This is a step backward compared with the last set we published, covering the years 2011–2018, but still better than 2010–2017, when public universities lost an average of five places over that eight-year period while private colleges gained two places.
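
For readers who want to check the group figures against the table below, here is a minimal sketch of the calculation; the few schools listed are placeholders drawn from the table’s “Chg 2012 to 2019” column, not the full set of 61 publics and 62 privates.

```python
# Minimal sketch: average change in rank by sector, computed from the
# "Chg 2012 to 2019" column in the table below. Only a few illustrative
# schools are listed here, not the full 123-university data set.

def average_change(changes):
    """Mean rank change; positive means the group moved up, negative means it slipped."""
    return sum(changes) / len(changes)

public_changes = {"UC Berkeley": -1, "Michigan": 1, "Florida": 23, "Penn State": -14}
private_changes = {"Princeton": 0, "Harvard": -1, "Chicago": 2, "Caltech": -7}

print(f"Public average change:  {average_change(public_changes.values()):+.2f}")
print(f"Private average change: {average_change(private_changes.values()):+.2f}")
```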

While we appreciate the massive amount of data that the U.S. News rankings provide on class sizes, grad rates, retention rates, and even selectivity, on the whole the rankings fail to evaluate efficiency (the number of students who receive a high-quality education at a relatively low cost) and should not use selectivity and wealth as metrics.

One reason for a sudden rise in a school’s ranking is increased “gaming” of the rankings. Some institutions, public and private, but mostly the latter, have geared their marketing and merit aid to increase the number of applicants and lower their acceptance rates accordingly. This makes them more “selective” and helps to improve their rankings.

The U.S. News rankings not only over-emphasize the metrics related to a university’s financial resources but also, especially in the last five years or so, reward selectivity when, in fact, the results of the selectivity are already considered. Why should Stanford be rewarded both for having an acceptance rate of 5% and for having high graduation and retention rates, when the latter are largely the result of selectivity? Using test scores as a factor in predicting what grad rates should be is fine, as is rewarding or penalizing schools for exceeding or not meeting such predictions. But the high scores themselves and the low acceptance percentages merely duplicate what is more properly measured by outcomes.

Here are the historical rankings, the average of each school across eight years, and the increase or decline of each school from 2012 through 2019. The universities are listed in order of their average ranking across the years.

Here is the list.

University | 2012 | 2013 | 2014 | 2015 | 2016 | 2017 | 2018 | 2019 | Avg Rank | Chg 2012 to 2019
Princeton | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0
Harvard | 1 | 1 | 2 | 2 | 2 | 2 | 2 | 2 | 1.75 | -1
Yale | 3 | 3 | 3 | 3 | 3 | 3 | 3 | 3 | 3 | 0
Columbia | 4 | 4 | 4 | 4 | 4 | 5 | 5 | 3 | 4.125 | 1
Chicago | 5 | 4 | 5 | 4 | 4 | 3 | 3 | 3 | 3.875 | 2
Stanford | 5 | 6 | 5 | 4 | 4 | 5 | 5 | 7 | 5.125 | -2
MIT | 5 | 6 | 7 | 7 | 7 | 7 | 5 | 3 | 5.875 | 2
Penn | 5 | 8 | 7 | 8 | 9 | 8 | 8 | 8 | 7.625 | -3
Duke | 10 | 8 | 7 | 8 | 8 | 8 | 9 | 8 | 8.25 | 2
Caltech | 5 | 10 | 10 | 10 | 10 | 12 | 10 | 12 | 9.875 | -7
Dartmouth | 11 | 10 | 10 | 11 | 12 | 11 | 11 | 12 | 11 | -1
Johns Hopkins | 13 | 13 | 12 | 12 | 10 | 10 | 11 | 10 | 11.375 | 3
Northwestern | 12 | 12 | 12 | 13 | 12 | 12 | 11 | 10 | 11.75 | 2
Brown | 15 | 15 | 14 | 16 | 14 | 14 | 14 | 14 | 14.5 | 1
Cornell | 15 | 15 | 16 | 15 | 15 | 15 | 14 | 16 | 15.125 | -1
Washington Univ | 14 | 14 | 14 | 14 | 15 | 19 | 18 | 19 | 15.875 | -5
Vanderbilt | 17 | 17 | 17 | 16 | 15 | 15 | 14 | 14 | 15.625 | 3
Rice | 17 | 17 | 18 | 19 | 18 | 15 | 14 | 16 | 16.75 | 1
Notre Dame | 19 | 17 | 18 | 16 | 18 | 15 | 18 | 18 | 17.375 | 1
Emory | 20 | 20 | 20 | 21 | 21 | 20 | 21 | 22 | 20.625 | -2
UC Berkeley | 21 | 21 | 20 | 20 | 20 | 20 | 21 | 22 | 20.625 | -1
Georgetown | 22 | 21 | 20 | 21 | 21 | 20 | 20 | 22 | 20.875 | 0
USC | 23 | 24 | 23 | 25 | 23 | 23 | 21 | 22 | 23 | 1
UCLA | 25 | 24 | 23 | 23 | 23 | 24 | 21 | 19 | 22.75 | 6
Carnegie Mellon | 23 | 23 | 23 | 25 | 23 | 24 | 25 | 25 | 23.875 | -2
Virginia | 25 | 24 | 23 | 23 | 26 | 24 | 25 | 25 | 24.375 | 0
Wake Forest | 25 | 27 | 23 | 27 | 27 | 27 | 27 | 27 | 26.25 | -2
Tufts | 29 | 28 | 28 | 27 | 27 | 27 | 29 | 27 | 27.75 | 2
Michigan | 28 | 29 | 28 | 29 | 29 | 27 | 28 | 27 | 28.125 | 1
North Carolina | 29 | 30 | 30 | 30 | 30 | 30 | 30 | 30 | 29.875 | -1
Boston College | 31 | 31 | 31 | 31 | 30 | 31 | 32 | 38 | 31.875 | -7
NYU | 33 | 32 | 32 | 32 | 32 | 36 | 30 | 30 | 32.125 | 3
William & Mary | 33 | 33 | 32 | 33 | 34 | 32 | 32 | 38 | 33.375 | -5
Brandeis | 31 | 33 | 32 | 35 | 34 | 34 | 34 | 35 | 33.5 | -4
Rochester | 35 | 33 | 32 | 33 | 33 | 32 | 34 | 33 | 33.125 | 2
Georgia Tech | 36 | 36 | 36 | 36 | 36 | 34 | 34 | 35 | 35.375 | 1
Case Western | 38 | 37 | 37 | 38 | 37 | 37 | 37 | 42 | 37.875 | -4
UC San Diego | 37 | 38 | 39 | 37 | 39 | 44 | 42 | 41 | 39.625 | -4
UC Santa Barbara | 42 | 41 | 41 | 40 | 37 | 37 | 37 | 30 | 38.125 | 12
UC Davis | 38 | 38 | 39 | 38 | 41 | 44 | 46 | 38 | 40.25 | 0
Lehigh | 38 | 38 | 41 | 40 | 47 | 44 | 46 | 53 | 43.375 | -15
RPI | 50 | 41 | 41 | 42 | 41 | 39 | 42 | 49 | 43.125 | 1
UC Irvine | 45 | 44 | 49 | 42 | 39 | 39 | 42 | 33 | 41.625 | 12
UW Madison | 42 | 41 | 41 | 47 | 41 | 44 | 46 | 49 | 43.875 | -7
Illinois | 45 | 46 | 41 | 42 | 41 | 44 | 52 | 46 | 44.625 | -1
Boston Univ | 53 | 51 | 41 | 42 | 41 | 39 | 37 | 42 | 43.25 | 11
U of Miami | 38 | 44 | 47 | 48 | 51 | 44 | 46 | 53 | 46.375 | -15
Penn State | 45 | 46 | 37 | 48 | 47 | 50 | 52 | 59 | 48 | -14
Tulane | 50 | 51 | 52 | 54 | 41 | 39 | 40 | 44 | 46.375 | 6
Washington | 42 | 46 | 52 | 48 | 52 | 54 | 56 | 59 | 51.125 | -17
Florida | 58 | 54 | 49 | 48 | 47 | 50 | 42 | 35 | 47.875 | 23
Northeastern | 62 | 56 | 49 | 42 | 47 | 39 | 40 | 44 | 47.375 | 18
UT Austin | 45 | 46 | 52 | 53 | 52 | 56 | 56 | 49 | 51.125 | -4
Pepperdine | 55 | 54 | 57 | 54 | 52 | 50 | 46 | 46 | 51.75 | 9
George Washington | 50 | 51 | 52 | 54 | 57 | 56 | 56 | 63 | 54.875 | -13
Ohio St | 55 | 56 | 52 | 54 | 52 | 54 | 54 | 56 | 54.125 | -1
Yeshiva | 45 | 46 | 47 | 48 | 52 | 66 | 94 | 80 | 59.75 | -35
Fordham | 53 | 58 | 57 | 58 | 66 | 60 | 61 | 70 | 60.375 | -17
Maryland | 55 | 58 | 62 | 62 | 57 | 60 | 61 | 63 | 59.75 | -8
SMU | 62 | 58 | 60 | 58 | 61 | 56 | 61 | 63 | 59.875 | -1
Georgia | 62 | 63 | 60 | 62 | 61 | 56 | 54 | 46 | 58 | 16
Syracuse | 62 | 58 | 62 | 58 | 61 | 60 | 61 | 53 | 59.375 | 9
Connecticut | 58 | 63 | 57 | 58 | 57 | 60 | 56 | 63 | 59 | -5
Purdue | 62 | 65 | 68 | 62 | 61 | 60 | 56 | 56 | 61.25 | 6
WPI | 62 | 65 | 62 | 68 | 57 | 60 | 61 | 59 | 61.75 | 3
Pitt | 58 | 58 | 62 | 62 | 66 | 68 | 68 | 70 | 64 | -12
Clemson | 68 | 68 | 62 | 62 | 61 | 66 | 67 | 66 | 65 | 2
Brigham Young | 71 | 68 | 62 | 62 | 66 | 68 | 61 | 63 | 65.125 | 8
Texas A&M | 58 | 65 | 69 | 68 | 70 | 74 | 69 | 66 | 67.375 | -8
Minnesota | 68 | 68 | 69 | 71 | 69 | 71 | 69 | 76 | 70.125 | -8
Rutgers | 68 | 68 | 69 | 70 | 72 | 70 | 69 | 56 | 67.75 | 12
Virginia Tech | 71 | 72 | 69 | 71 | 70 | 74 | 69 | 76 | 71.5 | -5
Baylor | 75 | 77 | 75 | 71 | 72 | 71 | 75 | 78 | 74.25 | -3
American | 82 | 77 | 75 | 71 | 72 | 71 | 69 | 78 | 74.375 | 4
Iowa | 71 | 72 | 73 | 71 | 82 | 82 | 78 | 89 | 77.25 | -18
Delaware | 75 | 75 | 75 | 76 | 75 | 79 | 81 | 89 | 78.125 | -14
Michigan St | 71 | 72 | 73 | 85 | 75 | 82 | 81 | 85 | 78 | -14
Stevens Inst Tech | 88 | 75 | 82 | 76 | 75 | 71 | 69 | 70 | 75.75 | 18
Col School of Mines | 75 | 77 | 91 | 88 | 75 | 82 | 75 | 80 | 80.375 | -5
Indiana | 75 | 83 | 75 | 76 | 75 | 86 | 90 | 89 | 81.125 | -14
UC Santa Cruz | 75 | 77 | 86 | 85 | 82 | 79 | 81 | 70 | 79.375 | 5
Clark | 94 | 83 | 75 | 76 | 75 | 74 | 81 | 66 | 78 | 28
Miami Oh | 90 | 89 | 75 | 76 | 82 | 79 | 78 | 89 | 82.25 | 1
Marquette | 82 | 83 | 75 | 76 | 86 | 86 | 90 | 89 | 83.375 | -7
UMass Amherst | 94 | 97 | 91 | 76 | 75 | 74 | 75 | 70 | 81.5 | 24
Tulsa | 75 | 83 | 86 | 88 | 86 | 86 | 87 | 106 | 87.125 | -31
TCU | 97 | 92 | 82 | 76 | 82 | 82 | 78 | 80 | 83.625 | 17
Denver | 82 | 83 | 91 | 88 | 86 | 86 | 87 | 96 | 87.375 | -14
Binghamton | 88 | 89 | 97 | 88 | 89 | 86 | 87 | 80 | 88 | 8
Vermont | 82 | 92 | 82 | 85 | 89 | 92 | 97 | 96 | 89.375 | -14
Alabama | 75 | 77 | 86 | 88 | 96 | 103 | 110 | 129 | 95.5 | -54
Colorado | 94 | 97 | 86 | 88 | 89 | 92 | 90 | 96 | 91.5 | -2
San Diego | 97 | 92 | 91 | 95 | 89 | 86 | 90 | 85 | 90.625 | 12
Drexel | 88 | 83 | 97 | 95 | 99 | 96 | 94 | 102 | 94.25 | -14
St. Louis | 88 | 92 | 101 | 99 | 96 | 96 | 94 | 106 | 96.5 | -18
Auburn | 82 | 89 | 91 | 103 | 102 | 99 | 103 | 115 | 98 | -33
Stony Brook | 111 | 92 | 82 | 88 | 89 | 96 | 97 | 80 | 91.875 | 31
Florida St | 101 | 97 | 91 | 95 | 96 | 92 | 81 | 70 | 90.375 | 31
NC State | 101 | 106 | 101 | 95 | 89 | 92 | 81 | 80 | 93.125 | 21
Missouri | 90 | 97 | 97 | 99 | 103 | 111 | 120 | 129 | 105.75 | -39
New Hampshire | 101 | 106 | 97 | 99 | 103 | 107 | 103 | 106 | 102.75 | -5
Tennessee | 101 | 101 | 101 | 106 | 103 | 103 | 103 | 115 | 104.125 | -14
Iowa St | 94 | 101 | 101 | 106 | 108 | 111 | 115 | 119 | 106.875 | -25
Oklahoma | 101 | 101 | 101 | 106 | 108 | 111 | 97 | 124 | 106.125 | -23
Nebraska | 101 | 101 | 101 | 99 | 103 | 111 | 124 | 129 | 108.625 | -28
Univ at Buffalo | 111 | 106 | 109 | 103 | 99 | 99 | 97 | 89 | 101.625 | 22
Loyola Chicago | 119 | 106 | 101 | 106 | 99 | 99 | 103 | 89 | 102.75 | 30
Oregon | 101 | 115 | 109 | 106 | 103 | 103 | 103 | 102 | 105.25 | -1
Pacific | 101 | 106 | 112 | 116 | 108 | 111 | 110 | 106 | 108.75 | -5
Kansas | 101 | 106 | 101 | 106 | 115 | 118 | 115 | 129 | 111.375 | -28
Dayton | 101 | 115 | 112 | 103 | 108 | 111 | 124 | 127 | 112.625 | -26
Illinois Tech | 111 | 113 | 109 | 116 | 108 | 103 | 103 | 96 | 107.375 | 15
South Carolina | 111 | 115 | 112 | 113 | 108 | 107 | 103 | 106 | 109.375 | 5
UC Riverside | 97 | 101 | 112 | 113 | 121 | 118 | 124 | 85 | 108.875 | 12
Michigan Tech | 111 | 120 | 117 | 116 | 123 | 118 | 124 | 136 | 120.625 | -25
Catholic | 119 | 120 | 121 | 116 | 123 | 124 | 120 | 129 | 121.5 | -10
Clarkson | 119 | 115 | 121 | 121 | 115 | 129 | 124 | 102 | 118.25 | 17
Arizona | 124 | 120 | 119 | 121 | 121 | 124 | 124 | 106 | 119.875 | 18
Howard | 111 | 120 | 142 | 145 | 135 | 124 | 110 | 89 | 122 | 22
Colorado St | 128 | 134 | 121 | 121 | 127 | 129 | 124 | 140 | 128 | -12
Kentucky | 124 | 125 | 119 | 129 | 129 | 133 | 133 | 147 | 129.875 | -23
Arizona St | 132 | 139 | 142 | 129 | 129 | 129 | 115 | 115 | 128.75 | 17
Arkansas | 132 | 134 | 128 | 135 | 129 | 135 | 133 | 147 | 134.125 | -15

Update No. 2: It’s Complicated–the 2016 Edition of Honors Ratings and Reviews

By John Willingham, Editor

Honors colleges and programs are complex. If you think about it, how could they not be? Take a (generally) large public research university with many thousands of students, sprawling campuses, hundreds of professors, and the huge football stadium somewhere close at hand–and then create an honors program, or even a college within a college, a hybrid for high achievers who might have gone elsewhere.

Any book that attempts to rate or review honors programs can skim the surface and use only a handful of criteria that are relatively simple to assess, or the book can go inside honors in order to explain the more subtle differences. My first book on honors programs was, in retrospect, simplistic. The second was much more in-depth, but did not capture or explain precisely the many types and actual sizes of honors classes, especially sections that are “mixed” or “contract” sections. (A mixed section has honors students as well as non-honors students, the latter often majors in the discipline; in a typical honors contract section,  only one or two honors students receive credit for doing work in a regular section.)

The third book will be the best, and I hope will do justice to the complexity of honors education. But beware: the new book will be somewhat complicated itself.
(And getting it out is complicated, too. I am hoping for mid-September. There will be 50 in-depth rated reviews, plus either 5 or 10 summary reviews, time permitting.)

A big reason for all this detail involves a prospective student who has received an acceptance letter from the prestigious first-choice private college or public elite–but the need-based aid falls short. The “safe” public university, typically in-state or nearby, now receives more serious attention. It is at this point that the honors program or college can incline a student one way or the other.

It is obvious that prestige often plays a large role when it comes to first and second choices of a college. Now, with the need-based aid falling short, the cost of prestige has become a problem for the prospective student. If the safe school does not have the same prestige, then what exactly does it have that is most important to the student, prestige now set aside? This is the time when parents and students look at the nuts and bolts.

Of course cost is still a huge factor. I will have a much-improved section on merit scholarships at each honors program.

How about small classes, the types of classes, the range of honors classes across disciplines? The data I have this time around are far better than what I was able to obtain for previous editions; the ratings will be much more precise for class size, type, and range.

But this is the main reason the new book will be somewhat complicated itself. In order to define these types of classes, there are additional categories: Number of Honors Sections; Honors Sections in Key Disciplines (15); Level of Enrollment–the extent to which honors students remain active in the programs; Honors-only class sizes, and the percentage of these actually taken; mixed class sizes, with the same information about the percentage of students; and contract sections, also with the percentage.

How about honors housing? Many prestigious private colleges have residence facilities that are outstanding. Now I will report not only the amenities for honors housing but also the availability of that housing. The rating will now show the reader the ratio of honors dorm space to the number of first- and second-year students in the program.
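
As an illustration of how such a ratio could be read (the dorm capacity and enrollment figures below are hypothetical, not data for any actual program):

```python
# Illustrative honors housing availability ratio:
# honors dorm spaces divided by first- and second-year honors enrollment.
# Both figures below are hypothetical.

honors_dorm_spaces = 600
first_and_second_year_students = 1500

availability_ratio = honors_dorm_spaces / first_and_second_year_students
print(f"Honors housing availability: {availability_ratio:.2f} spaces per student")

# A ratio near 1.00 means nearly every first- or second-year honors student
# could live in honors housing; 0.40 means well under half could.
```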

Did I say ratio? Yes, and some of the ratings can veer into wonkish territory. So…please be patient with the details, for they are where the decisions are made. The student who loves and thrives in small classes needs that detail, and the additional information about mixed and contract classes. The student who wants honors seminars and dozens of honors classes in his or her discipline will focus on those details; the student who doesn’t have time for seminars will want the straight-from-the-shoulder program. And the students who not only desire high-quality dorms but actually want to know if there is space in those dorms will focus on that detail.

For many students and families, the merit aid and total cost will be the deciding factors. Notice that I did not say “detail.”

While the idea that an honors program “offers the benefits of the liberal arts experience along with the advantages of a major public research university” is generally true, the ways in which honors programs try to meet this goal vary greatly. The new book will be the best effort yet to light up the ways honors works in public institutions.

Inside Honors: What 9,000 Class Sections Can Tell You

By John Willingham
Editor, Public University Honors

When parents and prospective students (not to mention college junkies) want to “know” about a college, what they want most is to get a sense of what it’s “really like,” the inside story so to speak.

Most college rankings focus only on what can be measured: test scores, class sizes, financial resources, selectivity, grad and retention rates, the salaries graduates can receive. Some non-numerical ratings–the famous Fiske guide, for example–focus less on formal measures and do offer narratives that provide impressionistic glimpses of campus life. Taken together, rankings and good rating guidebooks provide much excellent information.

But surely a big part of the “what’s it really like” story has to be not only the graduation requirements but also the actual classes and coursework required for graduation. How many courses are available in your student’s proposed major? Are there interdisciplinary seminars? How about access to mentors and support for undergraduate research, both more likely if small classes are offered?

Yes, you can read about courses if you work your way through undergraduate catalogues. In some cases there will be course descriptions. But what you probably won’t find in catalogues are the number of sections and the actual enrollment in each one. What I have found during five years of analyzing public honors programs and colleges is that one cannot come close to understanding the real nature of these programs without poring over the actual class sections–and course descriptions.

When the first edition of A Review of Fifty Public University Honors Programs appeared in April 2012, I realized that it was a tentative step in the process of trying to analyze and report on the most important characteristics of honors programs in prominent state universities.

What I failed to understand was just how “tentative” that first effort was.

The original emphasis was on honors curriculum and completion requirements, and the overriding idea was that the more honors classes a student had to take, the more that student would benefit from what I called “honors contacts” at the time.  Honors students would have more contact with professors in smaller honors classes; they would find a ready cohort of serious students like themselves; they would have far more research opportunities, again allowing more contact with professors.

If honors programs sought to provide an Ivy or liberal arts education in the midst of a large public university setting, then the extent of honors contacts within that larger context would measure how well the program was meeting its mission.

I continue to believe the curriculum completion requirements are at the heart of an honors program or college. But those requirements only quantify the total number of credits a student must earn to graduate; they do not speak to the range of honors courses offered in each academic discipline, or to how small the classes really are, or to the type of class experiences that are available (seminars, lectures, labs).  The credit requirements do not yield an impression of how creative a program is or how interesting its courses may be.

In other words, the emphasis on the bare curriculum completion requirements does not get at the heart (some might say guts) of an honors program.

Now, with more than 90 percent of our data for the new 2016 edition in house, we have begun to explore the inside of honors education at 60 public universities, which means a somewhat tedious analysis of data for approximately 9,000 honors class sections.

Here are examples of what we learn from this work:

  1. How to develop basic classifications for the honors programs and colleges. The courses tell us whether a given program is a “core” program, a “blended program,” or a “department-based” program. A relatively small program with small, honors-only seminars along with relatively few set science and math requirements is a core program. Generally larger programs (some with more than 6,000 students) can be “blended” or “department-based.” If blended, they will have a large number of all-honors seminars, perhaps one-third to one-half of the total honors courses available, and the remainder of courses will be more narrowly defined by the academic departments. Department-based programs might offer a few seminars but offer most honors sections through the academic departments. If a blended or department-based program has a lot of “mixed” class sections (honors students plus non-honors students in the same sections), we can then pass along this information to readers, who may or may not care that many sections are mixed.
  2. How to assess the size of class sections. We have actual enrollment levels for the 9,000 class sections we review. This will allow us to tell readers about the overall average class size for all honors sections, including mixed sections, which tend to be larger (see the sketch after this list). From this, readers will gain an idea of how much close interaction with “honors contacts” is likely.
  3. How many honors classes are “contract” or “add-on” sections. Contract sections require an honors student to sign an agreement with the instructor specifying the extra work the student will do to earn honors credit. Most contract sections have only a very few honors students. The same is generally true of “add-on” sections, but these are somewhat more formal in that they are regularly offered term after term and have more established requirements that honors students have to meet to earn honors credit in a regular section. Readers may or may not like the idea of this type of section. Are they less rigorous? Is the flexibility they allow worth it? Our data indicate that in our data set of 60 programs, these types of classes may be about 25 percent of total honors sections. Please note that about two-thirds of programs offer contract or add-on sections for credit, but only five or six offer them on a large scale.
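
Here is a minimal sketch of the kind of per-section analysis described in items 2 and 3 above. The sections, enrollments, and cutoffs below are hypothetical placeholders, not the actual data set or the exact method used for the ratings.

```python
# Hypothetical per-section records: (section type, enrollment in the section).
sections = [
    ("honors-only", 17), ("honors-only", 21), ("honors-only", 15),
    ("mixed", 48), ("mixed", 62),        # mixed sections tend to be larger
    ("contract", 35), ("add-on", 40),    # regular sections with only 1-2 honors students
]

# Average class size across honors-only and mixed sections.
class_sizes = [size for kind, size in sections if kind in ("honors-only", "mixed")]
avg_class_size = sum(class_sizes) / len(class_sizes)

# Share of all sections that are contract or add-on.
contract_share = sum(1 for kind, _ in sections if kind in ("contract", "add-on")) / len(sections)

print(f"Average honors/mixed class size: {avg_class_size:.1f}")
print(f"Contract or add-on share of sections: {contract_share:.0%}")
```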

So…to know what “it’s really like” in honors program A or honors college B, you have to put yourself in the classroom, so to speak, and get a feel for the characteristics and subject matter of those class sections. Do you want the feel of a small, closely-knit program with a well-defined curriculum and rigorous seminars? Do you want the intimacy of seminars but also the nuts and bolts offered by a broad range of departmental honors classes? Or are you mainly interested in having as many class choices in as many disciplines as possible, even if some of your classes will be mixed and relatively larger than the all-honors sections?

Once we have finished our “classroom work,” we should be able to give you a better sense of what 60 prominent honors programs and colleges are, in fact, like.

Florida, Maryland, and Washington Will Soon Use Only the New Coalition App

Three prominent public universities–Florida, Maryland, and Washington–will begin using the application process developed by the Coalition for Access, Affordability, and Success (CAAS), a recently formed consortium of more than 90 leading public and private colleges and universities.

Our guess is that the three schools will opt for the new process in summer 2016.  (Note: the University of Washington never used the Common App previously.)

Note: A list of all public universities listed as CAAS members as of March 9, 2016, is below.

According to a Scott Jaschik article in Insider Higher Ed, member schools “are creating a platform for new online portfolios for high school students. The idea is to encourage ninth graders begin thinking more deeply about what they are learning or accomplishing in high school, to create new ways for college admissions officers, community organizations and others to coach them, and to help them emerge in their senior years with a body of work that can be used to help identify appropriate colleges and apply to them. Organizers of the new effort hope it will minimize some of the disadvantages faced by high school students without access to well-staffed guidance offices or private counselors.”

To qualify, as of now, for membership in the CAAS, a school must have a six-year graduation rate of 70 percent or higher. Several prominent public universities that qualify have not yet joined, among them all of the University of California institutions, UT Austin, and UW Madison.

Jaschik writes that the UC campuses have not joined because of present concerns about the ability of community college transfers to use the process effectively. UC schools have strong and highly successful articulation agreements with the state’s community colleges.

UT Austin questions the fairness of the new process, at least in its initial form. “Associate director of admissions Michael Orr said UT did not apply to the coalition because of criticisms of the programs, including the coalition’s failure to consult with high school counselors,” according to Jameson Pitts, writing for the Daily Texan. 

“The argument within the community … has been that there is a concern that students with means will be the ones that will be able to take advantage of that opportunity the most,” Orr said. He did not rule out the possibility of joining the Coalition if concerns about fairness can be resolved.

Several voices in the higher ed community have opposed the Coalition, saying that students are already over-focused on preparing for college admission and that the new approach will favor more privileged students.

Our question is this: If the new process is designed to help students who cannot afford college counselors and lack effective guidance in their schools, how will the students find out about the process in the first place and learn to use it to good effect?

Whatever the possible shortcomings may be, the CAAS has gained the membership so far of the 36 public universities listed below. It is important to note that only Florida, Maryland, and Washington have decided to use the CAAS process exclusively. The other schools listed below will, as of this date, use either the Common App or the CAAS process.

Clemson
College of New Jersey
Connecticut
Florida
Georgia
Georgia Tech
Illinois
Illinois St
Indiana
Iowa
James Madison
Mary Washington
Maryland
Miami Ohio
Michigan
Michigan St
Minnesota
Missouri
New Hampshire
North Carolina
North Carolina State
Ohio St
Penn State
Pitt
Purdue
Rutgers
South Carolina
SUNY Binghamton
SUNY Buffalo
SUNY Geneseo
Texas A&M
Vermont
Virginia
Virginia Tech
Washington
William and Mary

University of Texas Chancellor Opposes Top 10 Percent Admission Rule

As a former Navy SEAL and admiral in command of all U.S. Special Operations forces, UT System Chancellor Bill McRaven spent three decades serving and leading the most elite military forces in the world. He has made it clear that he now wants the state flagship to join the best of the best among the nation’s public universities.

But after seeing the system flagship turn away thousands of the state’s elite students because they did not make the top 10 percent (actually 7 percent at UT Austin this year) in the graduating classes of the state’s most competitive high schools, the chancellor sees the automatic admission rule as a major obstacle to keeping the brightest students in Texas–and at UT Austin.

“Candidly, right now what is holding us back is the 10 percent rule,” McRaven told state higher ed leaders recently.

The motives of the Top 10 proponents are certainly worthy–to increase the enrollment of high-achieving minority students at UT Austin. But what makes the rule (sort of) work is that it is predicated on the fact that many of the state’s high schools remain almost entirely segregated. Sometimes this is because an entire region is heavily Latino (the Rio Grande Valley); but elsewhere the segregation in urban centers is based on race and income.

Many of these high schools are among the least competitive in the state. Graduating in the top 7 percent of a high school that offers no AP or honors sections and that has low mean test scores is far different from reaching the top 7 percent of a graduating class of 800 students that has 70 National Merit Scholars.

What can happen to suburban students at very competitive schools is that an unweighted high school GPA of 3.9 (high school rank top 11 percent) and an SAT score of 1440 might not make the cut at UT Austin. Three-fourths of the school’s admits are from the top 7 percent pool; the other 25 percent of admits face a pool that is as competitive as many of the nation’s most selective private colleges.

And, McRaven would say, too many of these students are going out of state, where it costs them more and where they might remain rather than return to Texas. Moreover, the chancellor believes the rule is part of the reason that UT Austin, despite having a stellar faculty, is not rated as highly as it should be among the nation’s public universities.

“Candidly, I think we need to take a hard look at some of the ways that we address higher education, particularly at our flagship program. Your flagship, your number one university in the state of Texas is ranked 52nd on the U.S. News & World Report. To me that’s unacceptable. A lot of things drive that. The 10 percent rule drives that,” he told higher ed leaders.

While he did not specify exactly how the rule contributes to lower rankings, the graduation rate metric used by U.S. News might be lower for UT Austin in part because of the relatively lower standards in many poor and mostly segregated high schools. (It is possible that the chancellor also sees the large size of UT Austin as another issue.)

If the chancellor can find a way to maintain or improve minority enrollment and do away with the Top 10 rule, he might prevail. If the U.S. Supreme Court does not scrap the university’s current holistic admissions policy for students outside the top 7 percent, he might have a better chance; otherwise, his task will be as difficult as many he faced as a military leader.

“[The Top 10 Rule] is a very very sensitive topic,” State Rep. Robert Alonzo told McRaven. “It is a topic that we have discussed at length from all different aspects, and I would hope that we have put it to rest for a while.”

McRaven was undeterred. “I am a new chancellor, so I am going to take that opportunity to re-open that look again,” he said. “Because my charge is to make us the very best, and I think there are some obstacles to doing that.”

Alonzo replied: “Well, I accept the challenge, sir.”

Stay tuned, for this could be a big battle indeed.

Do Elite Colleges Really Offer Better Courses? Probably Yes, in Some Ways

Is it actually worth it, in terms of quality classroom learning, to land a place at an elite college or university? This is a question that many families with highly-talented students ask themselves. If their answer is yes, the result is likely to be a concerted, frenzied effort to mold the students in a way that gives them at least a modest chance of admission to such schools. (Of course, for better or worse, the question is often framed as “Is it worth it, in terms of career success, to land a place…”).

Regarding the differences in the quality of classes among all levels of institutions, new research provides some insights. The researchers lean toward minimizing the relationship between academic prestige and quality of instruction–but it appears that some of their own research suggests just the opposite.

In an article titled Are Elite College Courses Better?, Doug Lederman, editor and co-founder of Inside Higher Ed, provides an excellent, mostly neutral summary of the recent research that suggests course quality in a relatively broad range of institutions does not vary as much as the prestige of a given school might suggest.

“Researchers at Teachers College of Columbia University and at Yeshiva University… believe they are developing a legitimate way to compare the educational quality of courses across institutions,” Lederman writes, “and their initial analysis, they say, ‘raises questions about the value of higher-prestige institutions in terms of their teaching quality.'”

The researchers suggest that the drive to enhance prestige based on rankings and selectivity has led to “signaling”–branding, perceptions–increasingly divorced from the actual quality of classroom instruction. The laudable aim of the researchers is to turn the conversation away from college rankings and the metrics that drive them, and toward measurements of effective, challenging instruction.

Trained faculty observers visited nine colleges and 600 classes. Three of the nine had high prestige; two had medium prestige; and four had low prestige. The schools were both public and private, with differing research and teaching emphases. We should note that there was no list of which schools were in each category, so we do not know exactly how the researchers defined “elite.” It appears likely, however, that many leading public research universities would be considered elite.

“Teaching quality was defined as instruction that displayed the instructor’s subject matter knowledge, drew out students’ prior knowledge and prodded students to wrestle with new ideas, while academic rigor was judged on the ‘cognitive complexity’ and the ‘level of standards and expectations’ of the course work,” Lederman writes.

“But they found that on only one of the five measures, cognitive complexity of the course work, did the elite colleges in the study outperform the non-elite institutions.”

First, we note that highly-qualified honors students at almost all colleges, including many less prestigious public universities, are far more likely to encounter more “cognitive complexity” in their honors courses. Whether this results from having more depth or breadth in actual assignments, from taking harder courses early on, or from engaging in more challenging interactions with similarly smart students and the best faculty, the learning experience in honors embraces complexity.

We also have to agree with one of the longest and most thoughtful comments posted on Lederman’s article, by one “catorenasci”:

“Well, is [more cognitive complexity] a surprise to anyone? After all…on average the students at elite colleges and universities (private or public) have demonstrated higher cognitive ability than the students at less prestigious colleges and universities. Which means that the faculty can teach at a level of greater cognitive complexity without losing (many) students.”

The full comment from “catorenasci” also seems to be on the mark when it comes to improved instruction in all other measured areas on the part of colleges with less prestige, regardless of honors affiliation.

“As for the level of ‘teaching quality’ based on faculty knowledge, given the job market today, it should hardly be surprising that it has equaled out since there are many top quality candidates for even less prestigious positions and overall, I would suspect that the ‘quality’ of the PhD’s of faculty at less elite schools is much closer to that of elite schools than it was during the ’50s and ’60s when higher education was expanding rapidly and jobs were plentiful.

“The transformational aspect should not be surprising either: assuming faculty are competent and dedicated, with less able students they will work harder to draw out what they know and build on it. And, it will be more likely that students will experience significant growth as the faculty do this.”

New Coalition for Access and Affordability: A Revolution in Admissions?

The Coalition for Access and Affordability is a new group of 80-plus colleges and universities, all with six-year grad rates of 70 percent and higher, and all apparently committed to transforming the admissions process at high-profile institutions. Among the members are all Ivy League schools, top liberal arts colleges, and many leading public universities. So far, the UC System and the UT System are not listed as members.

Note: A link showing coalition members is at the end of this post.

What exactly all of this means for the Common App is uncertain. For now, it appears that coalition members will use it.

“What the emergence of a new rival might mean for the Common Application could become an intriguing storyline over the next few years,” the Chronicle of Higher Ed reports. The standardized admissions form used by more than 600 colleges worldwide has long dominated the college-admissions realm.

“But it’s raising the college-access flag, too. Recently, the organization bolstered the college-planning resources for students on its website, including information specifically for middle-school students and ninth graders. ‘It’s planning to roll out ‘virtual counselor’ materials, including articles and videos that answer specific questions about the application process,” said Aba G. Blankson, director of communications for the Common Application.”

Questions remain about the mission and intentions of the coalition. One dean of admissions told the Chronicle of Higher Ed that “I’m not convinced about the true intentions of the coalition. The schools participating in this effort should not mask their intentions on the guise of ‘access.’ It’s a deceiving marketing ploy… ”

As usual, Nancy Griesemer, writing for the Washington Examiner, has written an excellent post on the hot topic.

“In a nutshell,” she writes, “the Coalition is developing a free platform of online college planning and application tools. The tools will include a digital portfolio, a collaboration platform, and an application portal.

“High school students will be encouraged to add to their online portfolios beginning in the ninth grade examples of their best work, short essays, descriptions of extracurricular activities, videos, etc. Students could opt to share or not share all or part of their portfolios with college admissions or counseling staff and ‘community mentors.'” [Emphasis added.]

The planning site and portfolio portals are supposed to be open to high school students in January 2016, and the supposition is that coalition members will be using the data then.

“Billed as a system designed to have students think more deeply about what they are learning or accomplishing in high school by the development of online portfolios, the new endeavor will actually create efficient ways for college admissions officers to access more detailed information about prospective applicants earlier in the game,” Griesemer writes.

“The coalition application is an interesting concept, but begs the question of who will benefit more from the information-sharing plan—high school students or colleges. And while the plan is promoted as helping students—particularly disadvantaged students—to present themselves to colleges in a more robust manner, it seems likely that students able to afford early college coaching may actually benefit quite a bit from being able to post their accomplishments on a platform viewed and commented on by admissions staff.” [Emphasis added.]

Here is a link showing coalition members as of this date.

Expert on International Higher Ed: “Un-Excellence” Initiatives Aimed at Public Universities Harm U.S. Standing in the World

Editor’s note: This post updated on October 1, 2016, after release of Times Higher Ed Rankings for 2016.

It is likely that Philip G. Altbach, a research professor and the founding director of the Center for International Higher Education at Boston College, has the sharpest eye of anyone in America when it comes to seeing how well U.S. universities compare with rapidly improving institutions throughout the world. What he sees is not good.

U.S. public universities are losing ground to foreign institutions, most notably in Europe and Scandinavia.

(Below the following text are three tables. The first compares the Times Higher Ed world rankings of U.S. universities and those throughout the world for the years 2011 and 2016. The second shows the decline in rankings for U.S. public universities for the same years. The third and final table shows the rise in rankings for U.S. private universities.)

Altbach cites the work of colleague Jamil Salmi, who found that there are at least 36 “excellence initiatives” around the world “that have pumped billions of dollars into the top universities in these countries — with resulting improvements in quality, research productivity, and emerging improvements in the rankings of these universities. Even in cash-strapped Russia, the ‘5-100’ initiative is providing $70 million into each of 15 selected universities to help them improve and compete globally. [Emphasis added.]

“At the same time, American higher education is significantly damaging its top universities through continuous budget cuts by state governments. One might call this an American “unExcellence initiative” as the world’s leading higher education systematically damages its top research universities. Current developments are bad enough in a national context, but in a globalized world, policies in one country will inevitably have implications elsewhere. Thus, American disinvestment, coming at the same time as significant investment elsewhere, will magnify the decline of a great higher education system.”

One reason: All the bad publicity about cuts in funding, along with high-profile political grandstanding that has received far too much attention throughout the world academic community. For example, UT Austin endured years of highly publicized attacks by former Governor Perry during his second term, and UW Madison has been hurt by similar actions on the part of Governor Scott Walker.

Unless state legislatures move toward excellence and restore pre-Great Recession funding levels, current trends will create “a revolution in global higher education and create space for others at the top of the rankings. It would also be extraordinarily damaging for American higher education and for America’s competitiveness in the world.”

The average ranking for the 23 U.S. publics among the top 100 in the world in 2011 was 39th, while in 2016 it was 49th–and only 15 publics were among the top 100. Meanwhile, leading U.S. private universities have seen both their average reputation rankings and overall rankings rise since 2011.

To illustrate Professor Altbach’s point, we have generated the tables below.

This table compares the Times Higher Ed world  rankings of U.S. universities and those throughout the world for the years 2011 and 2016.

Top 100 Overall | 2011 | 2011 Public | 2016 | 2016 Public | Gain/Loss
U.S. | 53 | 23 | 39 | 15 | -14
U.K./Ireland | 16 | n/a | 16 | n/a | 0
Europe/Scand | 12 | n/a | 25 | n/a | 13
Asia | 10 | n/a | 9 | n/a | -1
Australia | 5 | n/a | 6 | n/a | 1
Canada | 4 | n/a | 4 | n/a | 0

The following table shows the decline in rankings for U.S. public universities for the years 2011 and 2016. UC Davis is the only school to rise, while most dropped significantly.

Times Higher Ed Rankings | Rep Rank 2011 | Overall Rank 2011 | Rep Rank 2016 | Overall Rank 2016
UC Berkeley | 4 | 8 | 6 | 13
UCLA | 12 | 11 | 13 | 16
Michigan | 13 | 15 | 19 | 21
Illinois | 21 | 33 | 30 | 36
Wisconsin | 25 | 43 | 38 | 50
Washington | 26 | 23 | 33 | 32
UC San Diego | 30 | 32 | 41 | 39
UT Austin | 31 | 29 | 46 | 46
UC Davis | 38 | 54 | 44 | 44
Georgia Tech | 39 | 27 | 49 | 41
UC Santa Barbara | 55 | 29 | 65 | 39
North Carolina | 41 | 30 | 65 | 63
Minnesota | 43 | 52 | 75 | 65
Purdue | 47 | 106 | 75 | 113
Ohio State | 55 | 66 | 75 | 90
Pitt | 55 | 64 | 75 | 79
Averages | 33 | 39 | 47 | 49

This last table shows the rise in rankings for leading U.S. private universities for the years 2011 and 2016.

Times Higher Ed Rankings | Rep Rank 2011 | Overall Rank 2011 | Rep Rank 2016 | Overall Rank 2016
Harvard | 1 | 1 | 1 | 6
MIT | 2 | 3 | 4 | 5
Stanford | 5 | 4 | 5 | 3
Princeton | 7 | 5 | 7 | 7
Yale | 9 | 10 | 8 | 11
Caltech | 10 | 2 | 9 | 1
Johns Hopkins | 14 | 13 | 18 | 12
Chicago | 15 | 12 | 11 | 10
Cornell | 16 | 14 | 20 | 18
Penn | 22 | 19 | 23 | 17
Columbia | 23 | 18 | 10 | 15
Carnegie Mellon | 28 | 20 | 28 | 22
Duke | 36 | 24 | 34 | 20
Northwestern | 40 | 25 | 47 | 25
NYU | 55 | 60 | 20 | 30
Averages | 19 | 15 | 16 | 14

U.S. News Top 20 Publics 2009-2016: Resources, Grad Rates, Class Size

The annual U.S. News Best Colleges rankings are extremely useful and interesting, but not exactly because of the rankings themselves.

The rankings do not so much answer questions as raise them. Why did a college or university with a strong academic reputation and grad rates not do better? Why did another college with a weak academic reputation and so-so grad rates do so well? How much does class size matter?

Of course the rankings also provide great and useful stats about the questions raised above and about the test score ranges and high school GPA’s of enrolled students.

But when it comes to the rankings, what should matter most are grad and retention rates, class sizes, academic reputation, and, most problematically, the financial resources of colleges. The last should not be counted because it is the “input” factor for the other things measured as “outputs” and therefore only magnifies the impact of money.

The table below shows how much impact “financial resources” can have on rankings. Only the College of William and Mary seems invulnerable to the typically damaging impact of low financial resources. The main reasons are high grad rates, the 4th highest percentage (among top publics) of classes with 20 or fewer students, and the lowest percentage of classes with 50 or more students. Of course, William and Mary is, by far, the smallest of the leading publics.

The University of Washington has the most puzzling combination: a financial resources ranking significantly higher than the ranking of the school. For a while now, the numbers for UW have looked odd to us, although the school itself is at least on a par with Wisconsin, Illinois, and UT Austin.

UT Austin is a more typical case. The university is penalized not only for having relatively low financial resources (per student) but also because the resulting outputs (especially class sizes) are well below the mean. That UT Austin is too large to be ranked in proportion to its academic reputation may be the fault of the powers that be in Texas; but that it is penalized as well because of the low score in financial resources is, in effect, a double whammy.

The other side of the coin, so to speak, is that magnifying the wealth metric is a built-in boost to many private universities. Most public universities, however, do relatively well only in spite of the magnification of the wealth factor.

Because of the width of the table, readers will need to scroll horizontally to see the last one or two columns. If you want to track a single school while scrolling, just select the entire row and then scroll.

National University | 2009 | 2016 | Avg Rank | Chg 09 to 16 | Grad Rate % | Classes <20 % | Classes >50 % | Peer Rating | Gr/Ret Rank | $$$ Rank
Berkeley | 21 | 20 | 20.75 | 1 | 91 | 59 | 15 | 4.7 | 23 | 39
UCLA | 25 | 23 | 24 | 2 | 91 | 50.9 | 22 | 4.2 | 23 | 20
Virginia | 23 | 26 | 24.125 | -3 | 94 | 55 | 15 | 4.2 | 11 | 67
Michigan | 26 | 29 | 28.125 | -3 | 91 | 48.1 | 18 | 4.4 | 23 | 41
N. Carolina | 30 | 30 | 29.625 | 0 | 90 | 39.2 | 15 | 4 | 28 | 32
Wm Mary | 32 | 34 | 32.625 | -2 | 90 | 47.5 | 9 | 3.7 | 28 | 113
Ga Tech | 35 | 36 | 35.625 | -1 | 82 | 38.6 | 25 | 4.1 | 49 | 44
UCSD | 35 | 39 | 36.875 | -4 | 86 | 38.1 | 36 | 3.8 | 36 | 21
UC Davis | 44 | 41 | 39.875 | 3 | 87 | 34.8 | 27 | 3.8 | 47 | 32
UCSB | 44 | 37 | 40.75 | 7 | 87 | 49.4 | 18 | 3.5 | 54 | 67
Wisconsin | 35 | 41 | 41.375 | -6 | 85 | 45.8 | 20 | 4.1 | 40 | 62
Illinois | 40 | 41 | 42.625 | -1 | 84 | 41.7 | 20 | 3.9 | 40 | 56
UC Irvine | 44 | 39 | 43.75 | 5 | 87 | 58.3 | 20 | 3.6 | 36 | 51
Washington | 41 | 52 | 45.5 | -11 | 84 | 35.1 | 22 | 3.8 | 54 | 32
Penn State | 47 | 47 | 45.5 | 0 | 86 | 38.4 | 15 | 3.6 | 38 | 56
UT Austin | 47 | 52 | 48.375 | -5 | 81 | 36.5 | 25 | 4 | 58 | 78
Florida | 49 | 47 | 50.625 | 2 | 88 | 49.1 | 16 | 3.6 | 34 | 45
Ohio St | 56 | 52 | 54.25 | 4 | 83 | 29.6 | 22 | 3.7 | 49 | 71
Maryland | 53 | 57 | 57 | -4 | 86 | 45.4 | 16 | 3.6 | 40 | 83
Georgia | 58 | 61 | 60 | -3 | 85 | 38.8 | 12 | 3.4 | 40 | 120
Mean Avg | 39.25 | 40.2 | 40.06875 | -0.95 | 86.9 | 43.965 | 19.4 | 3.885 | 37.55 | 56.5