The website Quartz just published a list of the universities that place the highest number of grads at tech firms in Silicon Valley.
“The most coveted jobs are in Silicon Valley, and the most selective US universities are members of the Ivy League. So it stands to reason that tech giants like Apple, Google, Amazon and Facebook would scoop up the best and brightest from those bastions of power and privilege.
“Think again. None of the eight Ivy League schools—Harvard, Yale, Princeton, Brown, Columbia, Cornell, Dartmouth and the University of Pennsylvania—cracked the top 10 on a list of the universities sending the most graduates to tech firms, according to an analysis by HiringSolved, an online recruiting company. The company used data from more than 10,000 public profiles for tech workers hired or promoted into new positions in 2016 and the first two months of 2017.”
Editor’s note: The HiringSolved link also lists the 10 specific skills most in demand as of 2017, with changes from 2016. For example, the top four skills for entry-level placement in 2017 are Python, C++, Java, and algorithms. The top job titles for entry-level placement in 2017 are Software Engineering Intern, Software Engineer, Business Development Consultant, and Research Intern.
Now let it be said that the 17 public universities in the top 25 are generally much larger than the private institutions on the list, so the sheer volume of highly trained tech grads from the publics is much greater.
But the final message from Quartz was this:
“If the list tells us anything, it’s that admission to an elite university isn’t a prerequisite for a career in Silicon Valley, and what you know is more important than where you learn it.” [Emphasis added.]
Here are the top 25 universities for Silicon Valley tech placement, in numerical order:
San Jose State
UC San Diego
Univ of Phoenix*
UC Santa Barbara
*Hypothesis: hands-on experience and later degrees?
The new rankings from Money are out, and public colleges and universities account for 27 of the top 50 best values in 2017. These are arguably the best overall college rankings, given their balanced weighting of educational quality, affordability, and outcomes.
As Jeffrey J. Selingo writes in the Washington Post, the earnings portion of the rankings is based in part on some very interesting new evidence: the “Chetty data.”
“That refers to Raj Chetty,” Selingo tells us, “a Stanford professor, who has led a team of economists that has received access to millions of anonymous tax records that span generations. The group has published several headline-grabbing studies recently based on the data. In findings published in January, the group tracked students from nearly every college in the country and measured their earnings more than a decade after they left campus, whether they graduated or not.”
Money does a better job of ranking colleges based on “outcomes” than Forbes does (see Outcomes farther down). This is especially the case with the multiple earnings analyses.
To see the list of top publics, please skip the methodology discussion immediately below.
The 2017 rankings include 27 factors in three categories:
Quality of education (1/3 weighting), which was calculated using:
Six-year graduation rate (30%).
Value-added graduation rate (30%). “This is the difference between a school’s actual graduation rate and its expected rate, based on the economic and academic profile of the student body (measured by the percentage of attendees receiving Pell grants, which are given to low-income students, and the average standardized test scores of incoming freshmen).” [Emphasis added.]
“Peer quality (10%). This is measured by the standardized test scores of entering freshmen (5%), and the percentage of accepted students who enroll in that college, known as the “yield” rate (5%).” Note: using the yield rate is an improvement over the U.S. News rankings.
“Instructor quality (10%). This is measured by the student-to-faculty ratio.” Note: this is very similar to a U.S. News metric.
“Financial troubles (20%). This is a new factor added in 2017, as financial difficulties can affect the quality of education, and a growing number of schools are facing funding challenges.” Note: although this is not an “outcome” either, it is more meaningful than using data on alumni contributions, etc.
Affordability (1/3 weighting), which was calculated using:
“Net price of a degree (30%). This is the estimated amount a typical freshman starting in 2017 will pay to earn a degree, taking into account the college’s sticker price; how much the school awards in grants and scholarships; and the average time it takes students to graduate from the school, all as reported to the U.S. Department of Education….This takes into account both the estimated average student debt upon graduation (15%) and average amount borrowed through the parent federal PLUS loan programs (5%).
“Student loan repayment and default risk (15%).
“Value-added student loan repayment measures (15%). These are the school’s performance on the student loan repayment and default measures after adjusting for the economic and academic profile of the student body.
Affordability for low-income students (20%). This is based on federally collected data on the net price that students from families earning $0 to $30,000 pay.
Outcomes (1/3 weighting), which was calculated using:
“Graduates’ earnings (12.5%), as reported by alumni to PayScale.com: early-career earnings within five years of graduation (7.5%) and mid-career earnings for those whose education stopped at a bachelor’s degree and who graduated, typically, about 15 years ago (5%).
“Earnings adjusted by majors (15%). To see whether students at a particular school earn more or less than would be expected given the subjects students choose to study, we adjusted PayScale.com’s data for the mix of majors at each school; for early career earnings (10%) and mid-career earnings (5%).
“College Scorecard 10-year earnings (10%). The earnings of federal financial aid recipients at each college as reported to the IRS 10 years after the student started at the college.
“Estimated market value of alumni’s average job skills (10%). Based on a Brookings Institution methodology, we matched up data provided by LinkedIn on the top 25 skills reported by each school’s alumni with Burning Glass Technologies data on the market value of each listed skill.
“Value-added earnings (12.5%). To see if a school is helping launch students to better-paying jobs than competitors that take in students with similar academic and economic backgrounds, we adjusted PayScale.com’s earnings data for the student body’s average test scores and the percentage of low-income students at each school; for early career earnings (7.5%) and mid-career earnings (5%).
Job meaning (5%). We used the average score of each school’s alumni on PayScale.com’s survey question of “Does your work make the world a better place?”
“Socio-economic mobility index (20%).
For the first time, we included new data provided by the Equality of Opportunity Project that reveals the percentage of students each school moves from low-income backgrounds to upper-middle-class jobs by the time the student is 34 years old. Finally, we used statistical techniques to turn all the data points into a single score and ranked the schools based on those scores.” [Emphasis added.]
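To make the weighting arithmetic concrete, here is a minimal sketch, in Python, of how a composite score of this kind can be computed. This is our own illustration, not Money’s actual code: the sub-scores are hypothetical 0-100 values, and since the quoted Outcomes sub-weights total less than 100%, each category is normalized by the sum of its weights.

```python
def category_score(factors):
    """Weighted average of (score, weight) pairs, normalized by total weight."""
    total_weight = sum(w for _, w in factors.values())
    return sum(score * w for score, w in factors.values()) / total_weight

# Hypothetical 0-100 sub-scores for one school; the weights are the
# within-category percentages quoted above.
quality = {
    "six_year_grad_rate":       (85, 0.30),
    "value_added_grad_rate":    (70, 0.30),
    "peer_quality":             (60, 0.10),
    "instructor_quality":       (65, 0.10),
    "financial_troubles":       (75, 0.20),
}
affordability = {
    "net_price_of_degree":      (72, 0.30),
    "student_debt":             (68, 0.15),
    "parent_plus_borrowing":    (74, 0.05),
    "repayment_default_risk":   (80, 0.15),
    "value_added_repayment":    (66, 0.15),
    "low_income_affordability": (77, 0.20),
}
outcomes = {
    "graduate_earnings":        (64, 0.125),
    "earnings_by_major":        (70, 0.15),
    "scorecard_10yr_earnings":  (62, 0.10),
    "skills_market_value":      (58, 0.10),
    "value_added_earnings":     (69, 0.125),
    "job_meaning":              (81, 0.05),
    "mobility_index":           (73, 0.20),
}

# Each category counts one-third toward the final score; schools are then
# ranked by this composite value.
composite = sum(category_score(c) for c in (quality, affordability, outcomes)) / 3
print(f"Composite score: {composite:.1f}")
```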
The inclusion of these metrics makes the Money rankings a hybrid of the Washington Monthly “public good” rankings, U.S. News, and the Kiplinger rankings, with the socio-economic factors having a less significant impact on overall standing than they do in the Washington Monthly rankings. Still, these factors result in high rankings for two CUNY campuses.
“The data showed, for example,” Selingo writes, “that the City University of New York propelled almost six times as many low-income students into the middle class and beyond as all eight Ivy League campuses, plus Duke, M.I.T., Stanford and Chicago, combined. The California State University system advanced three times as many.”
TOP PUBLIC UNIVERSITIES, MONEY MAGAZINE, 2017, BY NAME AND OVERALL RANK INCLUDING PRIVATE INSTITUTIONS:
CUNY Baruch College–2
College of New Jersey–24
UC Santa Barbara–36
Cal State Long Beach–42
Rutgers, New Brunswick–49
In our latest book of honors program ratings, we listed the honors programs and colleges that received an overall five “mortarboard” rating. One component of the rating model used to determine the leading programs is prestigious scholarships–the number of Rhodes, Marshall, Truman, Goldwater, etc., awards earned by students from each university as a whole.
At most of these universities, honors programs contribute most of the winners of these awards, but not at all of them. So while the prestigious scholarship component is worth including, we do not want it to override the other 12 rating components, which are “honors only” because they do not reflect the achievements of non-honors students or of the university as a whole.
Therefore, we decided to do a separate rating, one that is not included in the new book, INSIDE HONORS. The new rating uses only the 12 components listed below. Farther down, you can see whether the prestigious scholarship component had a major impact on the overall ratings of top programs.
Those 12 components are:
Curriculum Requirements
Number of Honors Classes
Number of Honors Classes in 15 Key Disciplines
Extent of Honors Enrollment
Average Class Size, Honors-only Sections
Overall Average Class Size, All Sections
Honors Graduation Rate-Raw
Honors Graduation Rate-Adjusted for Test Scores
Student to Staff Ratio
Type and Extent of Priority Registration
Honors Residence Halls, Amenities
Honors Residence Halls, Availability
Below is a comparison of the honors programs that received a five mortarboard OVERALL RATING (left side) and those that received the same rating for HONORS COMPONENTS ONLY (right side), all listed ALPHABETICALLY.
OVERALL FIVE MORTARBOARDS
HONORS ONLY COMPONENTS, FIVE MORTARBOARDS
New Jersey Inst Tech
New Jersey Inst Tech
It is notable that the two lists are nearly identical: Arizona State is on the OVERALL list but not the HONORS COMPONENTS list, while Temple is on the HONORS COMPONENTS list but not the OVERALL list.
We must add that Temple barely missed a five mortarboard overall rating, while ASU was similarly close to making the honors components list.
The 2016 edition will have a new name–Inside Honors: Ratings and Reviews of 60 Public University Honors Programs. It is in the final proofing stage now. The goal is to publish in late September. Each edition includes a somewhat different group of honors colleges and programs, so there will be changes, even among the 40 or so programs that are reviewed in each edition.
As I have noted in previous updates, the book will take an almost microscopic view of 50 of these programs and also provide more general summary reviews of 10 additional programs. I can say now that there will be a few more programs that will receive the highest overall rating of five “mortarboards” than there were in 2014. (The final list of programs we are rating and reviewing for 2016 is below.)
The rating system makes it possible for any honors college or program, whether part of a public “elite” or not, and of any type or size, to earn the highest rating. Those receiving five mortarboards will include core-type programs with fewer than 1,000 students and large honors programs with thousands of students. And absent any intentional preference for geographical diversity, the list does in fact include programs from north, south, east, and west.
By microscopic, I mean that the rating categories have increased from 9 to 14, and so has the depth of statistical analysis. The categories are, first, the overall honors rating; curriculum requirements; the number of honors classes offered; the number of honors classes in “key” disciplines; the extent of honors participation by all members in good standing; honors-only class sizes; overall class size averages, including mixed and contract sections; honors grad rates, adjusted for admissions test scores; ratio of students to honors staff; type of priority registration; honors residence halls, amenities; honors residence halls, availability; and the record of achieving prestigious scholarships (Rhodes, Marshall, Goldwater, etc.).
Sometimes readers (and critics) ask: Why so few programs? Doesn’t U.S. News report on hundreds of colleges?
The answer is: Honors colleges and programs are complicated. Each of the 50 rated reviews in the new edition will be 2,500-3,000 words in length, or 7-8 pages. That’s almost 400 pages, not including introductory sections. The rest of the answer is: We are not U.S. News. With myself, one assistant editor, a contract statistician, and an outsourced production firm, our ability to add programs is very limited.
The 2016 profiles are full of numbers, ratios, and averages, more than in 2014 certainly–and too many, I believe, for readers who would prefer more narrative summary and description. So, yes, it is a wonkish book, even to a greater extent than this website tends to be. But then, they are honors programs after all.
Arizona State Honors
Central Florida Honors
Colorado State Honors
CUNY Macaulay Honors
Georgia State Honors
New Jersey Inst of Tech
New Mexico Honors
North Carolina Honors
Oklahoma State Honors
Oregon State Honors
Penn State Honors
South Carolina Honors
South Dakota Honors
Texas A&M Honors
Texas Tech Honors
UC Irvine Honors
University of Utah Honors
UT Austin Honors
Virginia Commonwealth Honors
Virginia Tech Honors
Washington State Honors
Florida State Honors
New Hampshire Honors
Ohio Univ Honors
Western Michigan Honors
After three months of analyzing data, we are almost at the point of rating at least 50 honors programs, writing their profiles, and adding another 10 or so summary reviews (unrated).
What I can say now is that there will be some significant changes–and some surprises. We are running behind schedule, but I still hope for publication by late September.
Here’s why. The 2014 edition was a great improvement over the 2012 book. In 2012, I was so focused on the importance of honors curriculum and completion requirements, along with the glitz of prestigious scholarships (Rhodes, Marshall, Goldwater, etc.) that the first effort failed to drill deeply into the complexities of honors programs.
The 2014 edition moved the ball forward–about halfway downfield, or more–because I was able to obtain more information from honors deans and directors. I also studied class section data online and derived a lot of useful information about honors-only classes, including average class sizes and a general idea of the disciplines offered.
For the 2016 edition, I knew going in that I needed far more detailed information from the programs themselves to develop precise measures for all class sections (including mixed and contract sections). Fortunately, I have been able to work with that much better information. The result is that instead of listing the number of honors classes in, say, math, the 2016 edition will report how many sections there are in relation to the total number of honors students.
This approach will have a dramatic impact in some cases. For example, Program A’s 4 honors math sections might have looked good in the 2014 edition; but if Program A has 1,400 enrolled honors students, 4 sections do not look very strong.
Another difference will be in the rating for honors class size. In 2014, the most accurate ratings were for honors-only class sizes. But the fact is that many programs offer much of their honors credit via mixed and contract sections. Accurately measuring the class sizes for these sections is extremely difficult when using only the online data. Indeed, there is no online section information at all about contract sections. Approximately 60 percent of programs allow credit for honors contracts (basically, doing extra work in a regular section for honors credit). A few use contracts extensively. The new edition will list the average size of contract and mixed sections (honors and non-honors students in the same class).
Finally, another major difference that will have an impact in 2016 is that the rating for honors housing will have a new dimension: one-third of the rating will now be based on the availability of housing space, in addition to the amenities and dorm layout.
Our guess is that the three schools (Florida, Maryland, and Washington; see below) will opt for the new process in summer 2016. (Note: the University of Washington never used the Common App previously.)
Note: A list of all public universities listed as CAAS members as of March 9, 2016, is below.
According to a Scott Jaschik article in Inside Higher Ed, member schools “are creating a platform for new online portfolios for high school students. The idea is to encourage ninth graders to begin thinking more deeply about what they are learning or accomplishing in high school, to create new ways for college admissions officers, community organizations and others to coach them, and to help them emerge in their senior years with a body of work that can be used to help identify appropriate colleges and apply to them. Organizers of the new effort hope it will minimize some of the disadvantages faced by high school students without access to well-staffed guidance offices or private counselors.”
To qualify, as of now, for membership in the CAAS, a school must have a six-year graduation rate of 70 percent or higher. Several prominent public universities that qualify have not yet joined, among them all of the University of California institutions, UT Austin, and UW Madison.
Jaschik writes that the UC campuses have not joined because of present concerns about the ability of community college transfers to use the process effectively. UC schools have strong and highly successful articulation agreements with the state’s community colleges.
UT Austin questions the fairness of the new process, at least in its initial form. “Associate director of admissions Michael Orr said UT did not apply to the coalition because of criticisms of the programs, including the coalition’s failure to consult with high school counselors,” according to Jameson Pitts, writing for the Daily Texan.
“The argument within the community … has been that there is a concern that students with means will be the ones that will be able to take advantage of that opportunity the most,” Orr said. He did not rule out the possibility of joining the Coalition if concerns about fairness can be resolved.
Several voices in the higher ed community have opposed the Coalition, saying that students are already over-focused on preparing for college admission and that the new approach will favor more privileged students.
Our question is this: If the new process is designed to help students who cannot afford college counselors and lack effective guidance in their schools, how will the students find out about the process in the first place and learn to use it to good effect?
Whatever the possible shortcomings may be, the CAAS has gained the membership so far of the 36 public universities listed below. It is important to note that only Florida, Maryland, and Washington have decided to use the CAAS process exclusively. The other schools listed below will, as of this date, use either the Common App or the CAAS process.
College of New Jersey
North Carolina State
William and Mary
The annual composite MBA ranking compiled by John A. Byrne at Poets & Quants combines rankings from the “five most influential rankings and weighs each of them by the soundness of their methodologies” in order to yield “a more credible list of the best MBA programs.”
We like Poets & Quants and Byrne’s rankings and try to write about them each year. The rankings he combines into the composite list are those from U.S. News, Forbes, Bloomberg, the Financial Times, and the Economist.
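As a rough illustration of how a composite of this kind can be built (a sketch of the general technique, not Byrne’s actual formula; the source weights and school ranks below are hypothetical), each rank is converted to a score, weighted by the credibility assigned to its source, and summed:

```python
# Hypothetical rank aggregation: lower rank -> higher score, weighted by
# an assumed "methodology soundness" factor for each source ranking.
SOURCE_WEIGHTS = {  # illustrative weights summing to 1.0
    "US News": 0.30, "Forbes": 0.25, "Bloomberg": 0.15,
    "Financial Times": 0.15, "Economist": 0.15,
}

# Each school's rank in each source ranking (made-up data).
ranks = {
    "School A": {"US News": 1, "Forbes": 3, "Bloomberg": 2,
                 "Financial Times": 4, "Economist": 1},
    "School B": {"US News": 2, "Forbes": 1, "Bloomberg": 5,
                 "Financial Times": 2, "Economist": 3},
}

def composite_score(school_ranks, n=50):
    """Rank 1 earns n points and rank n earns 1; weight and sum the points."""
    return sum(SOURCE_WEIGHTS[src] * (n + 1 - r)
               for src, r in school_ranks.items())

for school in sorted(ranks, key=lambda s: -composite_score(ranks[s])):
    print(f"{school}: {composite_score(ranks[school]):.2f}")
```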
Here are the public MBA programs listed in the top 50 for 2015, and their composite rank:
The annual Times Higher Education World University Rankings have had the strongest presence in the ranking “world” since 2004, but here’s one vote for the U.S. News Best Global Universities rankings being better even though they have been around only two years. Both are useful because they measure the prestige and research impact of hundreds of universities around the world at a time when there is much more international cooperation–and competition–among institutions.
It is rare for us to applaud the U.S. News rankings because there are many serious issues with the annual “Best Colleges” publication. It over-emphasizes the financial resources of colleges and their selectivity, to the detriment of most public universities.
But when it comes to world rankings, U.S. News drops the focus on financial metrics in favor of academic reputation and research metrics, including the use of regional reputation surveys that help to offset the Eurocentric bias of the Times Higher Ed rankings.
For example, the Times Higher Ed rankings list 42 European universities among the top 100 in the world, while U.S. News lists 31. The main reason is probably that the Times rankings do include financial metrics and do not factor in the additional regional reputation data.
Below is a table showing the U.S. public universities ranked among the top 100 in the world by U.S. News alongside the rankings of the same universities by Times Higher Ed. An additional column shows the average ranking of each school when both ranking systems are used. The average ranking of leading U.S. public universities by U.S. News is 44 out of 100; the average Times Higher Ed ranking of the same schools is 82.
Editor’s note: Last updated on September 15, 2019.
Parents and prospective students are often interested in the average size of course sections. Assessing class size is extremely difficult because, first, a “class” has to be defined. Why is that difficult?
Some main sections (often in intro science, economics, business, etc.) may be very large, well in excess of 100 students, but the breakout labs and discussion sections are much smaller. Should only the main section be counted, or should the labs and discussion sections also be included?
The best source for class size information and interesting data in general is the Common Data Set. Some universities publish their submissions, while others do not. The data in the CDS are used by U.S. News for many purposes, including the calculation of the percentage of classes with fewer than 20 students and the percentage of classes with more than 50 students. Using that information, we can also calculate the percentage of classes with enrollments between 20 and 50 students.
U.S. News does not include tutorials, thesis research, lab or discussion sections, and neither do we in our estimates of class sizes for honors programs. According to our counts, the average public university honors-only class section has 17.54 students. When we include honors credit classes that also have non-honors students, the overall average class size is 24.9 students. Both of these numbers are better than in 2016. This is an important consideration, given that the tables below show that public universities as a whole have significantly larger class sizes, and several have increased over 2016.
The next level of difficulty when using U.S. News data is to plug in an average number for the three class size groupings. For the group of fewer than 20 students, we are using an average of 17.5 students per section; for the group of classes with 20-50 students, we are using an average of 35 students; and for the group of classes with more than 50 students, we are using an average of 110 students.
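Given those per-bin averages, the overall estimate is simply a weighted average of the three bins. Here is a minimal sketch of the calculation; the percentages for the example school are hypothetical:

```python
# Estimate an overall average class size from CDS-derived percentages,
# using the assumed per-bin averages described above.
BIN_AVERAGES = {"under_20": 17.5, "20_to_50": 35.0, "over_50": 110.0}

def estimated_class_size(share_by_bin):
    """share_by_bin: fraction of class sections in each size bin (sums to 1.0)."""
    return sum(share_by_bin[b] * BIN_AVERAGES[b] for b in BIN_AVERAGES)

# Hypothetical school: 45% of sections under 20 students, 40% with 20-50,
# and 15% with more than 50.
example = {"under_20": 0.45, "20_to_50": 0.40, "over_50": 0.15}
print(f"Estimated overall class size: {estimated_class_size(example):.1f}")
# 0.45*17.5 + 0.40*35.0 + 0.15*110.0 = 38.4 students
```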
The table below shows the percentage of classes in each size category along with the increase or decrease in the overall class size average since 2016. Universities are in rank order according to lowest estimated overall class size in 2020.
Editor’s note: This post updated on October 1, 2016, after release of Times Higher Ed Rankings for 2016.
It is likely that Philip G. Altbach, a research professor and the founding director of the Center for International Higher Education at Boston College, has the sharpest eye of anyone in America when it comes to seeing how well U.S. universities compare with rapidly improving institutions throughout the world. What he sees is not good.
U.S. public universities are losing ground to foreign institutions, most notably in Europe, including Scandinavia.
(Below the following text are three tables. The first compares the Times Higher Ed world rankings of U.S. universities and those throughout the world for the years 2011 and 2016. The second shows the decline in rankings for U.S. public universities for the same years. The third and final table shows the rise in rankings for U.S. private universities.)
Altbach cites the work of colleague Jamil Salmi, who found that there are at least 36 “excellence initiatives” around the world “that have pumped billions of dollars into the top universities in these countries — with resulting improvements in quality, research productivity, and emerging improvements in the rankings of these universities. Even in cash-strapped Russia, the ‘5-100’ initiative is providing $70 million to each of 15 selected universities to help them improve and compete globally.” [Emphasis added.]
“At the same time, American higher education is significantly damaging its top universities through continuous budget cuts by state governments. One might call this an American ‘unExcellence initiative,’ as the world’s leading higher education system systematically damages its top research universities. Current developments are bad enough in a national context, but in a globalized world, policies in one country will inevitably have implications elsewhere. Thus, American disinvestment, coming at the same time as significant investment elsewhere, will magnify the decline of a great higher education system.”
One reason: all the bad publicity about cuts in funding, along with high-profile political grandstanding, has received far too much attention throughout the world academic community. For example, UT Austin endured years of highly publicized attacks by former Governor Perry during his second term, and UW Madison has been hurt by similar actions on the part of Governor Scott Walker.
Unless state legislatures move toward excellence and restore pre-Great Recession funding levels, American disinvestment, in Altbach’s words, will bring “a revolution in global higher education and create space for others at the top of the rankings. It would also be extraordinarily damaging for American higher education and for America’s competitiveness in the world.”
The average ranking for the 23 U.S. publics among the top 100 in the world in 2011 was 39th, while in 2016 it was 49th–and only 15 publics were among the top 100. Meanwhile, leading U.S. private universities have seen both their average reputation rankings and overall rankings rise since 2011.
To illustrate Professor Altbach’s point, we have generated the tables below.
This table compares the Times Higher Ed world rankings of U.S. universities and those throughout the world for the years 2011 and 2016.
Top 100 Overall
The following table shows the decline in rankings for U.S. public universities for the years 2011 and 2016. UC Davis is the only school to rise, while most dropped significantly.
Times Higher Ed Rankings
This last table shows the rise in rankings for leading U.S. private universities for the years 2011 and 2016.