In our previous post we described the information desert in which students, employers, and the university wander, and the challenges this poses in narrowing the skills gap. Is there any solution to this information drought? We propose a four-part plan.
Collecting and Publicly Reporting Placement Data for All Students
Our first proposal is to require universities to collect and publicly report rich data on student placements. Of course, universities typically do collect and report some placement data, but if you look at the details, you’ll realize that in most cases the statistics are based on a very small number of self-selected students. And they are aggregated in a way that prevents much understanding of the variation in outcomes.
Instead, universities should collect and report to state and federal data repositories the placement outcomes for every single student, with detailed information: what degree they completed (using standardized instructional program codes); whether they are heading to graduate or professional school, still seeking employment, or have a job offer; and, if they have an offer, what job they took (using standardized occupational codes), with what employer, and at what salary, along with information on debt incurred along the way. These placement data (reported both separately for each campus in a university system and aggregated across campuses) should be supplemented with data on non-completion (including debt incurred by non-completers).
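To make the proposal concrete, here is a minimal sketch of what a per-student placement record might look like, using the standardized CIP (degree) and SOC (occupation) codes mentioned above. The field names, campus identifier, and example values are hypothetical illustrations, not an official reporting schema.

```python
# Hypothetical per-student placement record. Field names and example
# values are illustrative only; CIP and SOC codes are the standardized
# program and occupation classifications referenced in the text.
from dataclasses import dataclass, asdict
from typing import Optional

@dataclass
class PlacementRecord:
    campus_id: str            # reported separately for each campus in a system
    cip_code: str             # Classification of Instructional Programs code
    status: str               # "grad_school" | "seeking" | "employed"
    soc_code: Optional[str]   # Standard Occupational Classification code, if employed
    employer: Optional[str]
    salary_usd: Optional[int]
    debt_usd: int             # debt incurred, reported for completers and non-completers

record = PlacementRecord(
    campus_id="MAIN",         # hypothetical campus identifier
    cip_code="52.0301",       # Accounting
    status="employed",
    soc_code="13-2011",       # Accountants and Auditors
    employer="Example Co.",
    salary_usd=62000,
    debt_usd=18000,
)
print(asdict(record))
```

A record like this, collected for every graduate and non-completer, is what would let the aggregated statistics be filtered and compared rather than reported as a single self-selected average.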
To put teeth in such a requirement, we would incentivize participation by both students and institutions. Completing this survey would be a graduation requirement for students: taxpayers subsidize the cost of education at both public and private institutions and deserve information on their return. University compliance could be tied to state funding or to institutions’ eligibility to participate in student loan programs, with university leadership held directly – and legally – responsible for accurate collection and reporting of these data.
We would require that these data be made available anywhere a prospective student or employer or higher education funding body might look. That glossy brochure highlighting the football team and the rec center might look a lot less inviting if it carried a warning label showing high loan defaults, low completion rates, or lousy starting salaries. Funding bodies should know what their investments in universities are generating in terms of degree completion, jobs and earnings, and value for the economy.
In our view, this would yield one of two outcomes. Either the data would support the view that universities and/or particular programs of study are adding meaningful value for students and for the firms who hire them, helping dispel a misleading narrative about the returns to higher education.
Or, these detailed data would reveal significant weaknesses – particular schools, particular programs – where those returns are well below expectations, allowing students, administrators, and funders to make better choices about what should change and where money should flow. Data of this sort could also help employers pinpoint hidden gems where they might more effectively recruit talent, or understand why their substandard wage offers are being rejected.
One subtlety here is that universities and programs differ markedly in the types of students they serve. A university whose mission is to expand educational access by making admissions offers to many first-gen students, low-income students, and/or those with marginal SAT scores is very likely to see worse (unconditional) outcomes on completion, employment, and starting salaries. This is the value of reporting data for every student, not just a self-selected sliver of the total: data tools can easily filter results by the attributes of the incoming student population. Funding formulas can be adjusted to reward the value-added of the university, that is, outcomes conditional on the characteristics of the students they admitted.
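The distinction between raw outcomes and value-added can be sketched with a toy simulation. The numbers below are entirely invented for illustration: one hypothetical university expands access to less-prepared students yet adds more value, so its raw salary gap looks negative while the gap conditional on incoming preparation is positive.

```python
# Toy illustration of "value-added": raw vs. conditional outcome comparisons.
# All parameters are invented; this is not based on any real university data.
import numpy as np

rng = np.random.default_rng(0)
n = 2000

# University B expands access, admitting less-prepared students on average.
uni_b = rng.integers(0, 2, n)               # 1 = attends University B
prep = rng.normal(0, 1, n) - 0.8 * uni_b    # incoming preparation score

# Assumed true model: salary (in $1,000s) depends on preparation AND the
# university. Here University B actually adds MORE value (+2k) despite
# serving less-prepared students.
salary = 50 + 5 * prep + 2 * uni_b + rng.normal(0, 3, n)

# Raw (unconditional) comparison: B looks worse.
raw_gap = salary[uni_b == 1].mean() - salary[uni_b == 0].mean()

# Conditional comparison: regress salary on preparation and a B indicator;
# the coefficient on the indicator is the value-added gap.
X = np.column_stack([np.ones(n), prep, uni_b])
beta, *_ = np.linalg.lstsq(X, salary, rcond=None)
conditional_gap = beta[2]

print(f"raw gap:         {raw_gap:+.1f}k")          # negative: B looks worse
print(f"value-added gap: {conditional_gap:+.1f}k")  # positive: B adds more value
```

This is exactly why funding formulas should condition on the characteristics of admitted students: the raw comparison would punish the access-expanding university even though it generates the larger return per student.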
To see one example of what we are talking about, visit the Department of Education’s College Scorecard, which offers tools that provide information on earnings and student debt broken down by degree program. The Postsecondary Value Commission provides another tool building on the Department of Education data, supplemented with additional sources such as the American Community Survey. It adds nuance related to family income and race/ethnicity, and offers much better data visualization options for making comparisons.
However, a challenge with these tools is that they are based entirely on data from students participating in the federal student loan program. For reasons we have discussed in an earlier post on student loans, this is an increasingly unrepresentative sample: only 25% of students take out student loans, and they are more likely to come from low-income families and/or attend more expensive universities. Still, these are great tools! Imagine what one could learn if we had systematic information like this on all students.
Capturing Co-curricular Participation in Transcripts
Our second proposal elevates the role of co-curricular activities. At most universities there is a clear distinction between “transcript-able” activities that are part of a required curriculum and everything else a student might do in the co-curriculum. And as the saying goes, “what gets measured gets managed.”
It is clear to us that many of the professional skills students need for career success are developed outside the formal 120 credit hours required for a degree and are never formally captured. Employers may see these activities listed on a resume, but the university itself has no systematic way of knowing how students spent their time outside the classroom, nor any way to understand the role these “extra” activities played in placement success. This makes it extremely hard to know whether these investments are sufficient, how many students participate, and how participation correlates with student success.
Similarly, prospective students may be provided with a long list of student clubs and activities as part of the college recruitment process. But they have no way of knowing whether the university has made significant investments in co-curricular activities that are curriculum adjacent, no information to help them identify which of these activities would best support their professional growth and hence generate the greatest ROI, and no formal way to demonstrate participation in them.
We will follow up in a future post discussing a few experiments (and associated challenges) with transcripting the co-curriculum.
Life-long Effects of College: Partnering with State Agencies
Our third proposal is that universities should partner with state agencies to link student-level transcript and related data to state administrative data from unemployment insurance or income records that provide a view into career progression and earnings throughout the professional life of a graduate. Too much of what we know about graduate success for a particular university is based on first placement data. To be sure, first placements matter, but we should know much more about whether we are setting our students up for lifelong success.
This would also help us to understand whether particular degree programs excel in first placements but not subsequent career progression, or vice versa. It may be the case, for example, that liberal arts degrees do not equip students with specific practical skills that are in demand for entry level positions. But that degree may provide capabilities that enable graduates to flourish in later career stages.
An example of administrative linking in action can be found in Texas, with the extraordinary datasets maintained by the University of Texas at Dallas Education Research Center. These begin with the universe of Texas high school students, capture their applications to (and acceptance or rejection by) 35 public universities in the state, and link them to workforce records to capture career outcomes – all with detailed demographic and school performance data. Researchers can use these data to understand, for example, the causal impact of getting into a better university, or of changes in admissions policies that favor or disfavor certain groups of students.
If we could empower faculty scholars and institutional researchers with access to data that linked students’ curricular and co-curricular choices to their career outcomes, we could go much further in understanding correlations (or even causal linkages) to career success. And we could more effectively experiment to understand how changes in curricula and/or student experiences yield different outcomes.
Should state agencies prove uncooperative, a university could perhaps partner with LinkedIn to study career evolution, or make a (much more) dedicated effort to systematically survey its alumni to learn more about their career progression and the strengths and shortcomings of their learning. It is regrettable that alumni relationships are so often viewed narrowly and through a transactional lens: inviting alumni to attend football games or basketball watch parties, or identifying their giving capacity and making a pitch for the annual giving campaign.
Our alumni and their stories are perhaps our best source of information and inspiration and we know far too little about what in their college experience contributed to their career success, or lack thereof.
Forward-looking Employment Forecasts and Employer Engagement
Our final proposal targets information that would be useful to university leadership in adjusting the mix of investment in, and graduates from, different departments, and then adapting curricular and co-curricular experiences to reflect the particular skill sets needed.
Suppose a university president or provost wanted to better match the supply of graduates to the needs of the local economy. How would they go about doing that? They would need information on employers’ hiring plans, not just for this year, but at least five years ahead. They would need information on the specific jobs employers want to fill, and on the types of majors they would seek to hire from.
Most surveys of employers highlight gaps in particular capabilities like communication or problem-solving, and that is useful information. But in our experience employers select for these capabilities within a narrow range of majors. And it is far more useful for a university to know that employers want accountants who can problem-solve and mechanical engineers who can communicate, if that is how employers plan to hire.
There is some information like this available from the Bureau of Labor Statistics employment forecasts, and those forecasts have recently been supplemented to describe needed skills in some depth. While these are useful (and all universities should at a minimum be studying this information), they have two weaknesses. One, they are national in scope, and the majority of universities draw on a local student population who will, on average, mostly fill jobs in the state or nearby region. Two, they are fundamentally statistical forecasting exercises operating at the level of the national economy. Presumably local employers know much more about their future growth and hiring plans than a group of economists and statisticians in DC.
Undoubtedly employers will push back against this suggestion and say: we don’t know what the future will bring and we can’t tell you who we will want to hire. Fair enough. But if employers won’t tell universities who they might be hiring in the next five years, how in the world are universities supposed to produce graduates who will fill these hiring needs?
As a separate and equally important issue, university administrators, faculty, and staff need much more detailed information on the specific skills students will need to be successful in their jobs. We see this happening in two ways.
First, employers can actively participate as advisors to curriculum committees, faculty planning their courses, and staff supporting co-curricular activity. This requires some degree of sensitivity on both sides of that exchange. Advice is not a vote, but advice consistently ignored will not be offered long.
Second, employers can help universities build the specific skills they want to hire by expanding internship and co-operative education programs. Employers can also work closely with university departments to embed experiential learning opportunities within the curriculum and engage deeply in co-curricular activities. In our experience, these programs can be difficult to start and expensive to scale, but they can transform student capabilities and give employers a test-drive of future employees.
The University Role: Reactive or Proactive?
No university wants to have its financial resources tied to regulatory compliance, and many of our suggestions have more than a whiff of coercion about them. That is in large part because value to students and to the university itself rests on comparison: which universities and which degree programs are performing better and adding value, and why exactly is that? Relying only on internal data that shows seemingly satisfactory placement and completion rates makes it far too easy to sleep soundly at night, unconcerned with student outcomes.
So comparison matters, and it is perhaps best achieved through some external pressure, either regulatory compliance or (shudder) rankings. Still, even absent such pressure, we believe a university that took these proposals seriously would create a dramatically improved information environment and a starting point for making changes that could set it apart from its peers.
Of course, there is a risk that improvement would run into the wall of misaligned incentives and structural inflexibility, topics we tackle next.
“Finding Equilibrium” is coauthored by Jay Akridge, Professor of Agricultural Economics, Trustee Chair in Teaching and Learning Excellence, and Provost Emeritus at Purdue University and David Hummels, Distinguished Professor of Economics and Dean Emeritus at the Daniels School of Business at Purdue.