Why Earnings Alone Cannot Define Higher Education Accountability


Why the accountability debate is more complicated than it looks

Higher education accountability is increasingly centered on earnings outcomes. The assumption is straightforward: students earn a credential, enter the workforce, and their salaries reflect institutional quality.

But Glenda Morgan argues the reality is far more complex.

Earnings are not produced by institutions alone. They are shaped by geography, labor markets, career pathways, industry structures, and personal choices. Treating salary as a direct institutional output ignores the broader systems that influence economic outcomes.

That distinction matters because accountability systems shape policy, funding, and which programs institutions choose to sustain.


Why earnings are not a clean institutional metric

A graduate’s salary reflects more than where they studied.

Regional differences play a major role. Urban and rural labor markets produce different wage outcomes, even for students with similar credentials. Cost of living also affects salary structures. The same graduate may earn dramatically different wages depending on location.

Career pathways matter too. Some professions have highly structured salary trajectories, while others develop more gradually over time.

Morgan’s argument is that earnings are a systems-level outcome, not a simple cause-and-effect institutional measure.


Why median earnings can distort accountability

Median earnings simplify complexity into a single number.

That can obscure important differences between programs and professions. High-variance programs may produce both very high and very low earners. Low-floor professions may provide critical public value despite lower salaries.
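To make the distortion concrete, here is a toy sketch (hypothetical numbers, not real earnings data) of two programs that report the exact same median but behave very differently:

```python
from statistics import median, pstdev

# Hypothetical annual earnings for graduates of two programs.
# These figures are illustrative only, not drawn from any dataset.
pipeline = [62_000, 64_000, 65_000, 66_000, 68_000]         # e.g. a nursing-style pipeline program
high_variance = [28_000, 41_000, 65_000, 120_000, 210_000]  # e.g. a business-style high-variance program

# Both programs report identical median earnings...
print(median(pipeline))        # 65000
print(median(high_variance))   # 65000

# ...but the spread around that median is wildly different.
print(round(pstdev(pipeline)))        # 2000
print(round(pstdev(high_variance)))   # ≈ 66500
```

A single median cannot distinguish these two cases; any spread measure (standard deviation, percentile bands) immediately does.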

Morgan also argues that earnings snapshots fail to account for long-term trajectories. Some fields produce immediate returns, while others develop more slowly over the course of a career.

Research shows that liberal arts graduates, for example, may initially earn less than engineering graduates but often catch up over the course of a career, and in some cases overtake them.


What accountability systems should measure instead

Morgan argues for a more nuanced accountability framework.

Completion rates should play a larger role, particularly given the scale of students with some college but no credential. Time to degree also matters because delays increase cost and debt burdens.

Geography, labor markets, and career variation should be incorporated into outcome measures. Accountability systems should recognize that different programs produce different types of value and different earning trajectories.

Most importantly, institutions should be evaluated using multiple measures rather than a single earnings metric.


Why this matters for public policy

The design of accountability systems influences institutional behavior.

If metrics are too narrow, institutions may reduce investment in socially valuable professions with lower earnings outcomes. That could worsen shortages in fields like teaching, counseling, and social work.

The challenge for policymakers is to build systems that value outcomes without oversimplifying how education, labor markets, and society actually interact.

 

Read Glenda Morgan’s article “Earnings Data Are Driving Policy—and Misleading It” for more insights.

Transcript

Wes (00:26.786)  Morgan, thank you for joining us today and welcome to the President’s Forum Podcast.

Glenda Morgan (00:47.604) Thanks and it’s a pleasure to be here.

Wes (00:50.488) Hey, your article argues that it isn’t just a measurement of earnings that’s the problem. It’s actually a causality problem. So it’s very detailed in laying that out for us, but earnings are being attributed to institutions when they’re actually produced by systems. Can you explain that to our listeners and tell us a little bit about why that distinction matters for how we design accountability in public policy for higher education?

Glenda Morgan (01:26.25) Sure, yeah, you know, in a lot of the accountability discourse that’s going on, earnings are often treated like a clean institutional output. You know, somebody goes to college or university, they graduate, they have earnings, and they’re seen as, you know, you’ve got cause and effect. But actually what happens is much more complex than that: somebody goes to university, they take one of a variety of different kinds of programs.

and then they graduate. But what they actually earn is a product of all different kinds of things. It is a product of where they graduate, are they going to be living in urban or rural kind of setting, but also what kind of a job they’re going into. Some jobs have very determined pathways, others are much more flexible.

And so you’ve got these multiple causality things going on and so what people are actually earning after they graduate is the result of multiple factors all acting together. So it’s not just cause and effect. It’s a highly complex kind of a system. So holding one aspect of that responsible for the outcome is just a crazy sort of setup.

you know, because what’s actually happening is you’ve got all kinds of things interacting to produce a highly variable.

Glenda Morgan (03:21.268) It makes sense to everybody, you know, where you live is going to determine what your costs of living are. And it also sort of determines what you’re paid. I mean, it’s so ingrained in us to understand that, but somehow it hasn’t made its way into the metrics yet. You know, it’s not just urban and rural. It’s also, I mean, there’s a regional aspect that I didn’t write about because my colleague Phil has written about that. But where you live determines a lot of

your costs but it also determines what you’re paid. I used to work for Gartner and, you know, it was a fully remote company, but they actually linked your salary to where you were living. There were high cost places and low cost places.

Wes (04:05.432) Yeah, that makes sense. Well, in this paper, you also described three types of programs that have very different earning structures. And the three programs that you lay out are pipeline programs, high-variance programs, and low-floor programs. First, can you just describe what each of those programs is for our listeners? And then…

I’d love to get into some of the details of measuring those and why one single median measurement doesn’t quite work.

Glenda Morgan (04:43.114) Sure, as we go on, I just want to be sure to call out Ithaka S+R, whose research my little article was based on. Ithaka S+R did some great research on South Carolina, but it’s broadly applicable. So much depends on the kind of program and then the pathway out of that program for graduates. And they identified three. So the first one are pipeline programs. This is where

You graduate from a program and your pathway is pretty determined. You’re something like nursing where, you know, there are a couple of different paths you can take, but it’s pretty set. And your salaries are in some ways determined by that pathway. And so they’re somewhat predictable. Another one is engineering, you know, how you progress and where you go. You you’ve got certifications and things like that that you do, but it’s certainly set.

And then you’ve got much more flexible kinds of programs. Sorry, high variance programs. You know, with a pipeline program, your career and what you’re going to do after you graduate are largely determined by the program that you’ve done.

Wes (05:58.563) high variance programs.

Glenda Morgan (06:15.136) With high variance programs, it’s less a profession than a set of opportunities. So something like business, and even computer science, I would argue, are high variance programs. So it’s not only that what you’re actually going to do is going to vary a lot, that you can go to lots of different kinds of places and it’s really up to you in terms of what you’re going to do and what you’re going to make of that, but also your salary, what you’re actually paid,

is going to vary a lot. So you’re going to have a huge variation in terms of earnings and pathways and occupations. It’s really not determined by the actual degree. It’s determined by what your interests are and how you progress in that. I, for example, have a PhD in political science, you know, and

I could have become a professor, or, as I chose, an industry analyst, and it’s the ultimate high variance kind of program. And then you’ve got low floor programs, and these are sort of, they’ve got elements of both of those, in that there’s a big variation in terms of what people do, but earnings are traditionally fairly low. So things like social work, counseling,

often the arts as well. So there’s a lot of variation in terms of what people do, but the floor tends to be pretty low as well in terms of what they make.

Wes (07:49.358) Could we lump in like teaching, mental health programs? Yeah, okay. So these are programs that we actually really do need.

Glenda Morgan (08:03.59) Absolutely, yes. You know, as a society, we rely on those kinds of things. But they have traditionally been paid less. In part, you know, there’s somebody who writes about librarians, for example, who talks about vocational awe, you know, where everybody really admires what they do, but they aren’t prepared to pay for it. And so you’ve got these low-floor kinds of things.

Wes (08:31.79) Okay, so when you take a median, when you just break that down and take one number out, how does that not yield the accountability that we’re actually looking for?

Glenda Morgan (08:47.914) So, you know, people often think about medians as being better than averages, and they are, but, you know, they aren’t accounting for the variation across that. Particularly, I think the most egregious example is the high variance programs, because a median is just telling you, you know, the middle between the bottom and the top. And it’s not really telling you in general how people are going to do there, and it’s certainly not capturing

the value of the input as well. There’s a logic breakdown there, because what people are earning is determined by the system, not by the actual input at the beginning. It’s just the beginning point that we’re putting a lot of emphasis on, and it’s not really a valid measure of anything.

Wes (09:44.674) Well, it just seems that those three different types of programs could create a little bit of a problem if you’re just evaluating that one number, particularly at the end of the day, when you’re looking at the social value of some of these low floor careers and the credentials that are required for that.

Glenda Morgan (10:10.014) Yeah.

Wes (10:14.146) We have, you can’t get rid of all of these credentials because they don’t provide you the economic return that some other careers might because you need them for society. How do you deal with that?

Glenda Morgan (10:28.82) Yeah, you know, that’s a slightly different thing than I argued in the piece, but I think, you know, we have to think about what we need as a society. I remember, as it happens, I’m South African originally. And there was this sort of amazing moment where I sort of understood things in a much deeper kind of way. I was just before I came to the US, it was the end of apartheid.

And as it happened, I went to the University of Cape Town, one of the best universities in the continent of Africa. And I remember hearing a conversation and it was a time of rapid change. There was this guy who was on the Board of Governors, the Board of Regents of the University of Cape Town. He was a businessman, very successful. He said,

My job is to understand the role of the university. And so, for example, in the College of Medicine, we have to provide doctors to the whole of the society. And, you know, as a businessman, I understand inputs and I understand outputs. And if we only get one kind of input, we’re only going to have one kind of output.

So we need multiple kinds of inputs in order to provide doctors for all the different parts, know, for rural, for plastic surgeons, for orthopedic surgeons, for all these different kinds of things. And so I think in terms of our accountability, we need to think of the same sort of thing, inputs and outputs, you know, we need social workers, we need teachers, we need these kinds of things. So we need to make sure that we produce them because we’re going to hurt if we don’t.

Wes (12:23.086) Right, right. Well, you know, when you’re talking about, we don’t just measure inputs, we do want to look to outcomes. I mean, that’s speaking President’s Forum language. We’ve been talking about that for a long, long time. But look, we can’t measure accountability just by the way that education is provided, whether that’s in person or online or…

Glenda Morgan (12:34.208) Yeah.

Wes (12:51.16) We can’t just look to the inputs, but inputs and outputs can both be important. Boiling it down to one specific earning number is more complicated than it seems, but let’s get to the, if we’re redesigning this system, tell us what you would build if it were a ground up build on accountability. Well, how would you do it?

Glenda Morgan (13:16.734) We’ve got 43 million Americans with some college, no credential. And I think…

Wes (13:49.538) Ha

Glenda Morgan (14:14.472) you know, you can have the best earning credential in the business, but if you’re not actually getting the credential, it’s not going to help you. So I think, you know, including more metrics there, including completion, time to degree, those kinds of things, you know, is sort of is part of that. And really developing a more nuanced measure of that. So including regionality.

including urban versus rural, those kinds of things. So that’s sort of how I would start to design it more from the ground up. But I would put heavily an emphasis on if somebody actually is going to college that they’re coming out of it with a degree or a credential of some sort.

Wes (15:01.878) I love that thinking, and that does get forgotten when it’s just one metric. If you’re just looking at earnings, you’re not seeing all of the non-completers and the cost to the system that that is.

Glenda Morgan (15:15.455) Yeah, no, absolutely. And then they’re stuck with the debt often. And it’s just a sort of nightmare. So I want that to be part of that sort of calculation, but also, you know, thinking also in terms of where people are going and how they’re doing. The other thing we haven’t talked about is also time, which I wrote about in the article, is that, you know, a snapshot in time is not going to give you

a great measure, because some of these professions, for example, the pipeline ones, are relatively high earning right out the gate, whereas other ones are slow brewing. So there are studies that show that right out the gate engineering graduates earn much more than, say, liberal arts graduates. But in the long term, the liberal arts actually catch up and overtake.

I think just looking at snapshots in time is problematic. You need a longer term measure.

Wes (16:26.22) I’m glad you brought that up because that’s a huge variance and it’s really important to capture. It’s hard to capture. It’s very difficult. I don’t know if there’s a clean way that you can do that, but your point is some of these take a much longer time than five years out from your credential. They brew over a career.

Glenda Morgan (16:44.768) Yeah, absolutely. Yeah, no, absolutely. And, you know, going back to the median issue, I’ve just been rereading Todd Rose’s The End of Average. And a lot of people have some issues with the book, but I sort of really like it. It’s that, you know, when you’ve got things that don’t correlate, you’ve got multiple measures that don’t correlate, just using an average really gives you a bad result. You know, he uses the example of

Wes (16:55.086) Mm-hmm.

Glenda Morgan (17:10.096) airplane cockpits. Originally they were designed for the average person but turns out nobody’s actually average. Because you’ve got these multiple measures, know, and so we need to sort of bring multiple measures into things instead of using that median of just the earnings.

Wes (17:28.398) Right, well this has been a very interesting conversation Morgan. We will direct our listeners to your piece on this so they can read all the details and we would love to continue this conversation as things move forward with accountability during this administration and future administrations. We really appreciate your thinking about this.

Glenda Morgan (17:37.269) Bye.

Glenda Morgan (17:51.134) my absolute pleasure and lovelies to speak with you. Okay, thanks.

Wes (17:54.616) Thanks, Morgan.

Accreditation, Innovation and Modernization (AIM) Week One Negotiation Update


The Department of Education’s proposed accreditation reforms represent a fundamental shift in federal oversight, moving from a compliance-driven model toward one focused on student outcomes, institutional value, and market competition.

At a high level, the administration asserts that the proposal would reduce regulatory burden, expand accreditor competition, and strengthen federal guardrails for legal compliance and consumer protection, while emphasizing program-level outcomes and return on investment.

Week One Negotiation Update

The first week of negotiated rulemaking underscored both the scope and complexity of the proposed changes.

Discussions were anchored in a 151-page draft regulatory text released by the Department earlier this month. By the end of the week, negotiators had worked through approximately two-thirds of the draft (97 pages), leaving substantial ground still to cover in the second week of negotiations scheduled for May 18–22.

While the Department introduced some revisions in response to committee feedback, those changes were described as primarily structural rather than substantive, suggesting that little progress has been made on consensus. Key areas, including outcomes-based accountability, legal compliance expectations, and accreditor flexibility, continue to generate significant discussion and, in some cases, concern among negotiators.

 

Department Goals with the NPRM

  • Affordability and Efficiency:

    Expectations that institutions demonstrate cost-effectiveness and support credit mobility and lower-cost delivery models. Accreditors review the institution’s cost-benefit analysis of support services and facility operations/expansions.

  • Transparency and Consumer Protection:

    Enhanced disclosure requirements and stronger accountability for institutional integrity and Title IV fraud procedures.

  • Focus on Outcomes and Value:

    Increased emphasis on program-level outcomes, signaling a stronger federal focus on return on investment.

  • Less Process, More Flexibility:

    Streamlined accreditation requirements and fewer procedural constraints, with greater institutional flexibility to select or change accreditors.

  • More Competition in Accreditation:

    Expanded pathways for new accreditors and reduced barriers to switching, creating a more competitive and less geographically defined accreditation landscape.

  • Heightened Legal and Compliance Expectations:

    Integration of federal legal and constitutional compliance into accreditation standards, including nondiscrimination and First Amendment considerations.

Next Steps:

  • Second week of negotiations scheduled for May 18–22

Grad PLUS: What the Latest Change Means for Graduate Students


What changed with Grad PLUS loans

The Department of Education recently updated its guidance on how Grad PLUS loans are treated under new lifetime borrowing limits.

Previously, institutions were told that Grad PLUS loans would not count toward the $257,500 lifetime cap. That guidance has now changed. For students no longer eligible under legacy provisions, prior Grad PLUS borrowing will now count toward that cap.
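As a rough sketch of what that arithmetic looks like in practice (the $257,500 cap comes from the guidance discussed here; the student figures and the `remaining_eligibility` helper are hypothetical illustrations):

```python
# Hypothetical sketch of remaining-eligibility arithmetic under the revised
# guidance, where prior Grad PLUS borrowing counts toward the lifetime cap.
# The cap figure is from the article; the student amounts are made up.

LIFETIME_CAP = 257_500  # lifetime borrowing cap, in dollars


def remaining_eligibility(prior_federal_loans: int, prior_grad_plus: int) -> int:
    """Dollars of borrowing left before the lifetime cap is reached.

    Under the earlier guidance only prior_federal_loans counted; under the
    revised guidance prior_grad_plus counts too (for students no longer
    covered by legacy provisions). Note the cap never resets, even after
    repayment.
    """
    used = prior_federal_loans + prior_grad_plus
    return max(LIFETIME_CAP - used, 0)


# A returning doctoral student with $120,000 of prior Grad PLUS borrowing
# and $80,000 of other federal loans:
print(remaining_eligibility(80_000, 120_000))  # 57500 under the revised guidance
print(remaining_eligibility(80_000, 0))        # 177500 under the earlier reading
```

The gap between those two numbers is why returning students in high-cost programs may hit the cap well before completing a credential.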

This shift introduces immediate implications for how graduate education is financed.


Who is most affected

The impact will be concentrated among students in higher-cost graduate programs.

This includes students in medical, dental, counseling, and doctoral programs. Part-time doctoral students may face particular challenges, as longer timelines can increase cumulative borrowing.

Students who previously used Grad PLUS loans and are returning to school are also at risk. They may reach the cap before completing their program.


What institutions need to do now

Institutions will need to adjust quickly.

Financial aid teams should revisit awarding and packaging strategies, especially for students enrolling in upcoming terms. Advising will also need to shift. Students must understand how prior borrowing affects their remaining eligibility.

Clear communication will be critical. Institutions should prepare to re-advise students and update financial plans in real time as guidance evolves.


What to watch next

The timeline is tight.

Final regulations are expected at least 30 days before the effective date of July 1, 2026. That leaves a narrow window for institutions to prepare and for students to adjust their plans.

There is also ongoing uncertainty. Recent guidance has already shifted once, and additional changes remain possible.

Transcript

Wes (07:33.942) Amy, welcome to the show and let’s get right to it. The Department of Education just reversed course. Graduate Plus loans will now count toward the new lifetime borrowing limit under the one big, beautiful bill. What does that actually mean for students that are trying to finance their graduate education? Maybe we should start with, give our listeners background on what happened and then let’s answer that question.

Amy Glynn (07:56.885) Yeah, so when we’re talking about those new lifetime borrowing caps, the department had previously told schools that Grad PLUS loans would not count towards the $257,500 lifetime cap established under OB-3. In a webinar last week, this guidance was shifted, and schools have been told that once a student is no longer eligible for the legacy provisions, previously borrowed Grad PLUS loans will count against that lifetime cap.

And this is a really significant change for institutions that are serving those non-traditional students who are trying to fund their education. They are at severe risk of losing access to all Title IV aid for graduate programs because we know those funding sources are limited to the student loan portfolio.

Wes (08:43.118) Interesting. Okay, so give us the students who are most affected by this.

Amy Glynn (08:47.796) Yeah, students at greatest risk of having to stop out because of this change are going to be those in some of our more costly programs. If you think about things that are in the medical field, dental, our counseling programs, doctoral students, all doctoral students, but especially ones who are enrolled part time. Obviously, it’s our students who previously borrowed Graphic Plus looking to return to school.

And those are individuals who are going to face potential challenges by hitting that lifetime cap. Because the cap never resets. Even if a student has repaid or paid down their student debt, the cap is established and once you hit it, you lose access to those funding programs.

Wes (09:32.342) Interesting, interesting. Okay, so we know that students have to be aware of this now. And that also, I mean, you speak from an administration perspective. You’ve been there on the ground working in financial aid. Tell us what institutions need to know about the change.

Amy Glynn (09:49.044) Yeah, it’s one thing to understand the change. They probably need to shift their advising and their packaging, their financial aid offers that have been established, especially for students that are in these affected populations who are taking courses this summer, but especially students who are starting in the fall in a more traditional sense.

And so they need to understand the provisions. They need to establish a communication protocol to advise or re-advise students. And they need to be prepared to be incredibly nimble because this guidance could change again. You we thought we had a pretty clear understanding based on good faith negotiations that these loans were not going to count against a student’s cap. And this reversal

has significantly changed things and created a level of uncertainty in the knowledge that we did have around negotiations that are on an incredibly aggressive and tight timeframe already.

Wes (10:54.242) So speaking of the timeframe, let’s conclude with some details on the timeframe. What does it look like moving forward?

Amy Glynn (11:01.374) Yeah, so final regulations are supposed to be published a minimum of 30 days before the effective date. Effective date is July 1 of 26. So the time frame is pretty fast and pretty furious and is coming at us real quick.

Wes (11:18.624) Amy, thanks for your time. It was very clear and very useful.

Amy Glynn (11:22.443) Thanks Wes.

What Presidents Need to Know About Workforce Pell and AHEAD Rulemaking


What is changing with Workforce Pell

Workforce Pell is not just a new funding stream. It represents a shift in how institutions design and deliver programs.

The focus is on working adults, military-connected learners, and students who are re-skilling. These students are mobile, outcome-focused, and balancing multiple responsibilities. Institutions that succeed will treat Workforce Pell as an expansion of what they already do well, not as a separate initiative.

What presidents should prioritize now

Building Workforce Pell programs at scale requires three core capabilities.

First, program development must move faster. Institutions need governance models that allow for rapid iteration as workforce needs change. Second, strong employer partnerships are essential. High-quality programs depend on direct industry input to remain relevant. Third, institutions must have the infrastructure to track completion and job placement outcomes in real time.

Programs must also be designed with stackability in mind, giving students clear pathways to continue learning without losing momentum.

How accountability is shifting

Accountability frameworks are moving toward earnings and outcomes, but the design details will determine whether they work.

If metrics focus too narrowly on short-term earnings, institutions may be discouraged from supporting students who continue into additional credentials. Stackable pathways, which are central to Workforce Pell, do not always produce immediate income gains.

A student-centered accountability system must account for how working learners actually progress, including re-enrollment and continued education.

The risk of getting accountability wrong

There is a real risk of overcorrecting.

If accountability frameworks become too complex, institutions may pull back from offering Workforce Pell programs altogether. That would limit access for the very students these policies are intended to serve.

At the same time, weak accountability undermines trust in the system. The challenge is to strike a balance that protects students while preserving flexibility and innovation.

What to watch beyond Workforce Pell

The broader AHEAD rulemaking signals a long-term shift toward outcomes-based policy.

Institutions should pay close attention to how cohorts are defined, how long outcomes are measured, and whether continued education is properly accounted for. These details will shape how programs are evaluated in practice.

There is also growing concern about implementation complexity. Even well-designed policies can slow innovation if institutions lack the data infrastructure to execute them effectively.

What this means for institutional strategy

This moment is less about the direction of policy and more about execution.

Institutions that align program design, employer partnerships, and data systems will be best positioned to succeed. Those that cannot adapt quickly may struggle to participate at scale.

The opportunity is significant, but so is the need to get the design right. 

Transcript

Wesley Smith (02:24.214) Today I’m joined by President’s Forum Policy Director, Cam Mortensen, and Policy Fellow, Amy Glynn. We’re here to break down what presidents need to know about the AHEAD rulemaking. Cam and Amy, thanks for joining.

Cameron Mortensen (02:44.092) Thanks Wes, happy to be here.

Amy Glynn (02:46.029) Thanks so much for having me.

Wesley Smith (02:48.206) Hey, so let’s start with the first question right out of the box. What should presidents prioritize to build workforce Pell programs that work at scale, especially across states and for mobile and working learners? Amy, what do you think?

Amy Glynn (03:03.553) Yeah, so when I think about Workforce Pell, I don’t think about it as a new program. I think about it as a shift in how we operate and expand what we’re already doing really well. Like you said, the students it’s designed for are students who are reskilling, working adults, military connected, parenting students, right? They’re mobile, they’re balancing a lot, and they’re really outcome focused. So the questions for presidents, I really think there are three key ones. Can your programs…

Can your program development model support the speed and agility that you need? Can your governance process accommodate iteration at the speed and need for the program relevance? And do you have the infrastructure to track and meet the completion and placement metrics? So this means really having flexibility in how our programs are built and revised, along with strong employer partnerships. Some of those highest quality programs rely deeply on that industry partnership and insight.

And we don’t want our policy to unintentionally limit that. It also means, like you said, consistency and engagement across states because students aren’t confined to one place. Universities that are used to leveraging NC-SARA to minimize state authorization burden are really going to need to ensure that they have a framework in place if developing those programs through distance education to meet those state unique needs.

and understand that their programs may not be authorized in all states because labor market and workforce demands are different. So like the most important thing is it’s about design with the learner in mind, stackable programs, clear pathways and strong support, which is exactly what our institutions are really great at already.

Wesley Smith (04:47.05) Right, right. like your emphasis though. I mean, this will be a little bit of something to navigate when it comes to scaled institutions that are operating in 50 states.

Amy Glynn (05:00.105) Absolutely.

Wesley Smith (05:01.56) Cam, what are your thoughts on presidents? What should they be prioritizing as they build towards Workforce Pell eligible programs?

Cameron Mortensen (05:10.32) Yeah, to continue on the cross-state functionality piece, as the law’s written, programs eligible for Workforce Pell are going to have to be approved by the state’s governor.

So it’s going to be really important for states and institutions to create a system where programs, especially those delivered online by institutions across the country, can be reviewed and approved so that students learning across the country are able to access these funds for programs that are relevant in more than just one state. So we’re going to need to all work together, governors, states, and institutions, in order to find the way to most efficiently do that.

Wesley Smith (05:54.402) Yeah, right. I mean, hopefully we can piggyback on SARA, or some other system, that gets us essentially consensus from the states on what programs should meet these requirements and what programs don’t. I guess the tension there is

that it’s really designed for local workforce as well. I don’t know if there is an easy resolution to this other than saying collaboration is really, really important as we move forward. Any thoughts on that, Amy?

Amy Glynn (06:31.062) That’s a big one to tackle. I agree with you, absolutely, collaboration is key. We are already seeing states roll out proposed legislation and language that would impose accountability measures and requirements different from the federal government’s. And so really trying to figure out how to find a solution that is elegant

and well positioned to protect both students and institutions will be really important in seeing success in workforce Pell implementation.

Wesley Smith (07:10.518) Yeah, lots of work left to figure these things out, especially on 50-state operations. Let’s talk about it a little bit, though. You mentioned states are working on some accountability, and it’s tough to say exactly what the accountability framework should be. But tell us: we’re at the Presidents Forum, so we care about student-centric accountability frameworks. What does that look like? How can we do that without blocking innovation?

Amy Glynn (07:39.629) Yeah, I think the key here for the accountability conversation is about balance. We have to get the balance right. We absolutely want to protect students and ensure quality, but we also have to make sure we’re not building something so complex that institutions pull back from serving those students. And so one thing that’s really important is recognizing

how working learners actually move through education, and getting our voice into the conversation about establishing these metrics. With these programs, the goal is actually to increase access and the stackability of credentials so that students can re-enroll quickly.

Their earnings won’t always show up in a straight line. So schools should not be penalized for having a student move more quickly into a subsequent, related program. If we don’t design these metrics carefully, looking at who’s included and when we measure, we risk missing the real value of the programs, especially for the non-traditional working adult. And so when you think about this,

fairness really matters, right? Institutions serving students online or across states should not be at a disadvantage relative to traditional brick-and-mortar institutions. And right now, there really is a risk of that. So at its best, accountability should create clarity and trust but still leave room for institutions to respond to those workforce needs. And we need to work with our federal and state partners in that triad to make sure that happens.

Wesley Smith (09:12.216) Well, I mean, to your point, the whole reason Workforce Pell was considered a necessity is that Congress thought, hey, we have some pathways to opportunity that aren’t necessarily four-year, bachelor’s, or associate degrees. Whatever those look like right now, they’re not that. They’re short-term credentials that could work. And the idea here is to make these accessible to as many people as we can.

The risk we run with accountability frameworks that are too robust is: do you actually discourage people from moving in that direction, and programs from qualifying? But at the same time, you want to make sure that if you are putting federal dollars behind these programs, you get it right and you do have workforce opportunities that result from them. So Cam, tell us your thoughts on accountability. How do you strike that right balance?

Cameron Mortensen (10:11.43) Yeah, I think Amy’s point on credential stacking is really important, because if we set up an accountability system that is always tracking the income of students right as they come out of programs, we’re going to disincentivize institutions that are encouraging students to continue their education after these workforce training programs. And I don’t think that’s the incentive we want to give. As Amy said, we really want to be student-focused. I also think it’s really important that we have an accountability system that focuses on outputs rather than delivery method or other sorts of inputs. But there are really important nuances, such as the credential stacking, that we need to make sure we get right.

I think one other point is that we don’t want to constantly be moving the goalposts. Let’s take our time to set up a system that we can all feel good about and that brings true accountability, which we support. Let’s get it right and not rush into something that will lead us to having to change it soon after.

Wesley Smith (11:37.548) Right. You did say something, Cam, that every time I hear it, I want to say amen while people are mid-sentence. And that is that we should be working with outcomes, not inputs, when it comes to accountability. We shouldn’t be telling institutions how to get the right outcomes; we should be judging them based on their outcomes. And that’s just one of the fundamental principles that the Presidents Forum has been

advocating for decades now.

Let me move us to the final question, and that is: I want to look beyond Workforce Pell at other aspects of the AHEAD rulemaking. What should institutions watch for in any other regard? Let’s put Workforce Pell aside and look at other issues that AHEAD addressed.

Wesley Smith (12:39.382) Amy, why don’t you lead on this?

Amy Glynn (12:41.182) Yeah, so as we look at what’s coming through the AHEAD and the AEM negotiations, I think presidents should really be focused on how accountability is taking shape in practice. We know the direction: there’s going to be a stronger emphasis on earnings and outcomes, return on investment. We’ve seen it in AHEAD, obviously, in the accountability framework. We’ve seen it in the accreditation negotiations.

But the details really matter: how cohorts are defined, how long data is aggregated, whether we’re accounting for students who continue their education. Those are the things that are really going to determine whether these metrics actually reflect reality. The other thing I’d watch for is complexity. There’s a risk of building something that’s technically sound but really hard to implement given our outdated data structures.

And that can slow down innovation, especially for workforce programs that need to move quickly. And then just making sure that all of this aligns with how students actually progress today, especially in our stackable pathways. For me, it’s a moment less about where policy is going and more about whether we get the design right.

Wesley Smith (13:58.776) Right. I love your point about complexity. Being right theoretically is a lot different than being right practically and driving outcomes for students. And complexity has the ability to suffocate innovation if we spend too much time on the complexity of measuring outcomes. So we’ve got to get that right. We’ve got to have solid outcomes, but we have to do it in a way that doesn’t suffocate innovation.

Cam, anything to add to that? What else should we be looking for in the AHEAD rulemaking?

Cameron Mortensen (14:36.848) Yeah, I mean, just as far as the schedule goes, there’s obviously a lot happening right now in higher education regulation and executive rulemaking. The public comment period for the Workforce Pell portion of AHEAD just closed; that’s what we’ve been talking about today. We are also going to get an NPRM, a notice of proposed rulemaking, on the accountability measures that were considered in the AHEAD rulemaking.

By the time this is published, we’ll have finished the first week of that accreditation, innovation, and modernization, or AEM, negotiation. And then the next session of that will be taking place May 18th through the 22nd. So there’s a lot going on right now, a lot that we as the Presidents Forum are following and will be involved in. So definitely stay tuned, and we’re always happy to hear others’ thoughts and input as well.

Wesley Smith (15:36.142) Thanks, Cam. Thanks, Amy. We appreciate you joining us and giving us some insight on Workforce Pell specifically and what’s next in the AHEAD rulemaking.

Cameron Mortensen (15:46.374) Thanks.

Amy Glynn (15:47.319) Thanks for having me.

 

Rethinking Tuition Assistance


In contemporary debates on American defense manpower and national competitiveness, military voluntary education occupies an odd intellectual position. It is normatively celebrated as a mechanism for self-improvement and transition, and it is formally justified in statute as a tool for recruiting, retention, and readiness, yet the evaluative apparatus around it remains remarkably thin. Most analyses track enrollment, course completion, and degree attainment, occasionally extending to near-term reenlistment effects, but they seldom grapple with the deeper question of how these programs structure the flow of human capital into the nation’s critical infrastructure workforce. Against this backdrop, the Unicorn manuscript advances a more ambitious claim: that voluntary education can be reconceived as a national institute of workforce formation if we are willing to treat individual desire as a measurable construct and link it systematically to both educational capacity and industrial demand.

The conceptual pivot in Unicorn is to move from a supply-side view of education (what programs exist, how many people use them) to a demand-side view anchored in person–environment fit theory. Holland’s vocational choice framework and subsequent person–environment fit literature posit that individuals seek environments where they can express their interests and values, and that congruence between vocational personality and work setting predicts satisfaction, performance, and reduced turnover intentions. Unicorn operationalizes this insight by specifying a “Desire universe” in which each servicemember is represented by a structured object comprising many dimensions. Rather than treating desire as a vague preference, the manuscript treats it as a high-dimensional data object that can be measured, aggregated, and analyzed at scale.
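The manuscript does not specify the Desire object’s schema in this excerpt, but the idea of desire as a structured, high-dimensional data object can be sketched. A minimal, hypothetical Python illustration, with all dimension names and scores invented here:

```python
from dataclasses import dataclass, field

@dataclass
class DesireObject:
    """Hypothetical structured representation of one servicemember's desire profile."""
    member_id: str
    interests: dict = field(default_factory=dict)  # e.g. Holland-style interest scores
    values: dict = field(default_factory=dict)
    skills: dict = field(default_factory=dict)

    def as_vector(self, keys):
        """Flatten selected dimensions into a fixed-order feature vector,
        so profiles can be aggregated and compared at scale."""
        merged = {**self.interests, **self.values, **self.skills}
        return [merged.get(k, 0.0) for k in keys]

d = DesireObject(
    member_id="SM-001",
    interests={"realistic": 0.8, "investigative": 0.6},
    skills={"networking": 0.7},
)
vec = d.as_vector(["realistic", "investigative", "networking", "artistic"])
```

The fixed key order is what makes aggregation across cohorts possible; any real implementation would also need provenance, validity, and consent metadata beyond this sketch.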

Once desire is formalized in this way, new analytic possibilities emerge. At the micro level, desire objects can be matched to families of occupations across the critical infrastructure landscape, from advanced and additive manufacturing to cyber defense, energy systems, logistics, and data-intensive roles. At the meso level, aggregating these objects reveals latent patterns: clusters of servicemembers whose interest–value–skill profiles align with particular sectors, regional concentrations of underdeveloped potential, or systematic mismatches between what individuals want and what existing education pathways make visible. At the macro level, these desire distributions can be compared against labor market projections in the defense industrial base, semiconductor ecosystems, and broader national security-relevant industries, where workforce shortages in the millions are now regularly cited in both government strategies and industry analyses.
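The micro-level matching described above can be made concrete with a simple congruence score. The sketch below uses cosine similarity as a stand-in for a person–environment fit measure; the occupation-family profiles and all numbers are invented for illustration:

```python
import math

def cosine(a, b):
    """Congruence between a desire vector and an occupation-family profile."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

# Invented occupation-family profiles over the same feature order
# as the desire vector (sector names from the text, numbers illustrative).
occupations = {
    "cyber defense": [0.2, 0.9, 0.8, 0.1],
    "advanced manufacturing": [0.9, 0.4, 0.3, 0.2],
}

desire = [0.8, 0.6, 0.7, 0.0]
best = max(occupations, key=lambda o: cosine(desire, occupations[o]))
```

Which similarity measure, feature weighting, and match threshold to use are exactly the modeling choices the manuscript leaves open for empirical study.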

However, Desire is only one of three universes in the Unicorn architecture. The second, Capacity, reframes the higher education enterprise as a programmable layer of human capital production. Here, the manuscript argues for a comprehensive mapping of programs, particularly at regional research universities (R2s), community colleges, and technical institutes, tagged not only by discipline and credential level but by their relevance to critical infrastructure workforce categories. This capacity map makes it possible to ask analytically precise questions: Where do existing offerings already intersect with observed desire clusters for cyber or energy roles? Where are there pockets of strong desire but insufficient capacity, suggesting a need for new cohorts, microcredentials, or industry-embedded pathways? And where is capacity abundant but loosely coupled to both desire and demonstrable workforce demand, raising questions of allocative efficiency?
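The capacity questions posed here amount to gap queries over the two aggregated universes. A toy sketch, assuming aggregated desire counts and program-seat counts per sector (all figures invented):

```python
# Hypothetical aggregates: how many members' desire profiles cluster in a
# sector vs. how many program seats the capacity map shows for it.
desire_by_sector = {"cyber defense": 1200, "energy systems": 800, "logistics": 300}
seats_by_sector = {"cyber defense": 400, "energy systems": 900, "logistics": 350}

gaps = {
    sector: count - seats_by_sector.get(sector, 0)
    for sector, count in desire_by_sector.items()
}

# Sectors with strong desire but insufficient capacity, largest gap first.
undersupplied = [s for s, g in sorted(gaps.items(), key=lambda kv: -kv[1]) if g > 0]
```

A negative gap flags the converse case the paragraph raises: capacity that is abundant but loosely coupled to observed desire.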

The third universe, Connection, is where Unicorn’s analytic exposition pushes most directly into institutional design. Building on the first two universes, Connection is described as an interface between individuals, educational institutions, and the critical infrastructure workforce at large. It encompasses the matching algorithms and governance structures that translate desire and capacity into concrete trajectories: from initial counseling and course selection through completion, credential stacking, and placement into roles recognized across agencies and industries alike in the critical infrastructure taxonomies. In theoretical terms, this layer operationalizes person–environment fit not just within an abstract “job” but within a national system of essential work, where the resilience of energy grids, defense supply chains, cyber systems, and logistics networks is now treated in strategic documents as a core security concern.

What makes Unicorn particularly provocative for scholars of military sociology, higher education, and labor economics is its insistence that voluntary education outcomes be evaluated against this connection frame rather than against proximal educational metrics alone. Existing empirical work on tuition assistance and related programs offers mixed evidence on retention, in part because participation has unfolded in an environment where neither desires nor workforce linkages were systematically specified. By contrast, Unicorn sketches a counterfactual regime in which education benefits are intentionally used to steer desire-rich populations into undersupplied critical infrastructure roles, and in which success is measured by changes in reenlistment among targeted skill communities, promotion and readiness indicators, and post-service earnings in strategically salient sectors. This is not simply a call for better metrics but for a different dependent variable: from “did the member complete a degree?” to “did the system convert desire plus capacity into durable contributions to the critical infrastructure workforce?”

The manuscript thus offers, in condensed form, a three-universe theory of how desire, educational capacity, and economic structure might be jointly modeled in the context of U.S. defense and national security. For academic readers, it opens several lines of inquiry. One could test the stability and predictive validity of the proposed Desire construct across cohorts and services. One could examine how different capacity configurations, say, varying densities of R2 institutions with strong engineering programs, alter the efficiency with which desire is translated into critical infrastructure employment. And one could interrogate the normative and distributive implications of using a military education apparatus as a national workforce instrument, particularly in light of broader debates about reindustrialization, regional inequality, and the civilian–military boundary.

Unicorn does not claim to resolve these questions within its own covers. Instead, it offers a deliberately constructed architecture, Desire, Capacity, Connection, as a researchable object, and as an invitation. If desire is indeed measurable, and if voluntary education can be reconceived as a critical infrastructure institute rather than a peripheral benefit, then scholars and practitioners alike face a different set of design problems than those that have dominated the TA literature (such as it is) to date. The full text elaborates this architecture, populates it with empirical estimates and sectoral projections, and sketches legislative and administrative pathways for implementation. The argument, in short, is that there is a unicorn here, not in the sense of an impossible creature, but in the sense of a rare institutional configuration hiding in plain sight, waiting to be specified, measured, and built.

Expanding Opportunity for Those Who Serve


Why it matters

Military learners balance service, family, and education under extraordinary conditions.

Higher education policy must reflect that reality.

The challenge

Military learners face:

  • Frequent relocations
  • Unpredictable schedules
  • Training that isn’t always recognized for credit
  • Complex transfer and enrollment systems

Bottom line

Military learners remind us why student-first innovation matters.

Our job is to build systems that match their commitment with opportunity.