John Paul Henry, ASSOCIATED PRESS
Chelsey Gillum glances back at the crowd filling the Carson Center Monday, Dec. 15, 2014, during the West Kentucky Community and Technical College December 2014 Commencement Ceremony in Paducah. The 2014 nursing pinning ceremony preceded the commencement, with 42 nursing students receiving nursing pins representing the completion of the college's Associate Degree Nursing (ADN) program.

Not long ago, parents and students evaluating colleges turned to U.S. News rankings, the only game in town. No longer. Rating colleges is now, if not a booming business, at least a surging sideline.

The most dramatic recent entrant may be the White House. After President Obama announced plans last year to rank colleges and hold them accountable for graduation rates and debt defaults, the White House went online with a sophisticated, user-friendly database. The interface allows students and parents to see how particular schools do on graduation rates, on-time graduation and average debt loads.

Users of the White House college ratings website are told, however, that the Department of Education is still working to obtain data on the employment records of students who took out federal student loans.

That data is not yet available, the White House website says. But that may soon change. A bipartisan consensus has formed to make available that data link, the holy grail of higher education investment returns, says Anthony Carnevale, director of the Georgetown University Center on Education and the Workforce.

Sen. Lamar Alexander, R-Tennessee, who heads the Senate Education Committee, is on board. And the bipartisan “know before you go” legislation sponsored by Sens. Marco Rubio, R-Florida, and Ron Wyden, D-Oregon, would also break the impasse, allowing states to more easily connect employment and earnings data to education data.

"When the Congress starts working again," Carnevale said, "it's going to happen."

Many think that opening the data bottleneck cannot happen too soon. Researchers and consumers are routinely running into the limits of data as they try to weigh the economic returns of attending a given school, studying a major, or entering a career.

Most frustrating, researchers say, is that the data they need is actually available, but current federal law prevents it from being put in play. Unemployment insurance and Social Security employment data could be combined with college transcript data on majors and degrees completed. This link, experts argue, could be made without using any individual identifiers or violating privacy.

Three prominent new takes on the college value data hunt were released in the last couple of weeks. All three circle around that elusive gold standard, trying to approximate it in different ways.

The elusive link

The Aspen Institute, a Washington, D.C.-based think tank, released a report in April titled "From College to Jobs," which surveys several researchers on the cutting edge of this data hunt.

The Aspen report focuses on how to measure the labor market returns of higher education. While conceding that higher education cannot be purely reduced to economics, the authors of the Aspen report argue that governments and students do view college as an investment and economic returns must be properly understood.

"This is not original research on our part," said Josh Wyner, vice president and executive director of the Aspen Institute's College Excellence Program. "It's an attempt to synthesize the perspectives of eight different policy analysts based on the data that they've produced."

Some of that data comes from unemployment and census records along with other sources, Wyner said.

Like most researchers in this space, Wyner points to the problem of trying to link individual student transcript data to wage information. He notes that most states can get this data for about 67 percent of their students, those who stay in the state. But once students cross state lines, they become difficult to track. State unemployment insurance data also miss those who are self-employed or in the military.

Adjusting curriculum

In an era of tight budgets and declining investment in higher education, Wyner argued, policymakers controlling those budgets badly need more comprehensive data to better understand returns on investment.

"By unpacking these data," Wyner said, "colleges and universities can get a sense for where they are providing the greatest value to students and where they are not."

This does not, Wyner argues, mean that schools should abandon their early childhood education programs and pump out more petroleum engineers. "Although wages are low, there is a strong labor market for teachers," he said.

But Wyner believes even fine arts programs could benefit from more rigorous analysis of their labor market results. "If they could actually track the success of their graduates, they would likely find that students who have strong technology skills or are capable as teachers or who can run a small business do better," Wyner said. "That could help them align their curriculum to make sure their students have not only artistic skills but also market-based skills."

Majors matter

Another new report from Georgetown's Center on Education and the Workforce finds the predictable result that college degrees matter, with a lifetime earnings bonus of more than $1 million beyond a high school diploma. But it also finds that the choice of major matters even more. The highest-paying college major, petroleum engineering, pays $3.4 million more over a lifetime than the lowest-paying major, early childhood education.

"We use census data, which is the best data we have at the moment," Carnevale said. "There are other sources on the value of college degrees, but they are pretty inferior."

The downside of census data is that while it allows researchers to pinpoint earnings by major, it does not allow them to see differences between majors at different schools. Some schools may be greatly under- or over-performing for their students, but the census data won't show that.

Adding value

One ambitious project that tries to get at differences among schools, rather than majors, was also released in recent weeks by the Washington, D.C.-based Brookings Institution.

Brookings' "Beyond College Rankings" uses a sophisticated model that predicts how students would perform in the economy based on the type of school and the background of the students. It then compares those predictions to actual results for the students. The model separates two-year and four-year schools.

The "value added" number then, in theory, reveals whether the average student might expect to do better there than attending a parallel school with similar demographics and similar course offerings.

The report offers "a starting point for looking at a college’s strengths and weaknesses in preparing for a career," said the report's main author, Jonathan Rothwell, a research fellow at Brookings' Metropolitan Policy Program.

By the same token, Brookings was not able to separate the performance of students pursuing different majors within a given school. So a school that vastly over-performs in one field that produces a lot of graduates could be masking serious failings in other majors at the same school.

"This is most useful if you don't know what you are majoring in before you go," Rothwell said. "If you do know what you are majoring in, there is still some useful information here, but it's harder."

Rothwell tackles the question from a different angle than Carnevale. Rothwell can tell you which schools perform better or worse than they should. Carnevale can tell you which majors are more or less lucrative. But neither is able to cross that bridge and unite the two.

Also, like most research in this space, Rothwell had to rely on approximations for some key data. To estimate earnings, he relied on Payscale.com, a website that aggregates self-reported salaries. To figure out where a school's graduates end up, he used LinkedIn.

Not everyone is comfortable with using this data this way. Derrick Anderson, who teaches program evaluation and research methods at Arizona State University, thinks Brookings is doing innovative and important work to parse how different schools perform, but that the data being used here is highly questionable.

"If this were in a journal and I were reviewing it, I would applaud the innovation," Anderson said. But he expressed serious reservations about the quality of the data and, as a result, concerns about Brookings putting an interactive tool online for consumers — who will not understand those shortcomings — to use.

For his part, Rothwell is very aware of the limits of the data. He wishes he had more on outcomes, as Carnevale and others want, but he would also like to see "more fine-grained data on student preparation," which would make comparisons of expected to actual results more accurate.