Credit hours vs. competency debate continues for classes

Published: Sunday, Nov. 11, 2012, 9:55 p.m. MST

Julie Laub, a nontraditional master's graduate, conducts a chemistry experiment with Spencer Gilbert at Davis High School.

Laura Seitz, Deseret News

SALT LAKE CITY — Julie Laub worked as a professional chemist before choosing to become a stay-at-home mother. Five children later, and after many years of volunteering in her kids' classrooms, she decided to become a high school chemistry teacher. But Laub lacked a teaching certificate and the master's degree that would equip her to teach advanced chemistry classes. She wanted the training, but didn't need the frills of campus life.

Laub's dilemma highlights a hot topic in the higher education world: traditional college credit hours represent seat time, and that doesn't always equate with learning. The failed connection between seat time and student academic progress is the subject of a new report, "Cracking the Credit Hour," by Amy Laitinen of the New America Foundation, a nonprofit think tank based in Washington, D.C.

The report shows that most American college graduates don't know as much as they should, and suggests that changes in the credit hour system could spur improvement.

Laub, who lives in Kaysville, found her solution at Western Governors University, a nonprofit online school created by the governors of 19 U.S. states in 1997.

WGU's degree programs are based on competency assessments instead of credit hours. The school created an individualized program that allowed Laub — a smart, motivated chemist/mom — to get herself qualified and into the classroom on a speedy schedule that fit her situation.

A weakened system

When Andrew Carnegie came up with the credit hour as part of a 19th-century teacher compensation system, he intended credit hours to measure time, not learning. But because they are easy to understand and measure, credit hours became the basic currency of higher education. Graduation requirements are based on credit hours, as are financial aid amounts.

The credit hour's clout is weakening, however, with a dawning realization that four years spent in college do not guarantee success on the job. A 2006 study by the National Center for Education Statistics showed that 69 percent of college graduates could not perform basic tasks such as comparing opposing newspaper editorials or comparing the cost per ounce of different foods.

Laitinen said a college degree should signify a transformational process that results in a learned person, but that doesn't always happen. Richard Arum and Josipa Roksa's 2010 book "Academically Adrift: Limited Learning on College Campuses" highlighted the problem.

When 2,300 students at four-year colleges took the Collegiate Learning Assessment to measure higher-level skills taught at college, 45 percent didn't demonstrate significant improvement in learning during the first two years of college and 36 percent did not demonstrate significant learning over four years of college, the book said.

A survey by the Association of American Colleges and Universities showed that one-third of employers said "no" when asked if college graduates are well-prepared to succeed in entry-level positions at their companies. And when employers drill down to grades on transcripts while screening job applicants, it's hard to tell what graduates actually know.

Grade inflation can be blamed for that. In 2008, 43 percent of all college grades were A's. In 1961, the number stood at 15 percent, according to the Teachers College Record.

Credit hours cater to the increasingly rare student who lives and studies on campus, making the system outdated, Laitinen's report said. Meanwhile, millions of workers who possess valuable knowledge and skills gained on the job have no way to get credit for what they know. Another frustration with credit hours is the difficulty students often face when trying to transfer them from one institution to another.

Standardize or not?
