Most deans and provosts accept that assessment is an inevitable and time-consuming part of their work. Expectations for assessment from accrediting agencies are high. In fact, it might sometimes seem as if you’re asked for so many reports by institutional and program accreditors that you don’t have time to make sure the results are used to improve student experiences.

Four members of Dean & Provost’s advisory board participated in a conference call to share their best strategies for making assessment not only manageable but also useful.

“We are definitely better as a result of assessment and the continuous improvement philosophy,” said Jill Murray, executive vice president and chief academic officer at Lackawanna College in Pennsylvania.

Process raises challenges

There’s no question that assessment is time-consuming. And in divisions with numerous program-specific accreditations, it can require constant effort.


For example, the University of Florida’s College of Fine Arts has regional, statewide and program accreditations. Most of the requested reports ask for a lot of the same information but in slightly different formats. That makes it impossible to cut and paste, said Dean Lucinda Lavelli.

Officials have hardly finished one process when they need to jump into the next one, which doesn’t leave time to put improvements into place, Lavelli said.

There are “so many pieces and moving parts” of accreditation, Murray said. But the real challenge is “not just moving and doing it but what you do with it,” she added.

Assessment is an unfunded mandate, said Herman Berliner, provost and senior vice president for academic affairs at Hofstra University. But putting the money and time into doing it well so that your institution gets a clean accreditation report is a lot easier than correcting deficiencies, he said.

Organization is key

Effective assessment relies on having a good process in place. At Hofstra, an associate provost for assessment and accreditation oversees a faculty-driven process, Berliner said. Because there are vast differences among disciplines, faculty can choose how they want to perform assessment, but participating is not optional. Hofstra's record of Middle States accreditation visits with no follow-up required is a persuasive argument for assessment's importance.


At UF, two institutionwide assessment offices produce several reports every day, Lavelli said. Within her unit, the associate deans, program directors and faculty all have responsibility for assessment activities.

Connecting assessment with other campus efforts and time lines ensures that it occurs on a continuous basis. For example, strategic plans can help structure institutional assessment. Lackawanna's plan includes facilities, and Middle States looks carefully at how those are assessed, Murray said.

Officials at Lackawanna review progress toward their three-year plan on a quarterly and yearly basis. Administrators use it to evaluate space needs, classroom technology and athletics. Officials can review assessment results from academic and nonacademic areas to determine the best ways to balance spending among the areas, Murray said.

At Lackawanna, departmental plans link with and support the institutional plan. “Assessment is embedded into each departmental plan, and the implementation of each departmental objective or strategy is measured,” Murray said.

Since the plans were implemented in 2011, 68 percent of the strategies and objectives identified in departmental plans have been achieved, 4 percent have not yet been achieved because of budget constraints, and 28 percent have not yet been achieved for reasons unrelated to the budget but are in progress, Murray said.


In June, the entire community was invited to a two-day summit to review departmental plans and discuss progress to date.

Hofstra's five-year plan helps officials set priorities. During the recession, they could not do everything in the plan, but regular assessment gave them a framework for deciding which items mattered most.

At UF, the plan also helps officials set priorities for aging facilities. Because a significant amount of maintenance has been deferred, Occupational Safety and Health Administration regulations sometimes determine what work gets done, Lavelli said.

Close the loop for continual improvement

Assessment for its own sake is a waste of time; the real goal is to use what you learn to make improvements. For example, assessing the general education curriculum at Hofstra convinced faculty members that it did not include enough focus on oral communication skills, Berliner said. They made changes to gen ed, and each major added a course with a significant oral communication requirement.

The Psychology Department also administered a standardized test to majors to identify areas students knew well and those they did not, then revised the curriculum to address the deficiencies.


Assessment results are also useful for advocating for resources such as new faculty lines, Lavelli said.

At the Texas Tech University School of Law, officials administer surveys such as the Law School Survey of Student Engagement to benchmark students' experiences against those of peers at other law schools, said Dean Darby Dickerson.

Programs to address problems are offered in ways that appeal to law students. For example, a program about drinking might focus on how the issue affects students' ability to represent their future clients, and it could also cover how an attorney's conduct could put his or her license at risk.

Good data is essential

Ensuring data integrity in the accreditation process is a challenge, Murray said. Lackawanna is a small college, so officials have the luxury of mining data in different ways to compare the results for accuracy.

Multiple individuals also work with the data, so its validity is checked on a continual basis.

At Hofstra, the institutional research vice president checks data integrity, Berliner said. The two people who have held that position, an accounting professor and a math professor, both understood data well and prepared templates so that faculty members could enter data in a consistent format.

Among law schools, institutions altering the data they report to groups such as U.S. News & World Report has been a major problem and has resulted in bad press for several prominent schools, Dickerson said.

Law schools also must report career and salary data for graduates. At Texas Tech, officials start with forms completed by students and review the results carefully to make sure they are accurate before releasing a report.

Want to think about context?

To think about how the trend toward increasing assessment ties into other developments, Lucinda Lavelli, dean of the University of Florida's College of Fine Arts, recommends Seeing Like a State: How Certain Schemes to Improve the Human Condition Have Failed by James C. Scott.

The Bottom Line …

To ensure effective assessment, make sure your institution implements these strategies:

  • Engage stakeholders. Faculty will be more enthusiastic if they are driving the process. And widespread involvement makes it possible to complete the large amount of work required.
  • Ensure data integrity. The results you get when you analyze data are only as good as the data you start with.
  • Organize your process. Assessment efforts need to be completed on a time line.
  • Use results for improvement. There’s no point to assessment if the findings aren’t put to work.