Evaluating Solutions for
Improving Program Assessment

A framework-driven approach to identifying, evaluating, and implementing assessment systems in higher education, culminating in the innovative ALTMAP model.

Nathan C. Anderson, Ph.D. | Alaric Williams, Ed.D. | Daniel Ringrose, Ph.D.

Five Categories of Assessment for Planning and Evaluation

Comprehensive planning and evaluation aligns with standard logic models and common assessment questions to guide programmatic decisions.

Needs Assessment

Establishing whether a problem exists, describing it, and identifying gaps to provide a rationale for a program.

Example Questions

  • What are the needs of the population?
  • What are the nature and magnitude of the problem to be addressed?

Theory Assessment

Revealing which potential solutions and program activities are appropriate responses to the identified needs.

Example Questions

  • What services should be provided?
  • What are the best delivery options for the services?
  • How should the program be organized?

Process Assessment

Monitoring outputs to ensure the selected solution is being implemented as intended in daily operations.

Example Questions

  • Are administrative and service objectives being met?
  • Are the intended services being delivered to the intended persons?

Outcome Assessment

Evaluating impacts to determine whether an implemented solution is having its desired beneficial effects.

Example Questions

  • Do the services have beneficial effects on the recipients?
  • Is the problem or situation the services are intended to address made better?

Efficiency Assessment

Comparing outcomes with costs to reveal whether resources for implementation are being utilized responsibly.

Example Questions

  • Are resources being used efficiently?
  • Is the cost reasonable in relation to the magnitude of the benefits?
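
The efficiency questions above reduce to comparing benefits against costs. A minimal sketch of one common form, a benefit-cost ratio, is shown below; the function name and dollar figures are illustrative assumptions, not values from the study.

```python
# Illustrative benefit-cost ratio for an efficiency assessment.
# A ratio greater than 1 suggests benefits exceed costs.

def benefit_cost_ratio(total_benefits: float, total_costs: float) -> float:
    """Return the ratio of total benefits to total costs."""
    if total_costs <= 0:
        raise ValueError("total_costs must be positive")
    return total_benefits / total_costs

# Hypothetical figures: $45k in estimated benefits against $30k in costs.
print(benefit_cost_ratio(45000, 30000))  # 1.5
```

In practice, quantifying "benefits" for an educational program is the hard part; the ratio itself is trivial once outcomes have been monetized or otherwise scaled.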

Institutional Case Study

Applying the Framework in Practice

A Midwestern university utilized the framework to assess its Yearly Program Assessment (YPA) process, generating actionable evidence to select a new digital system.

Needs Assessment

Identified gaps in the current assessment process through interviews, observations, content analysis, and a faculty survey.

Faculty Survey Priorities

Percent indicating the area is a "Moderate, High, or Essential Priority" for improvement.

  • Template Consistency: 68%
  • Efficiency: 80%
  • Relevance: 84%
  • Comprehensiveness: 67%
  • Institutional Memory: 68%
  • Assessment Guidance: 76%
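
Percentages like those above can be produced with a simple tally over survey responses. The sketch below assumes an illustrative response scale and made-up sample data; it is not the study's instrument or raw data.

```python
# Hypothetical tally: percent of respondents rating an improvement area
# as a "Moderate, High, or Essential Priority". Scale labels are assumed.

PRIORITY_LEVELS = {"Moderate", "High", "Essential"}

def percent_priority(responses):
    """Percent of responses falling in the Moderate/High/Essential band."""
    if not responses:
        return 0.0
    hits = sum(1 for r in responses if r in PRIORITY_LEVELS)
    return round(100 * hits / len(responses))

# Illustrative responses for a single survey area.
sample = ["High", "Low", "Essential", "Moderate", "None"]
print(percent_priority(sample))  # 3 of 5 responses in the band -> 60
```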

Theory Assessment

Examined whether potential vendor-based solutions could appropriately address the YPA requirements and resolve the demonstrated needs.

Vendor Congruency

Three online systems were reviewed against the required fields of the existing MS Word report (Goals, Outcomes, Methods, Targets, Results).

  • Consistency: Static interfaces prevent year-to-year template drift and confusion.
  • Efficiency: Automated workflows and a single system replace manual email trails.
  • Relevance: Standardized data entry allows for institution-wide aggregate reporting and trend analysis.

Efficiency Assessment

Compared the three theoretically appropriate systems against available resources, financial parameters, and support infrastructure.

Resource Comparison

  • Annual Cost: Sys 1 $0; Sys 2 ~$30k; Sys 3 ~$30k
  • Existing State Network Contract: Sys 1
  • Vendor Support: Sys 1

The Decision: Selecting System 1

By weighing this evidence comprehensively, the institution confidently selected System 1. The Needs Assessment revealed critical areas for improvement in the current process. The Theory Assessment verified that System 1 was structurally appropriate to resolve those specific reporting and workflow gaps. Finally, the Efficiency Assessment showed that System 1 was far more feasible than the alternatives: it cost $0 annually under an existing state contract and included essential vendor technical assistance.

A New Synthesis

The ALTMAP Model

A comprehensive lens for planning, implementing, and evaluating purposeful initiatives.

A: Assessment

Refers to the 5 categories of assessment (Needs, Theory, Process, Outcome, Efficiency). Serves as the foundational evidence generator for the entire initiative.

L: Logic Model

A one-page snapshot detailing the program's Context/Need, Expected Goals, Target Population, Resources, Activities, Outputs, and Short/Mid/Long-term Outcomes.

T: Theory of Change

A high-level diagram illustrating the causal relationships. It visually maps how resources and activities are expected to trigger a chain reaction leading to the desired long-term outcomes.

M: Metrics Tracker

An operational tool (like a spreadsheet) used to compile target and actual values for specific output and outcome indicators aligned with the Logic Model.
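
The authors describe the tracker as a spreadsheet; the sketch below is only a minimal code analogue of that idea, pairing target and actual values for each indicator. Indicator names, field layout, and figures are assumptions for illustration.

```python
# Minimal sketch of a Metrics Tracker: target vs. actual values for
# output/outcome indicators aligned with a Logic Model. All names and
# numbers below are hypothetical.

from dataclasses import dataclass

@dataclass
class Indicator:
    name: str        # e.g., "YPA reports submitted on time" (illustrative)
    category: str    # "output" or "outcome", per the Logic Model
    target: float
    actual: float

    @property
    def met(self) -> bool:
        """True when the actual value reaches or exceeds the target."""
        return self.actual >= self.target

# Illustrative tracker entries.
tracker = [
    Indicator("YPA reports submitted on time", "output", target=40, actual=37),
    Indicator("Programs using results for improvement", "outcome", target=30, actual=31),
]

for ind in tracker:
    status = "met" if ind.met else "not met"
    print(f"{ind.name}: {ind.actual}/{ind.target} ({status})")
```

A spreadsheet with one row per indicator and columns for target, actual, and status accomplishes the same thing; the point is the pairing of expected and observed values, not the tool.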

A: A+ Inquiry

An 8-stage cyclical framework for disciplined inquiry, centered around 'Awareness', used to operationalize specific metrics.

P: Project Schedule

Details the sequential scope of activities across time. Tracks tasks, status, dates, and responsible parties to ensure optimal implementation.