Assess & Progress: Increasing Value and Eliminating Cost for Assessment in Student Affairs
Assessment, Evaluation, and Research
July 14, 2025
In 2019, Student Affairs Professionals (SAPros) at James Madison University (JMU) partnered with the Center for Assessment and Research Studies (CARS) to conduct a needs assessment of student affairs practitioners (Pope, 2020). The goal was to determine how much professionals in the Division of Student Affairs valued assessment and evidence-informed practice and whether they had the skills to engage in this expected work. SAPros were asked questions that would frame future development, such as:
- How much time do SAPros spend consuming empirical research to build effective programming?
- Do SAPros at JMU value the creation of evidence-informed programming (EIP)?
- Do SAPros value the assessment of program effectiveness?
- Are SAPros confident that they possess the necessary knowledge and skills for EIP and assessment?
- Do SAPros at JMU engage in EIP and assessment?
Results of this needs assessment showed that while most SAPros valued assessment of program effectiveness, recognizing its importance for advancing the profession, many did not regularly engage in assessment/EIP (Pope, 2020). More concerning, SAPros reported that they were least likely to use current research to inform the development and facilitation of programming and services, a crucial aspect of the “planning” phases of the assessment cycle (see Figure 1).
Figure 1
The Assessment Cycle (Wild, LaFrance, & Stewart, 2025)

Barriers to EIP/assessment included a lack of time to read literature or implement EIP, as well as a lack of expectation from leadership that this work would occur. These findings mirrored existing literature indicating that SAPros are not always trained to conduct assessment (Wawrzynski et al., 2015) and that there are significant gaps in student affairs graduate education and professional development related to assessment (Rehr et al., 2024; Strine-Patterson et al., 2024). Although SAPros valued assessment, their motivation to conduct this work was missing, aligning with what Culp and Dungy (2012) described as a ‘culture of good intentions.’ In response, we aimed to shift the Division toward a ‘culture of evidence.’
We framed the missing motivation using Barron and Hulleman’s (2015) Expectancy-Value-Cost (EVC) model. This model suggests that optimal motivation for a task (e.g., assessment, EIP) occurs when an individual 1) believes they can do the task [Expectancy], 2) wants to do the task [Value], and 3) is free of barriers preventing them from investing time, energy, and resources into the task [Cost]. While professionals and leadership valued EIP and assessment, this Value was undermined by low Expectancy and an overwhelming perception of Cost: professionals lacked time to learn about EIP or apply it in practice, and their supervisors did not expect them to. When SAPros lack Expectancy and face significant Cost barriers, motivation to engage in assessment decreases. It became clear that flexible resources were needed to change our Division’s assessment culture. We decided to teach professionals about EIP, work directly with leadership to raise expectations, and carve out intentional time for assessment work.
In 2022, SAPros at JMU, in collaboration with CARS, began developing a series of professional development workshops titled “Assess and Progress: Building a Culture of Improvement in Student Affairs.” These workshops, piloted in fall 2023, were held on the first Friday of each month (August through April) and lasted approximately three hours. The first 1.5 hours were dedicated to educational instruction, while the second 1.5 hours served as protected work time for SAPros to focus on unit-specific assessment projects. The series as a whole encouraged staff to develop and submit completed assessment reports at the end of the academic year.
The workshops were intentionally designed to address the components of the EVC model and foster a culture of evidence. We focused first on Expectancy. The initial 1.5 hours of each workshop featured lectures from assessment consultants, designed to build the foundational knowledge SAPros felt they were missing. Each workshop aligned with a step in the assessment cycle (Figure 1), providing an overview of the material and incorporating equity considerations specific to that step. Assessment consultants were also available during the second 1.5 hours of independent work time to offer one-on-one guidance on individual EIP projects. The series was intentionally sequenced to build competency in structured, manageable segments, with the culminating goal of producing a completed assessment report.
We explicitly addressed Value by aligning the training with professional standards (e.g., ACPA-NASPA, CAS). For example, at the start of each workshop, we highlighted the CAS ‘must’ statements that session would cover. To enhance continuity, each session was tied back to the previous one. Additionally, by incorporating equity considerations, we provided practical, equity-focused strategies for professionals to implement, enabling them to engage in something they valued but previously were not sure how to approach. We also subtly reinforced Value through the saying-is-believing strategy (Higgins & Rholes, 1978, as cited in Walton & Wilson, 2018), in which people come to believe the messages they themselves communicate.
Lastly, we aimed to eliminate Cost by working directly with SA leadership to explicitly block and protect these three hours on SAPros’ calendars, providing structured, uninterrupted time for assessment work. Assessment literature highlights time as a major barrier to engaging in assessment (Duncan & Holmes, 2015; Schuh et al., 2016); our series carved out three protected hours each month for assessment endeavors. Senior leadership was encouraged to assign staff to attend the series, and all staff were invited through multiple communication channels (e.g., emails, calendar holds, Teams posts, and verbal announcements at SA meetings). The series was planned for the entire year in advance, so participants could block their time early and avoid scheduling conflicts. Dates and times were prominently posted (e.g., Teams, Division-wide calendars), ensuring SAPros knew when, where, and how they would be spending their time. Finally, the 1.5 hours of work time in each session allowed staff to receive assistance from assessment experts regardless of their stage in the assessment process, helping them use the time effectively to complete assessment tasks.
Beyond addressing Expectancy, Value, and Cost within each session, we also tackled these components globally when advertising the year-long training. We began by critically examining the messages Division leadership communicated to SAPros. Using the multilevel assessment process framework (Strine-Patterson, 2022), SA Assessment Professionals actively engaged senior leadership (Associate Vice Presidents [AVPs] and the Vice President) in discussions about the importance of this professional development series, aiming to increase the value leadership placed on assessment work. Leadership buy-in was essential to instill the value of assessment across the Division. Part of this strategy involved encouraging AVPs to designate attendees from their areas with the goal of assessing a program that year. This not only communicated the value of assessment from leadership but also created an expectation that assessment would take place in each area, thereby increasing the perceived value of assessment among SAPros. In addition to these appointed attendees, the sessions were open to all staff within the Division.
The “Assess & Progress” series has now been facilitated for two consecutive years. Based on participant feedback and emerging research in the field, including recent findings from Strine-Patterson et al. (2024) and Rehr et al. (2024), we made targeted adjustments to the series between its first and second iterations. These refinements aimed to better align the series with evolving professional development needs and to support more meaningful engagement with assessment beyond measurement alone.
In alignment with recent scholarship (e.g., Rehr et al., 2024; Strine-Patterson et al., 2024), the second facilitation of “Assess & Progress” paid particular attention to staff engagement in evidence-based program planning and the use of assessment results. As highlighted in the literature (e.g., Finney & Horst, 2019; Hoffman, 2015), and more recently reinforced by Rehr et al. (2024) and Strine-Patterson et al. (2024), many SAPros receive inadequate training in assessment, and much of the current training, whether in graduate programs or professional development, centers on measurement, with insufficient attention to the planning and use-of-results phases of the assessment cycle (see Figure 1). Strine-Patterson et al. (2024) argue for a more realistic model of assessment expectations in student affairs, in which professionals focus on theory-informed program planning, implementation fidelity, and the use of results while partnering with assessment specialists for methodological and measurement expertise. This model was reinforced during the second iteration of the series.
With the second iteration of “Assess & Progress” now complete, we have begun to reflect on what’s next for the series. One future step we have identified is a second needs assessment, which would evaluate the impact of the professional development series, determine whether the needs of SAPros have changed since the initial needs assessment in 2019, and explore remaining barriers to assessment engagement. The results may also help us reflect on whether the culture of evidence in the Division of Student Affairs at JMU has shifted meaningfully over time. In alignment with recent literature (e.g., Rehr et al., 2024; Strine-Patterson et al., 2024), this needs assessment will place particular emphasis on evaluating staff engagement in program planning and the use of assessment results.
Click here to access resources from our 2025 NASPA Presentation!
Click here to access the Student Affairs Assessment Improvement Rubric and associated resources directly!
References
Barron, K. E., & Hulleman, C. S. (2015). Expectancy-value-cost model of motivation. In J. D. Wright (Ed.), International encyclopedia of the social & behavioral sciences (2nd ed., Vol. 8, pp. 503-509). Elsevier.
Culp, M. M., & Dungy, G. J. (Eds.). (2012). Building a culture of evidence in student affairs: A guide for leaders and practitioners. NASPA-Student Affairs Administrators in Higher Education.
Duncan, A. G., & Holmes, R. H. (2015). Tenet three: Lay the foundation for a sustainable assessment culture. In R. P. Bingham, D. A. Bureau, & A. G. Duncan (Eds.), Leading assessment for student success: Ten tenets that change culture and practice in student affairs (pp. 41–50). Stylus.
Finney, S. J., & Horst, S. J. (2019). The status of assessment, evaluation, and research in student affairs. In V. L. Wise & Z. Davenport (Eds.), Student affairs assessment, evaluation, and research: A guidebook for graduate students and new professionals. Charles C Thomas.
Higgins, E. T., & Rholes, W. S. (1978). “Saying is believing”: Effects of message modification on memory and liking for the person described. Journal of Experimental Social Psychology, 14, 363-378. https://doi.org/10.1016/0022-1031(78)90032-X
Hoffman, J. (2015). Perceptions of assessment competency among new student affairs professionals. Research & Practice in Assessment, 10(2), 46-62. https://www.rpajournal.com/perceptions-of-assessment-competency-among-new-student-affairs-professionals/
Pope, A. M. (2020). Evidence-informed programming in student affairs: A mixed methods study examining behaviors, perceptions, and barriers related to the use of theory and research in program development [Doctoral dissertation, James Madison University].
Rehr, T. I., Holliday-Millard, P., Gill, T., Jankowski, N., Boren, S., Levy, J. D., & Lovette, S. (2024). Assessment in higher education and student affairs graduate education: Professionalization and its implications. Journal of Student Affairs Inquiry, Improvement, and Impact, 7(1), 82-100. https://doi.org/10.18060/28006
Schuh, J. H., Biddix, J. P., Dean, L. A., & Kinzie, J. (2016). Assessment in student affairs (2nd ed.). Jossey-Bass.
Strine-Patterson, H. J. (2022). Assessment is a leadership process: The multilevel assessment process. New Directions for Student Services, 2022(178-179), 61-76. https://doi.org/10.1002/ss.20429
Strine-Patterson, H. J., Tullier, S., LaFrance, S., & Lovette, S. (2024). The call for student affairs assessment professionals and units – and other strategies for improving assessment in student affairs. Journal of Student Affairs Inquiry, Improvement, and Impact, 7(1), 121-158. https://doi.org/10.18060/28001
Walton, G. M., & Wilson, T. D. (2018). Wise interventions: Psychological remedies for social and personal problems. Psychological Review, 125(5), 617-655. https://doi.org/10.1037/rev0000115
Wawrzynski, M. R., Brock, A., & Sweeney, A. (2015). Assessment in student affairs practice. In M. J. Amey & L. M. Reesor (Eds.), Beginning your journey: A guide for new professionals in student affairs (pp. 121-142). National Association of Student Personnel Administrators.
Wild, A., LaFrance, S., & Stewart, J. (2025, March). Assess and progress: A strategy for building time and space for assessment in student affairs [Conference presentation]. NASPA Annual Conference, New Orleans, LA.