Ohio University

Administrative & Student Support Unit Review (ASSUR)

Why is ASSUR reporting important?

The purpose of the non-academic program review is to assure that Ohio University's non-academic support and administrative services (athletic, administrative, and co-curricular units) engage in systematic and integrated planning and continuous improvement by:

  • Aligning unit goals and activities with the institution's mission, vision, and strategic priorities
  • Stimulating proactive planning and achieving outcomes that support the strategic priorities
  • Measuring unit outcomes and gauging performance against external peers to demonstrate ongoing service effectiveness, efficiency, and quality
  • Ensuring stakeholder satisfaction
  • Perpetuating a culture of continuous improvement and organizational innovation

The goal of the ASSUR process is to integrate our planning and evaluation processes by aligning unit mission, vision, and strategic goals with the institution's mission, vision, and strategic goals, and to measure and document unit effectiveness vertically and horizontally across the institution, identifying areas of improvement and innovation that can be aligned with resource management decisions.

In addition to being good practice, ASSUR's integrated planning, evaluation, and continuous improvement framework helps us meet institutional accreditation requirements under the Higher Learning Commission's (HLC) Criteria for Accreditation.

Who will be able to access and see our ASSUR reports?

Institutional Effectiveness & Analytics (IEA) publishes all submitted ASSUR reports to the Administrative and Student Support Unit Review page of the Assessment Clearinghouse, so that documentation is preserved through leadership changes and for HLC reporting purposes. All reports are available for review by HLC representatives during the university's 4-year assurance review and 10-year reaffirmation process.

Publishing the reports in the Assessment Clearinghouse also provides an easy avenue for sharing ideas and examples across units. Units sometimes submit high-quality reports that serve as good examples for others.

Should our unit create new outcomes every year?

One part of demonstrating unit effectiveness is to demonstrate improvement over time. If units create new outcomes every year, it will not be possible to show the effects of the unit's efforts to improve its processes, services, or products. Outcomes should be applicable to the unit over several years.

However, there are a few instances where units will need to change some outcomes:

  • They have met their success criteria/targets for their existing outcomes for many years in a row and would now like to focus on improving other processes, services, or products.
  • They have developed a new strategic plan and want to align their outcomes with their new plan.
  • They have realized that some of their current outcomes are no longer useful for helping them improve.

Our unit just developed a new strategic plan, and we want to create all new outcomes to align with our plan. Is this okay?

Yes, it is good practice to align your administrative assessment efforts with your unit's strategic or operational plan. If your unit can align some of its previous outcomes with the new strategic plan, this can eliminate the need to create all new outcomes, assessments, and success criteria/targets. In addition, you will want to make sure that your new outcomes are structured so that they will be useful to your unit for several years. For advice on how to do this, please see "How should our unit determine and develop its outcomes?" below.

How should our unit determine and develop its outcomes?

Each unit should have at least one outcome for each strategic initiative and measure the outcome every year. Your unit's outcomes should be:

  • Focused on the processes, services, and/or products that your unit would like to improve.
  • High-level enough that they can be used over multiple years to guide continuous improvement.
  • Specific enough that they are measurable.
  • Derived from the goals and strategic initiatives of your unit.
  • Related to things that your unit can control.
[Examples of outcomes]

How specific do our criteria for success/targets need to be?

The more specific your targets are, the better they will demonstrate improvement over time. Here are some examples of criteria for success/targets:

[Examples of targets]

Does our unit need to develop student learning outcomes?

Units that are responsible for delivering content to students will develop separate student learning outcomes and assessment plans, and will provide evidence of student learning and of the use of that evidence through the separate Program Outcomes Assessment (Assessment Clearinghouse) reporting process with IEA.

Important note: For student programs or services where the expected outcomes relate to student attitudes, confidence, motivation, etc., these outcomes should be represented as ASSUR outcomes, not as student learning outcomes.

How do we create direct measures for our outcomes? We have a survey that asks our stakeholders if they are able to or know how to use our services. Is this a direct measure?

A survey of stakeholders' opinions of their abilities or knowledge is not a direct measure. Direct measures require stakeholders to demonstrate their abilities or knowledge, or may relate to the efficiency of processes and/or the effectiveness of programs or services for stakeholders.

Here are some examples of direct and indirect measures:

[Examples of direct measures]

For additional assistance in developing direct measures, please contact Joni Wadley, Senior Director for Institutional Effectiveness in Institutional Effectiveness & Analytics (schallej@ohio.edu).

Our unit is ready to report the results of our assessments. What information do we need to include in the Summary of Results?

The last column of the ASSUR Reporting template in Modules 2-4 is where units report the summary of their assessment results. The unit should first indicate whether the Criteria for Success (Target) was met or not met. The unit should then provide the actual results, e.g., 18 out of 20, or 90%. Finally, the unit should provide any additional relevant notes or comments about the results. Here are some examples for the Summary of Results column:

[Example of Summary of Results]

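The met/not-met determination described above is simple arithmetic against the target. As a minimal sketch only (the function name, its signature, and the 90% target are hypothetical illustrations, not part of the ASSUR template):

```python
# Hypothetical helper for drafting a Summary of Results entry.
# The 90% target and the wording are illustrative assumptions.

def summarize_result(met_count: int, total: int, target_pct: float) -> str:
    """Return a Summary of Results line: met/not met, plus actual results."""
    actual_pct = 100.0 * met_count / total
    status = "Met" if actual_pct >= target_pct else "Not Met"
    return f"{status}. Actual results: {met_count} out of {total} ({actual_pct:.0f}%)."

print(summarize_result(18, 20, 90))
# → Met. Actual results: 18 out of 20 (90%).
```

Whatever format a unit uses, the key elements are the same: the met/not-met status, the actual results, and any relevant notes.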
We have reviewed our results and completed our Summary of Results columns for Modules 2-4. Our unit is now ready to develop an improvement plan. What information do we need to include in Module 5: Improvement, Innovation, and Integration?

The unit should focus initially on the columns "Intentional Changes," "Environmental Forces/Factors," and "Intended Results." Under "Intentional Changes," the unit should first describe its planned improvements or innovations based on the results from Measuring Outcomes, Assessing Stakeholder Satisfaction, and Gauging Performance Against Peers. The unit should then describe any environmental forces/factors contributing to the change, that may hinder the implementation of the change, or that may result in a modified change rather than the one intended. Finally, the unit should describe the intended results that should occur once the improvement or innovation has been implemented; in other words, what are the post-improvement expected results?

Once the improvements or innovations have been implemented and the unit has reassessed and provided another round of its Summary of Results for Modules 2-4, the unit can then provide responses under "Actual Results." The unit is expected to explain whether the improvement or innovation was successful by comparing the actual results against the intended results: did the actual results meet or exceed the intended results, or were there gaps? If gaps were identified between intended and actual results, the unit should describe how the operational improvement strategies will be refined. Under the "Integration" column, the unit should describe how the results from evaluating operations informed unit planning and budgeting/resource allocation.

Here are some examples to help you with completing the Module 5: Improvements, Innovations, and Integration "Intentional Changes," "Environmental Forces/Factors," and "Intended Results" columns:

[Example of Improvement part 1]

Here are some examples to help you with completing the Module 5: Improvements, Innovations, and Integration "Actual Results" and "Integration" columns:

[Example of Improvement part 2]

What do we do with the feedback we receive from Institutional Effectiveness & Analytics?

Every report submitted to Institutional Effectiveness & Analytics goes through a double-review process, where feedback is provided directly into a feedback rubric. This feedback is given to assist the unit in improving its assessment and evaluation processes and is intended to be used to inform the unit throughout the ASSUR reporting cycles.

Do we need to resubmit our annual report after we receive feedback from Institutional Effectiveness & Analytics?

No, units do not need to resubmit their reports after they receive feedback from Institutional Effectiveness & Analytics. This feedback is provided purely to help the unit improve its processes for future reporting.

Our unit has created outcomes, measures, and criteria for success/targets, but we want to be sure we are on the right track before we start collecting data. Is there a way to get feedback before we submit our annual report?

Institutional Effectiveness & Analytics is happy to provide feedback at any point. This includes providing feedback on outcomes, measures, and criteria for success/targets during the planning phase, and on report drafts before final submission. Please feel free to send drafts to Joni Wadley, Senior Director for Institutional Effectiveness in Institutional Effectiveness & Analytics (schallej@ohio.edu) to receive feedback.

Our unit still has questions. Can we get some extra help?

Absolutely, please contact Joni Wadley, Senior Director for Institutional Effectiveness in Institutional Effectiveness & Analytics (schallej@ohio.edu) to set up a meeting or get more information.