Building and Sustaining an Institutional Assessment System
SACSCOC Annual Conference, December 2018
Timothy S. Brophy, Professor and Director, Institutional Assessment, Office of the Provost, University of Florida, Gainesville, Florida

Today's Goals
- To review elements of effective assessment systems in the context of sustained excellence

- Share the components of the University of Florida Academic Assessment System

Common Challenges for Sustaining Excellence in Assessment
- Institutional size and scope
- Multiple colleges/departments
- Diverse programs: Certificate, Undergraduate, Graduate, and Professional
- Available personnel

- Institutional consistency
- Outcomes Assessment reporting
- Cycles of planning and reporting

Common Challenges for Sustaining Excellence in Assessment: Management and Tools
- Faculty assessment resources: templates, guidelines
- Professional development for faculty
- Honoring unit autonomy, disciplinary distinctions, and institutional requirements

- Faculty comportment

What is an Assessment System?
The assessment system is a coordinated and carefully designed set of processes and tools used by university accreditation coordinators, administrators, and faculty to submit, review, store, and access academic program assessment plans and assessment data reports. Reliability and validity procedures are built into the system at the institutional level to ensure data integrity and appropriate inferences pertaining to institutional effectiveness.

Institutional framework: Purpose, Mission, and Vision

Establish Purpose and Mission
Purpose: why you exist. The purpose of Institutional Assessment is to support the University of Florida's mission by establishing, maintaining, and refining the university's institutional effectiveness and assessment processes.
Mission: what you do.

The mission of Institutional Assessment is to lead the university's efforts in accreditation and institutional effectiveness, to provide assessment support, and to maintain transparent communication with all UF stakeholders.

Develop your vision
Vision: your mission achieved with excellence. We envision the University of Florida as an institution where all units and academic programs contribute to the fulfillment of the university mission by establishing goals and outcomes, regularly assessing these with valid, reliable measures, analyzing and interpreting the data collected, and using the results for continuous improvement.

System Inputs: Data Sources
Our system inputs come from two different programs: the accreditation planning and reporting program, and our internal UF approval system, which is used for submission of new Academic Assessment Plans and modifications to existing ones. There is an established institutional cycle for inputting this information.

Planning Timeline/Cycle
- Program/Unit: Develop Academic Assessment and IE plans and data reports
- System entry: Submit reports

- System entry: Submit for institutional review
- Program/Unit: Implement plan and collect data

The Academic Assessment Plans
Components of an Academic Assessment Plan: Mission Alignment, Student Learning Outcomes, Assessment Oversight, Curriculum/Assessment Maps, Methods and Procedures, Assessment Cycle

Institutional Effectiveness Plans
Components of an Institutional Effectiveness Plan: Mission Alignment, Assessment Oversight, Goals, Details (administration, student services, research), Measures

Data Reports
- Over 500 reports annually
- Results are reported for each of the SLOs and Goals
- Use of Results for program improvement is reported holistically

System Processes

Faculty oversight: the institution-level Academic Assessment Committee
- A joint committee: four members from the Senate, four appointed by the president, one student, and several liaisons
- Duties: review and approve Academic Assessment Plans, including Student Learning Outcomes; improve the efficiency of Institutional Assessment processes

Approval and Management Process
The University of Florida Assessment Plan Approval Process

The process involves the Program/Department, the College Academic Assessment Committee, the University Curriculum Committee, and the Student Academic Support System.

Committee Review
- Reviews student learning outcomes to ensure they follow UF guidelines
- Student Learning Outcomes reflect the curriculum, the discipline, and faculty expectations; as these elements evolve, learning outcomes change.

- Recent: the outcome reflects current knowledge and practice in the discipline.
- Relevant: the outcome relates logically and significantly to the discipline and the degree.
- Rigorous: the degree of academic precision and thoroughness that the outcome requires to be met successfully.
- Distinguish outcomes from outputs

- Distinguish outcomes from program goals
- Ensure that outcomes are measurable and valid for the SLO

SLO/AAP Approval Process (Program/Department → College Academic Assessment Committee → University Curriculum Committee → Student Academic Support System)
- Program/Department: prepares the submission; submits the request to the approval system
- College Academic Assessment Committee: receives the program/department submission; reviews and takes action, then submits to Institutional Assessment
- Institutional Assessment review and initial recommendation; Chair review and initial recommendation
- Academic Assessment Committee review and recommendation
- University Curriculum Committee review and recommendation
- Student Academic Support System: screened for alignment with the catalog; entered into the catalog

Assessment and Data Reporting
The Assessment and Institutional Effectiveness cycle: Establish Mission, Goals, and Outcomes → Assessment Planning → Implement the Plan and Gather Data → Interpret and Evaluate the Data → Modify and Improve

Data Reporting
- November 1: all reports are due

Communication
- We use a distributed leadership model
- Each of our 16 colleges, 4 Senior Vice Presidential units, 10 Vice Presidential units, the Graduate School, the Libraries, and the Florida Museum of Natural History has an appointed SACSCOC Coordinator

- These individuals meet as a group when needed, usually twice a year
- We communicate with them; they communicate with their faculty and administration

Validity, Reliability, and Fairness

Validity
"Validity is a unitary concept. It is the degree to which all the accumulated evidence supports the intended interpretation of test scores for the proposed use." (APA/AERA/NCME, Standards for Educational and Psychological Testing, 2014)

For institutional assessment, the evidence is SLO data (the "test scores"), and the proposed use of this data is to determine the degree to which an SLO has been met by students in the program. Interpretation: the faculty set thresholds of acceptability and make inferences from the SLO data as to the degree to which their students achieve the SLO.

Checking for validity at the institutional level
- All plans and data reports are reviewed by Institutional Assessment staff
- All measures of goals and SLOs are reviewed to ensure that they lead to data pertinent to the goal or outcome (validity)
- If there are questions, the plan or report is returned for modification or clarification

Reliability/Precision and Fairness
"In its broadest sense, reliability refers to the consistency of scores across replications of a testing procedure; this is always important, and the need for precision increases as the consequences of decisions and interpretations grow in importance." "Fairness has to do with equitable treatment of all test takers, lack of measurement bias, equitable access to the constructs as measured, and validity of individual test score interpretations for the intended uses." (APA/AERA/NCME, Standards for Educational and Psychological Testing, 2014)

Checking for reliability/fairness at the institutional level
- Reliability and fairness of SLO assessments are the responsibility of the academic program faculty; we do not monitor this
- Faculty have access to the built-in reliability functions of our Learning Management System (Canvas); they can program the LMS to collect data on their program SLOs

- The General Education SLOs are also programmed at the course level to provide institutional data through the LMS
- We do monitor the reliability of our QEP measures, which are administered institutionally
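Because the slides above note that faculty can program the LMS to collect data on program SLOs, here is a minimal illustrative sketch (not UF's production pipeline) of how course-level outcome results might be pulled from Canvas through its public REST API and summarized against a faculty-set threshold. The host, token, course ID, mastery cutoff, and the availability of specific response fields are assumptions to verify against your own Canvas instance.

# Illustrative sketch only: pull outcome (SLO) results for one course from the
# Canvas REST API and report the share of scored results meeting a cutoff.
# All identifiers below are placeholders; verify the endpoint and fields
# against your institution's Canvas instance before relying on them.
import requests

CANVAS_BASE = "https://example.instructure.com"  # hypothetical Canvas host
API_TOKEN = "REPLACE_WITH_API_TOKEN"             # personal or service token
COURSE_ID = 12345                                # hypothetical course id

def fetch_outcome_results(course_id):
    """Fetch raw outcome_results records for one course (first page only;
    production code would follow the Link header for pagination)."""
    url = f"{CANVAS_BASE}/api/v1/courses/{course_id}/outcome_results"
    resp = requests.get(
        url,
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        params={"per_page": 100},
    )
    resp.raise_for_status()
    return resp.json().get("outcome_results", [])

def share_meeting_cutoff(results, mastery_cutoff):
    """Proportion of scored results at or above the faculty-set cutoff.
    Scores are on the outcome's own point scale, so the cutoff must match
    how the outcome rubric was defined by the program faculty."""
    scores = [r["score"] for r in results if r.get("score") is not None]
    if not scores:
        return None
    return sum(s >= mastery_cutoff for s in scores) / len(scores)

if __name__ == "__main__":
    results = fetch_outcome_results(COURSE_ID)
    print("Share meeting mastery cutoff:", share_meeting_cutoff(results, 3.0))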

System Outputs

SLO Approvals
- Program leaders are informed via automated email of any actions taken by the Academic Assessment Committee
- The options we use are: Approve, Comment, Conditionally Approve, Table (rarely used), Recycle, Denied

Constructive feedback
- We provide feedback on all data reports and request modifications if needed
- We allow 2-4 weeks for the modifications to be completed
- Most common requests: report improvement actions as a decision made based on the review of results, in the past tense; remove any future-tense phrases from the improvement actions

Examples of Feedback
Art History (PhD) (program goal and SLO report)

Excellent data summaries and documentation. PG1, PG2, PG3, PG4, PG5, PG6, SLO1, SLO2, SLO3, SLO4: slightly revise Use of Results to read as a decision made based on use of results (past tense).

College of Pharmacy (Institutional Effectiveness report)
1. Your report of Actions for Improvement for Goals 1 and 4 does not follow our guidelines for reporting. Please include who reviewed the results, and state the actions to be taken as results of decisions made based on the review. Refer to your Goals 2 and 3 Actions for Improvement as examples of how this should be reported. Please avoid using any future-tense phrases (will do, plan to do, etc.).

Planned Improvements: Research with our faculty

Our Research
Patterns emerged from the annual data reports:
- Inconsistencies across programs
- Lack of understanding of institutional processes

- Anecdotal evidence from the SACSCOC Coordinators
- Individual reports from coordinators about activity in their colleges and units
- Academic Assessment Committee interest

Phase 3: Faculty Focus Groups
- The Das & Gater findings revealed a need to better understand faculty engagement with assessment
- The Academic Assessment Committee developed these research questions:

1. How are UF faculty engaged in academic assessment processes at the University of Florida?
2. In what ways could this information lead to the modification and improvement of institutional assessment processes?

Participants
- 16 focus groups (one in each college), N = 146

- Tenure and tenure-track faculty; all were invited to join groups
- Delimitation: limited to faculty who were available at the scheduled times in each college

Question Categories
- Perceived Value of Assessments
- Instructor Assessments
- Assessment at the Department/Program/Major level
- Faculty Engagement with Assessment
- Closing question: "What haven't we asked you today that you would like to talk about?"

Results
- 34 sets of data: field notes, recordings
- Loaded into NVivo 11
- Coded first for response categories
- 8 response categories and three themes emerged

Response Categories
- Assessment methods and processes: various assessment types used for the assessment of student learning
- Challenges: issues that impede assessment or make it challenging
- Concerns: areas that cause concern or are barriers to assessment they would like to do
- Context: factors that influence assessment that faculty cannot control
- Data gaps: information that faculty would like to collect but cannot or do not
- Needs: what faculty would like to have to facilitate their assessment processes
- Use of Results: the ways that faculty use the results of their assessments
- Value: what faculty value about assessment

Theme 1: Assessment is Valued
Faculty quotes:

"If you don't determine they're learning, why are we here?"
"There is value in seeing students succeed, and assessment provides information that is used to re-examine student knowledge."

Findings
- The value of assessment is often directly associated with standards of the field
- Use of results is consistent, but purposes and methods vary
- Most prevalent: assessment data used to modify instruction to advance student learning
- Open dialogue is a primary assessment methodology for those who teach/mentor in one-to-one teaching situations
- Faculty want to learn from their peers; sharing assessment methods and processes is valuable

Theme 2: Influential Conditions
Faculty quotes:
"The type of assessment I use depends on the size of the class."
"Our disciplinary accreditor requires a set of national exams that all of our students must take. Why can't we use these as outcome measures?"

Findings
Two conditions that impact assessment were common across the colleges: class size and disciplinary accreditation.

Class Size
- Primary driver for assessment methodology: the number of students in the class
- Large classes constrain assessment choices to large-scale measures (such as exams scored electronically)
- There is a tension between what faculty want to do to assess their students and what they feel they must do because of class size

Disciplinary Accreditation
- Disciplinary accreditors often require student learning measures, and some prescribe student learning outcomes
- Some disciplinary accreditors have established assessment standards
- Frustrations: aligning disciplinary accreditation requirements with SACSCOC requirements; appropriate use of required third-party exams

Theme 3: Misconceptions about SACSCOC Reporting
Findings
- We didn't ask, but accreditation reporting was raised in nearly every college
- Three misconceptions emerged: all student learning must be quantified; academic assessment is limited to specific categories and types; the data disappears

Misconception 1: All student learning must be quantified
- Our faculty are very engaged in gathering anecdotal evidence, but push back on quantification of student learning information: "Subjective data cannot be quantified."
- Likely arises from a UF requirement to provide a rubric
- We ask for summary data; for some, this has been conflated with quantification of student learning data

Misconception 2: Assessment is limited to specific categories and types

"The criteria for SACSCOC are limited; I feel like my hands are tied."
- Florida regulations require certain categories of student learning outcomes
- The UF Graduate School also has established outcome categories
- However, additional categories are permitted

Misconception 3: The data disappears

- Faculty related concerns about not knowing what happens to the data they report
- Data reporting is done in our accreditation module, from a third-party provider
- Access is limited to those who have a role in the software program
- This is a legitimate concern

Cultivating Engagement: Actions taken based on our analysis

Recommendations for Action
- Finding (Value): share assessment work across colleges. Recommendation: continue the UF Assessment Conference, and develop an online mechanism for faculty to share their assessment work with others.
- Finding (Influential conditions): class size; disciplinary accreditation. Recommendations: (1) develop faculty workshops in conjunction with the Office of Faculty Development and Teaching Excellence on using Canvas assessment tools to facilitate data collection for multiple assessment methods; (2) work with specific disciplines to maximize use of student learning data collected for disciplinary accreditors for regional accreditation reports.

Modifications based on our analysis

- Finding (Misconceptions): (1) quantification of student learning data; (2) limitations on assessment outcomes and measures; (3) faculty are not clear on how the data they report is used at the institutional level, nor do they have ready access to it.
- Recommendations: (1) develop online tools to clarify what data can be reported; (2) develop online tools to clarify assessment types; (3) develop an accessible view of student learning data reports, perhaps through visualization (an illustrative sketch appears below).

Continued research
- We have completed a study with our Lecturers
- The data is being analyzed now
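As a purely hypothetical illustration of recommendation 3, the sketch below shows one way a shareable view of reported SLO results could be rendered as a simple chart; the outcome labels, percentages, and threshold are invented for the example and are not UF data.

# Hypothetical illustration of recommendation 3: render reported SLO results
# as a simple horizontal bar chart that program faculty could view and share.
# The outcome labels, values, and threshold below are invented, not UF data.
import matplotlib.pyplot as plt

slo_summary = {
    "SLO 1 (Content knowledge)": 91,   # percent of students meeting the SLO
    "SLO 2 (Critical thinking)": 84,
    "SLO 3 (Communication)": 78,
}

fig, ax = plt.subplots(figsize=(6, 3))
ax.barh(list(slo_summary.keys()), list(slo_summary.values()))
ax.axvline(80, linestyle="--", label="Program threshold (80%, hypothetical)")
ax.set_xlabel("Students meeting outcome (%)")
ax.set_xlim(0, 100)
ax.legend(loc="lower right")
fig.tight_layout()
fig.savefig("slo_summary.png")  # a shareable artifact for program faculty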

Timothy S. Brophy, Ph.D.
Professor and Director, Institutional Assessment
[email protected]
352-273-4476

Brophy, T. S. (2017). The University of Florida Assessment System. In D. Miller & T. Cumming (Eds.), Educational assessment in higher education: Theory and applications (pp. 184-202). Sterling, VA: Stylus.
