Evidence of Student Learning
A page within Institutional Research, Assessment & Planning (IRAP)
The results of our assessment processes are shared on this page, broken out into areas for University-Wide projects, Program Assessment, and General Education Assessment. Collecting this evidence is an ongoing endeavor, so results will be updated as they become available. If you are interested in any item presented here or have questions, please contact Patrick Barlow, University Assessment Coordinator.
Assessment Results
University-Wide Evidence
Evidence of Student Learning: University-Wide Assessment
National Survey of Student Engagement (NSSE)
Our results from the National Survey of Student Engagement (NSSE) indicate patterns of engagement that support student learning, along with some areas for improvement. It is difficult to capture the richness of this data set, but university staff and instructors are presented with the reports and use them as a guide to process and program refinements. See below for a selection of our results. Note that UWL, along with our peer UW Comprehensive Universities, administers the NSSE on a three-year cycle, most recently in Spring 2023, with plans to administer it again in Spring 2026.
Previous Administration: 2020
Program Assessment Evidence
Evidence of Student Learning: Program Level
Each academic program at UWL uses assessment to collect evidence of student learning on the program's intended learning outcomes. These results are reported on a regular basis. Exemplar reports from programs across our colleges are shared below.
College of Arts, Social Sciences, & Humanities
- Archaeological Studies
- Sociology
College of Science & Health
- Biology
- Exercise & Sports Science: Sports Management
College of Business Administration
- College Wide Core Assessment
- Economics
School of Education, Continuing, and Professional Education
- Educational Studies
General Education Assessment Evidence
Evidence of Student Learning: General Education Program
The General Education Program has created a new assessment process aligned with the new curriculum, which will first be offered in the Fall 2025 semester. Within this plan, each category of the curriculum is aligned to one learning outcome and will be assessed using one of the AAC&U VALUE rubrics. Data collection is in its beginning phase, so there is limited data to share currently. For more details on the process, please see the General Education Assessment Committee page on the Faculty Senate website.
Results from the Fall 2025 Assessment Cycle show evidence of student learning in written, spoken, and quantitative literacy. Faculty teaching courses in these three categories of the program used the aligned VALUE rubric dimension to assess student work and assignments. The results for each category are presented below.
Written Literacy
Program Learning Outcome: Develop ideas effectively in writing by integrating evidence with clarity and precision.
VALUE Rubric Dimension Applied: Content Development from the Written Communication Rubric
Target Course: English 110 College Writing II
Table 1. Results of the Fall 2025 Assessment Process: Written Literacy

| Rubric Level | Rubric Dimension Language | Count | Percentage of All Ratings |
| --- | --- | --- | --- |
| Capstone 4 | Uses appropriate, relevant, and compelling content to illustrate mastery of the subject, conveying the writer's understanding, and shaping the whole work. | 30 | 3% |
| Milestone 3 | Uses appropriate, relevant, and compelling content to explore ideas within the context of the discipline and shape the whole work. | 239 | 24% |
| Milestone 2 | Uses appropriate and relevant content to develop and explore ideas through most of the work. | 483 | 48% |
| Benchmark 1 | Uses appropriate and relevant content to develop simple ideas in some parts of the work. | 211 | 21% |
| No Evidence 0 | Lack of any traits from Benchmark 1 | 30 | 3% |
| Total Ratings | | 993 | |
Spoken Literacy
Program Learning Outcome: Create and deliver presentations to influence diverse audiences' knowledge, attitudes, values, beliefs, or behaviors.
VALUE Rubric Dimension Applied: Delivery from the Oral Communication Rubric
Target Course: Communication Studies 110 Communicating Effectively
Table 2. Results of the Fall 2025 Assessment Process: Spoken Literacy

| Rubric Level | Rubric Dimension Language | Count | Percentage of All Ratings |
| --- | --- | --- | --- |
| Capstone 4 | Delivery techniques (posture, gesture, eye contact, and vocal expressiveness) make the presentation compelling, and speaker appears polished and confident. | 82 | 7% |
| Milestone 3 | Delivery techniques (posture, gesture, eye contact, and vocal expressiveness) make the presentation interesting, and speaker appears comfortable. | 351 | 32% |
| Milestone 2 | Delivery techniques (posture, gesture, eye contact, and vocal expressiveness) make the presentation understandable, and speaker appears tentative. | 519 | 47% |
| Benchmark 1 | Delivery techniques (posture, gesture, eye contact, and vocal expressiveness) detract from the understandability of the presentation, and speaker appears uncomfortable. | 137 | 12% |
| No Evidence 0 | Lack of any traits from Benchmark 1 | 16 | 1% |
| Total Ratings | | 1105 | |
Quantitative Literacy
Program Learning Outcome: Analyze quantitative data to reason and communicate arguments across varied contexts.
VALUE Rubric Dimension Applied: Application/Analysis from the Quantitative Literacy Rubric
Target Courses: Math 115, 116, 123, 150, 151, 160, 175, 207, 208, 215, 216; Statistics 145; Computer Science 115, 120; Philosophy 101; Music 115
Table 3. Results of the Fall 2025 Assessment Process: Quantitative Literacy

| Rubric Level | Rubric Dimension Language | Count | Percentage of All Ratings |
| --- | --- | --- | --- |
| Capstone 4 | Uses the quantitative analysis of data as the basis for deep and thoughtful judgments, drawing insightful, carefully qualified conclusions from this work. | 416 | 14% |
| Milestone 3 | Uses the quantitative analysis of data as the basis for competent judgments, drawing reasonable and appropriately qualified conclusions from this work. | 832 | 28% |
| Milestone 2 | Uses the quantitative analysis of data as the basis for workmanlike (without inspiration or nuance, ordinary) judgments, drawing plausible conclusions from this work. | 951 | 32% |
| Benchmark 1 | Uses the quantitative analysis of data as the basis for tentative, basic judgments, although is hesitant or uncertain about drawing conclusions from this work. | 586 | 20% |
| No Evidence 0 | Lack of any traits from Benchmark 1 | 147 | 5% |
| Total Ratings | | 2932 | |
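The percentage column in each table is simply that rubric level's share of the total ratings, rounded to the nearest whole percent, which is why a column's rounded values may sum to 99% or 101% rather than exactly 100%. A minimal sketch of that arithmetic, using the Table 3 (Quantitative Literacy) counts:

```python
# Rubric-level counts from Table 3 (Quantitative Literacy, Fall 2025).
counts = {
    "Capstone 4": 416,
    "Milestone 3": 832,
    "Milestone 2": 951,
    "Benchmark 1": 586,
    "No Evidence 0": 147,
}

# Total ratings is the sum of all rubric-level counts.
total = sum(counts.values())

# Each percentage is the level's count divided by the total,
# rounded to the nearest whole percent.
percentages = {level: round(100 * n / total) for level, n in counts.items()}

print(total)
print(percentages)
print(sum(percentages.values()))  # rounded shares need not sum to 100
```

Running this reproduces the Table 3 figures (2932 total ratings; 14%, 28%, 32%, 20%, and 5%), with the rounded percentages summing to 99%.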
The General Education Committee is reviewing this information during the Spring 2026 semester and evaluating how best to apply the findings to the program curriculum.
Course Level Evidence
Course Level Assessment Evidence
Individual instructors are also active in the direct assessment of learning outcomes specific to their individual courses. Direct assessment involves collecting information from student artifacts, such as written work, presentations, and performances, in which students are asked to directly demonstrate the intended student learning outcomes. Instructors may also use this level of assessment to evaluate newly designed courses or new teaching strategies.
Instructors who are being reviewed for promotion are expected to report on the direct assessment of student learning outcomes as indicated in the JPC guidelines (Faculty) or IAS-PC guidelines (Instructional Academic Staff). The candidate should show how the results from an assessment informed a change in teaching and a re-assessment of student learning. Examples of evidence appendices regarding direct assessment are available on the Provost's promotion resource page.
Examples of Course Level Assessment
Below are examples of course-level assessment as used in the promotion process. UWL faculty and instructors can access them via Office 365 using their login credentials.
- Borja (MUS) - Assessment of Music History Sequence
- Borja (MUS) - MUS 201 Musical Cultures Assessment
- Chedister (MTH) - Use of Exit Ticket for Formative Assessment in MTH 136 Math for Elementary Teachers II
- Hawkes (ART) - Course Revision Case Study: Intro to Digital Photography
- Hawkes (ART) - Sample Assignments and Assessment Documents
- Cooper Stoll (SOC) - Sample SLOs and Assessment Instruments for Foundations of Sociological Analysis
- Cooper Stoll (SOC) - Sample SLO and Assessment Instruments for Sociology of Race and Ethnicity
- Dale (POL) - Direct Assessment for POL 251 Political Theory
- Dale (POL) - Direct Assessment for POL 355
- Haried (IS) - Assessment of Problem Solving SLO in the Information Systems Major
- King-Heiden (BIO) - Assessing Biology Major SLO 'Interpreting and Producing Graphs' in BIO 105 General Biology
- Graham (MKT) - Direct Assessment for MKT 365, Direct Assessment for MKT 386
For more information concerning course assessment, contact University Assessment Coordinator Dr. Patrick Barlow.