Loyola University Chicago

Faculty Center for Ignatian Pedagogy

Determining Acceptable Evidence

Travelers refer to landmarks on maps to track their progress and to determine if they are going in the right direction. Similarly, it is important for instructors and students to be able to track their progress in a course. Assessment is the process by which we track, or gather information about, student learning. Since learning is an internal process, we cannot observe it directly; instead, we ask students to complete assessments that provide evidence of that learning. If learning outcomes help us identify the desired results of learning, then assessments help us determine what constitutes acceptable evidence of those results.

What is the difference between assessment and grading?

Sometimes the processes of grading and assessing are conflated; they are, however, distinct activities. The purpose of grading is to rate a student’s performance, to give it a numeric value. The purpose of assessment, on the other hand, is to gather evidence of what students have learned and to measure their performance against the stated student learning outcomes: did they attain the desired level of performance? If students are not attaining a stated outcome, the instructor can adjust instruction or course content based on their performance. While grading generally evaluates individual student performances, data from assessments can provide a bigger picture of what students are learning at the course and program levels.

Types of Assessments

Formative Assessments

Assessments take many forms but are generally in one of two categories: formative and summative. Formative assessments help identify where learning gaps are and provide students with feedback to improve performance for the remainder of the course.  Formative assessments are used to continually gather evidence of students’ learning and can be graded or ungraded. Feedback on formative assessments shows students what they are doing well and what they can improve. With information on what they are doing well, students can deepen their learning and strengthen their skills. With information on what to improve, students can correct errors and misconceptions. In this way, formative assessments are a powerful learning tool for students.

What are some examples of formative assessments?

Examples of formative assessments include:

  • learning portfolios  
  • drafts of papers with instructor comments
  • 1-minute papers asking a specific topic-related question

Summative Assessments

Summative assessments provide a cumulative measure of students’ learning or proficiency at the end of a course or learning unit. Summative assessments are formally graded since they are used to make final judgments of a student’s performance. Students typically do not receive any feedback on summative assessments prior to turning them in.

What are some examples of summative assessments?

Examples of summative assessments include:

  • mid-term and final exams
  • final papers and reports

Note that some assessments like papers can function as formative or summative. If students receive feedback on different portions of the paper as they write, the paper functions as a formative assessment. If students do not receive any feedback prior to submitting, the paper functions as a summative assessment. 

What are some considerations for determining assessment strategies?

Variety

Assessments should vary so that students can demonstrate evidence of learning in multiple ways. Different students perform better on different types of assignments. If instructors give assessments only in the form of multiple-choice tests, for example, students who struggle with that format will not perform as well. In addition to tests, consider using papers or presentations as assessments so that students have a variety of ways to show their learning.

Frequency

Regular assessment gives students multiple opportunities to track what they are learning. With the feedback provided through these assessments, students can gauge how their performance matches the course expectations. Additionally, the more frequent the assessments, the more information instructors have to determine whether the course is succeeding in helping students meet the learning outcomes.

Alignment

Assessments should map back to, or align with, the learning outcomes. Alignment occurs when an assessment strategy accurately gathers evidence of mastery of a learning outcome. For examples of assessments that align with learning outcomes, see Part 2 of this guide. Since we create assessments to gather evidence of student mastery of the learning outcomes, it is important that the outcomes themselves are clear.

When we think about aligning learning outcomes and assessments, we also want to think about how to gather evidence that students can demonstrate both the lower-order and higher-order thinking skills illustrated in Bloom’s Taxonomy. Carnegie Mellon University’s Teaching Excellence and Educational Innovation Center provides some great examples of how to select assessment strategies that align with different levels of Bloom’s Taxonomy.

Type of learning outcome | Examples of appropriate assessments

Recall, Recognize, Identify

Objective test items such as fill-in-the-blank, matching, labeling, or multiple-choice questions that require students to:

  • recall or recognize terms, facts, and concepts

Interpret, Exemplify, Classify, Summarize, Infer, Compare, Explain

Activities such as papers, exams, problem sets, class discussions, or concept maps that require students to:

  • summarize readings, films, or speeches
  • compare and contrast two or more theories, events, or processes
  • classify or categorize cases, elements, or events using established criteria
  • paraphrase documents or speeches
  • find or identify examples or illustrations of a concept or principle

Apply, Execute, Implement

Activities such as problem sets, performances, labs, prototyping, or simulations that require students to:

  • use procedures to solve or complete familiar or unfamiliar tasks
  • determine which procedure(s) are most appropriate for a given task

Analyze, Differentiate, Organize, Attribute

Activities such as case studies, critiques, labs, papers, projects, debates, or concept maps that require students to:

  • discriminate or select relevant and irrelevant parts
  • determine how elements function together
  • determine bias, values, or underlying intent in presented material

Evaluate, Check, Critique, Assess

Activities such as journals, diaries, critiques, problem sets, product reviews, or studies that require students to:

  • test, monitor, judge, or critique readings, performances, or products against established criteria or standards

Create, Generate, Plan, Produce, Design

Activities such as research projects, musical compositions, performances, essays, business plans, website designs, or set designs that require students to:

  • make, build, design, or generate something new

Notice in the examples above that the level of difficulty of the assessment strategy is appropriate for the corresponding level of Bloom’s Taxonomy. For higher-order thinking skills such as create or evaluate, students make something or judge something. For lower-order thinking skills such as identify or classify, students recall or recognize terms or find examples.

What happens when learning outcomes do not align with assessment strategies?

When learning outcomes and assessment strategies are misaligned, students can become frustrated or disengaged. Let’s put ourselves in the shoes of a student. The student is aware that the learning outcome is to identify how trauma impacts different parts of the brain. With this outcome in mind, the student teaches herself to recognize the impact of trauma on different parts of the brain. When she gets to class, though, the assessment requires her to do a presentation about the causes of trauma. Because she prepared for class by identifying how trauma impacts different parts of the brain, she is not prepared to make a presentation about the causes of trauma. She may also feel like her time preparing for class was not useful. In this case, the assessment does not align with the content learned.

What other considerations are there for determining assessment strategies?

Authentic assessment 

Conventional assessments, like tests, tend to be indirect measures of student learning. On the other hand, authentic assessments tend to be direct measures of student learning since students apply and demonstrate knowledge rather than merely recalling it.  To be authentic, an assessment should require students to use the skills they have learned to analyze or solve a problem. Authentic assessments measure students’ ability to transfer their learning from the more abstract to the more concrete and realistic. An example of an authentic assessment is a case study that asks students to use concepts or principles they are learning to analyze or make recommendations about the case.

Assessment and Universal Design

Another important consideration when designing assessments is ensuring that they are accessible to all students. Incorporating universal design into assessments gives all students an equal opportunity to perform well. CAST Professional Learning has an informative list of tips to keep in mind when considering how to incorporate universal design into assessments.

Assessment and Academic Integrity

Loyola University’s Office of Online Learning provides guidance on how to design assessments that minimize opportunities for academic dishonesty. Below are some high-level suggestions.

Assign writing assessments early on in the semester

After an initial writing assignment or a few written assignments, we get a clear sense of each student’s individual style. This familiarity conveys to students that we are present and recognize the idiosyncrasies of their work.

Set up exams carefully

  • Place questions in random order so students have more difficulty sharing answers.
  • Set a time limit for the test to prevent students from looking up answers or seeking help from others.
  • Give slightly different questions to different students.

References