Authentic, Alternative Assessments in Pivot Interactives

November 11, 2022


Alternative assessments are a great way to evaluate open-ended problem-solving skills and give students a chance to freely apply the models they've developed. An increasing number of teachers are using Pivot Interactives activities as assessments, and these activities have been shown to be effective tools for developing and assessing lab skills.

When developing Pivot Interactives, one goal was to streamline the practice and assessment of skills that go far beyond solving standard word problems. With Pivot Interactives, students quickly make their own measurements of real events and then transition to critical thinking and learning how to apply models. We've been using about one of these activities per week as our class transitions away from paper-and-pencil word problems and toward interactive video in Pivot Interactives as our medium for practice and assessment.

As teachers, we want our students to explore open-ended, lab-based activities and assessments as much as possible. But practical constraints such as lab equipment availability, set-up time, and student absences present obstacles to making this type of assessment a frequent feature. Pivot Interactives allows students to work on some of the same skills as they apply scientific concepts to real situations. The ease of use of the platform means these types of activities can be a regular part of our instruction.

Blowdart Cart Collision is an activity that assesses both physics concepts and science reasoning. In part one of the activity, students watch a video of a blowdart fired towards a foam block mounted on a stationary cart on a low-friction track. The video abruptly ends just as the dart makes contact with the foam.

Blowdart Cart Collision in Pivot Interactives

The activity asks students to use the available video analysis tools, such as the ruler and stopwatch, to predict the speed of the cart (with dart embedded) after the collision. I asked students to report all of their measurements and to show the process they used to make their prediction.
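
A typical prediction comes straight from momentum conservation. With illustrative symbols (dart mass $m_d$, dart launch speed $v_d$, cart-plus-foam mass $m_c$; these are placeholders, not values from the activity), the prediction is

\[ m_d v_d = (m_d + m_c)\, v_f \quad\Rightarrow\quad v_f = \frac{m_d v_d}{m_d + m_c} , \]

so the quality of the prediction hinges on the measurements students make from the video.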

Students recorded their work in Pivot Interactives as their response. One student, Alexis, took an unusual approach: she solves collisions by expressing the ratio of the system's final and initial momenta and setting it equal to 1. This is an artifact of how our class explored collisions. As a class, we measured dozens of collisions and saw that when the external impulse on the system was small, the ratio of the final and initial momenta was 1.
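
Written out with the same illustrative symbols, her ratio form looks like

\[ \frac{p_f}{p_i} = \frac{(m_d + m_c)\, v_f}{m_d v_d} = 1 , \]

which rearranges to the same predicted $v_f$, and it frames the follow-up comparison as a check of how close the measured ratio actually comes to 1.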

Students were also asked to state what system they were analyzing and to list any assumptions they made while making their prediction. Almost all of them stated the assumption that there was no net external horizontal impulse on the cart-dart system during the collision.
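
The impulse-momentum theorem is what connects that assumption to the prediction (standard physics, not wording from the activity): if the net external horizontal impulse on the cart-dart system is negligible during the collision, the system's momentum is unchanged,

\[ J_{\text{ext}} = \Delta p = p_f - p_i \approx 0 \quad\Rightarrow\quad p_f \approx p_i . \]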

After they submit their prediction for the final velocity, Part Two of the activity unlocks, and students can watch a video of the entire collision. Now they can measure the velocity after the collision to see if their prediction was correct. Students said they liked this more than a traditional problem, where they never really find out if their answer describes real life.

 

Of course, students focus on whether their prediction was "right," that is, whether the cart's velocity matched their prediction. But I'm more interested in their ability to reason about any discrepancy between their prediction and the outcome.

I customized the activity to fit my classroom needs and asked a series of questions:

  • Which measurement had the greatest uncertainty? Could error in that measurement be responsible for the discrepancy?
  • Were your assumptions in Part One incorrect? What external forces might affect the results? Are these effects consistent with the discrepancy you found?
  • What if the track were tilted so the right end was lower? Would that account for the discrepancy?

These types of questions develop students' science reasoning. The motivation to understand why there was a discrepancy seemed to drive students, making them willing to put in the work of considering possible causes. With Pivot Interactives, collecting data about real events is easy, so there is more time for students to dig into questions that develop science reasoning and critical thinking.

 

The last question on the assessment was: How could you determine whether the track was tilted so the right end was lower? What evidence could you gather even if you could not go back to the original apparatus? I was hoping students would say that they would check the outcomes of some of the other trials shown in the activity to see whether there was a trend toward one type of discrepancy. But a few students had a better idea: plot position vs. time for the cart after the collision and look for a changing slope.
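
One way to see why that works: on a level, low-friction track the cart coasts at nearly constant velocity after the collision, so its position-time graph is a straight line; on a tilted track it has a small constant acceleration, so the graph curves. As a minimal sketch, assuming a tilt angle $\theta$ and a cart moving toward the lower end,

\[ x(t) \approx x_0 + v_f t + \tfrac{1}{2} g \sin\theta \; t^2 , \]

so a steadily changing slope in the post-collision data would point to a tilt (or some other unbalanced force), while a constant slope would not.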

 

I used the "Grade by Question" feature to make it easy to give students feedback. I selected the key questions in the activity where I wanted to provide detailed feedback. The Grade by Question view allows me to provide quick feedback and scores for the questions that matter most and accommodates standards-based grading.  


 

Curious to see what a Pivot Interactives activity looks like? Try an Activity here

Ready to explore the library? Sign up for a Free Trial or view pricing here.