# Authentic Assessment with Pivot Interactives

Several teachers have posted recently about lab practicals. Frank Noschese and Marianna Ruggerio posted about their lab practicals on Twitter, and Kelly O'Shea has been blogging up a storm about group lab practica. These are a great way to assess open-ended problem-solving skills and give students a chance to freely apply the models they've developed. Frank's and Marianna's examples featured Direct Measurement Videos, which have been shown to be effective tools for developing and assessing lab skills. (Note: the Direct Measurement Videos are now available on Pivot Interactives.)

When developing Pivot Interactives, one of our goals was to streamline the practice and assessment of skills that go beyond solving standard word problems. With Pivot Interactives, students quickly make their own measurements of real events and then move on to critical thinking and learning how to apply models. We've been using roughly one of these activities per week as our class transitions away from paper-and-pencil word problems and toward interactive video as our medium for practice and assessment.

Here's an example of an activity that assesses both physics concepts and science reasoning. In Part One of the activity, students watch a video of a blow dart fired toward a foam block mounted on a stationary cart on a low-friction track. The video ends abruptly just as the dart makes contact with the foam.

The activity asks students to use the video analysis tools to predict the speed of the cart (with the dart embedded) after the collision. I asked students to report all of their measurements and show the process they used to make their prediction. They either typed their response (some students like using LaTeX to enter their responses) or took a photo of their work and uploaded the photo as their response.
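The prediction follows from conservation of momentum for the cart-dart system. A sketch with generic symbols (these are placeholders, not the activity's actual measured values): with dart mass $m_d$, cart mass $m_c$, and measured dart launch speed $v_d$,

```latex
m_d v_d = (m_d + m_c)\,v_f
\quad\Longrightarrow\quad
v_f = \frac{m_d}{m_d + m_c}\,v_d
```

so the predicted final speed $v_f$ is the dart's speed scaled by the dart's fraction of the combined mass.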

Here's an example of student work. Roberto likes using LaTeX, so he typed his responses in:

Students can upload images, so some students do their work on paper and upload the image into Pivot Interactives as their response. Alexis took an unusual approach: she solves collisions by expressing the final and initial system momenta as a ratio equal to 1. This is an artifact of how our class explored collisions. As a class we measured dozens of collisions and saw that when the external impulse on the system was small, the ratio of final to initial momentum was 1. This shows up in Alexis's work in the image she uploaded as her response:
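In symbols, Alexis's ratio bookkeeping looks like this (again with generic placeholder symbols for the dart mass, cart mass, and speeds):

```latex
\frac{p_f}{p_i} = \frac{(m_d + m_c)\,v_f}{m_d v_d} = 1
```

Setting the ratio to 1 and solving for $v_f$ gives the same prediction as the standard conservation-of-momentum equation; the ratio form just makes the "momentum is unchanged" claim explicit.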

Students were also asked to state what system they were analyzing and to list any assumptions they made while making their prediction. Almost all of them stated the assumption that there was no net horizontal impulse on the cart-dart system during the collision.

After they submit their prediction for the final velocity, Part Two of the activity unlocks, and students can watch a video of the entire collision. Now they can measure the velocity after the collision to see if their prediction was correct. Students said they liked this more than a traditional problem, where they never really find out if their answer describes real life.

Of course, students focus on whether their prediction was "right": that is, whether the cart's measured velocity matched their prediction. But I'm more interested in their ability to reason about the discrepancy between their prediction and the outcome. The activity asked a series of questions:

Which measurement had the greatest uncertainty? Could error in that measurement be responsible for the discrepancy?

Were your assumptions in Part One incorrect? What external forces might have affected the results? Are those effects consistent with the discrepancy you found?

If the track were tilted so the right end was lower, would that account for the discrepancy?
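The physics behind the tilt question, as a sketch (with $\theta$ a hypothetical tilt angle, not something measured in the activity): a lowered right end gives the cart an acceleration component along the track,

```latex
a = g \sin\theta
```

so a cart moving toward the lowered end would speed up after the collision. Students can check whether the sign of that effect matches the sign of their discrepancy.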

These types of questions develop students' science reasoning. The motivation to understand *why* there was a discrepancy seemed to drive students, making them willing to work through possible causes. With Pivot Interactives, collecting data about real events is easy, so there is more time for students to dig into questions that develop science reasoning and critical thinking.

The last question on the assessment was: How could you determine whether the track *was* tilted so the right end was lower? What evidence could you gather *even if you could not go back to the original apparatus*? I was hoping students would say that they would check the outcomes of some of the other trials shown in the activity to see whether there was a trend toward one type of discrepancy. But a few students had a better idea: plot *position vs. time* for the cart after the collision and look for a changing slope.
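The students' changing-slope check can be sketched in a few lines of analysis code. This is a minimal illustration with simulated position-time data, not the activity's actual measurements: on a level track, post-collision x(t) should be nearly linear, while a tilt adds a quadratic term whose leading coefficient is half the acceleration.

```python
import numpy as np

# Hypothetical post-collision position-time data (simulated, not the
# activity's measurements). A tilt adds x = x0 + v0*t + 0.5*a*t^2.
t = np.linspace(0.0, 1.0, 11)   # time in s
a_true = 0.35                   # simulated tilt acceleration, m/s^2
x = 0.10 + 0.80 * t + 0.5 * a_true * t**2

# Fit a quadratic to x(t); the leading coefficient is a/2.
c2, c1, c0 = np.polyfit(t, x, 2)
a_est = 2.0 * c2

print(f"estimated acceleration: {a_est:.3f} m/s^2")
```

A clearly nonzero fitted acceleration means the slope of x vs. t is changing, i.e. the cart speeds up or slows down after the collision, which is the signature of a tilted track.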

Our recently-released "Grade by Question" tool makes it easy to give students feedback. I selected the key questions in the activity where I wanted to provide detailed feedback. The Grade by Question view allows me to provide quick feedback and scores, and accommodates standards-based grading (more in a later blog post).

As teachers, we want our students to explore open-ended lab-based activities and assessments as much as possible. But practical constraints -- lab equipment availability, set-up time, student absences -- present obstacles to making this type of assessment a frequent feature. Pivot Interactives allows students to work on some of the same skills, as they apply physics concepts to real situations. The ease-of-use of the platform means these types of activities can be a regular part of our instruction.