Document Type

Article

Publication Date

7-1-2023

Published In

Physical Review Physics Education Research

Abstract

The increasing and diversifying student enrollments in introductory physics courses make reliable, valid, and usable instruments for measuring student skills and gains ever more important. In introductory physics, in addition to teaching facts about mechanics, we also seek to teach our students the skills of “thinking like a physicist,” that is, expertise in and intuition for physical problem solving. It is not certain how and when these expert, intuitive problem-solving skills emerge during a STEM education, or which teaching methods foster them most effectively. A quick, easy-to-administer survey of students’ “physics-thinking” skills, usable in a pretest and post-test format, is therefore desirable for measuring and evaluating different pedagogical approaches. Prior investigators codified these skills as “epistemic games” (e.g., order-of-magnitude estimation, evaluating extreme cases) and developed and validated the math epistemic games survey (MEGS) to measure students’ ability to employ these techniques. The original survey instrument is reliable and valid but is hampered by its length and by students’ ability to recall questions between administrations. We employed factor analysis to split the MEGS into two mutually exclusive subtests and found each to be as reliable and valid as the full-length MEGS as originally formulated. The “split MEGS” is well suited for use as a pretest and post-test instrument for measuring gains in problem-solving expertise in introductory physics courses.
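The abstract states that factor analysis was used to partition the MEGS into two mutually exclusive, equivalently reliable subtests, but does not specify the procedure. The sketch below is one plausible illustration of that general idea, not the authors’ actual method: fit a factor model to an item-response matrix, assign each item to its dominant factor, alternate items within each factor between the two halves so both subtests sample every underlying skill, and check each half’s internal consistency with Cronbach’s alpha. The data, item counts, and factor count are all placeholders.

```python
# Hypothetical sketch of splitting a survey into two content-balanced
# subtests via factor analysis. NOT the published MEGS procedure; the
# response matrix X is simulated purely for illustration.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
n_students, n_items, n_factors = 500, 20, 4          # assumed sizes
X = rng.integers(0, 2, size=(n_students, n_items)).astype(float)

# Fit a factor model and assign each item to its dominant factor.
fa = FactorAnalysis(n_components=n_factors, random_state=0)
fa.fit(X)
loadings = fa.components_.T                          # (n_items, n_factors)
dominant = np.abs(loadings).argmax(axis=1)

# Alternate items within each factor between the two halves so both
# subtests cover every factor (i.e., every "epistemic game").
half_a, half_b = [], []
for f in range(n_factors):
    items = np.flatnonzero(dominant == f)
    half_a.extend(items[0::2])
    half_b.extend(items[1::2])

def cronbach_alpha(scores):
    """Internal-consistency reliability of a set of item scores."""
    k = scores.shape[1]
    item_var = scores.var(axis=0, ddof=1).sum()
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

print("alpha, subtest A:", cronbach_alpha(X[:, half_a]))
print("alpha, subtest B:", cronbach_alpha(X[:, half_b]))
```

With real response data, comparable alpha values for the two halves (and against the full instrument) would support treating the subtests as interchangeable pretest and post-test forms.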

Creative Commons License

Creative Commons Attribution 4.0 International License
This work is licensed under a Creative Commons Attribution 4.0 International License.

Comments

This work is freely available under a Creative Commons license.

Included in

Physics Commons
